Will A.I. Think in Logic or English?
We have seen a lot of progress in A.I. during 2023. Every week something new has happened.
ChatGPT is probably the most famous example. It is a chatbot that can talk about anything and generate code.
I started studying A.I. during the symbolic A.I. era, and I assumed that when A.I. eventually started producing results, it would think in logic. ChatGPT is instead a big black box of billions of parameters, trained on a huge corpus of text by hiding a word and asking the model to predict it. In other words, the opposite of logic.
While this is impressive, it is hard to extract business value from it. I see two ways forward for LLMs in the near future:
- Return of Symbolic Logic
- Chat is All You Need
Return of Symbolic Logic
We find a way to translate the output of large language models into logic, say RDF, OWL, or a knowledge graph. These are good media for storing large amounts of knowledge in a form that you can query and reason about.
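As a concrete illustration, here is a minimal sketch using the rdflib Python library: a few facts, which one can imagine were extracted from an LLM's answer, are stored as RDF triples and then queried with SPARQL. The namespace and facts are made up for illustration.

```python
# A minimal sketch, assuming the rdflib library: store LLM-extracted facts
# as RDF triples and query them with SPARQL. Namespace and facts are invented.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/")

g = Graph()
g.bind("ex", EX)

# Imagine these triples were extracted from an LLM's answer to
# "Who wrote Hamlet, and when?"
g.add((EX.Hamlet, RDF.type, EX.Play))
g.add((EX.Hamlet, EX.writtenBy, EX.Shakespeare))
g.add((EX.Hamlet, EX.yearWritten, Literal(1601)))

# Once the knowledge sits in a graph, it can be queried explicitly.
results = g.query(
    """
    SELECT ?play ?author WHERE {
        ?play a ex:Play ;
              ex:writtenBy ?author .
    }
    """,
    initNs={"ex": EX},
)
for play, author in results:
    print(play, author)
```

The point is not the toy facts but the form: once knowledge is in triples, queries and inference rules can be applied to it, and you can inspect exactly what the system believes.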
Chat is All You Need
It is also possible that LLMs will reinvent some rudimentary form of logic, and that you will just be able to ask it questions.
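In that world, querying knowledge looks less like SPARQL and more like a plain chat call. A minimal sketch, assuming the openai Python package (v1+), an API key in the OPENAI_API_KEY environment variable, and an illustrative model name:

```python
# A minimal sketch: instead of querying a knowledge base, we just ask the
# model directly and rely on whatever reasoning it picked up during training.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative model name
    messages=[
        {
            "role": "user",
            "content": "Hamlet was written around 1601. "
                       "Was it written before Macbeth?",
        }
    ],
)
print(response.choices[0].message.content)
```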
Relying on chat alone seems like a brittle way to represent larger amounts of knowledge. In that case we would have invented a weird new form of ad hoc intelligence that knows how to chat but is hard to reason about or have confidence in.