Discussion about this post

Ondřej Frei:

Thank you, Erik, for another insightful piece, and I hope everything is now well with your health!

> Real intelligence is embodied. It exists within a living system, interacting dynamically with an environment. AI, on the other hand, is an abstraction. It predicts text sequences, not causes and consequences.

For some time now, I've been encountering approaches to "embody" AI by equipping it with sensors and the like. But fundamentally, my instinct tells me that any such attempt is doomed to fail, because conceptually, pairing an abstraction with another abstraction cannot produce a concretion. (If the underlying assumption were right, "autonomous" vehicles would be the closest thing to AGI, right...?) Do you have any thoughts on this? Is my intuition off, and could these "artificial embodiment" attempts succeed in an unexpected way, similar to how LLMs got great at mimicking understanding without possessing any?

Josh Harrison:

Have you studied any of the work of Yann LeCun? He gave a really interesting interview in which he called LLMs rather “stupid”. As I understood it, he argued that the LLM model lacks an intuitive physics of the world, and that a typical four-year-old has absorbed more data about the world than the largest LLMs have. That seems to track with a lot of what you've written.
