The embodied AI conference was amazing. This is its third year, and it was refreshing to see people who take a holistic view of intelligence: not as some narrow, neurocentric, hallucinatory system but as a fully embodied, actuated, and iterative one.
My own thesis posits that intelligence is driven mostly by a world model rather than by the specific controller for a body, but that all the ingredients are needed: a body, or a set of bodies, is an efficient way to refine your world model through experiment.
I found the following presentations especially fascinating:
- Yukie Nagai’s description of intelligence as a prediction machine that predicts environment states from prior examples using a body (in this case a robot arm) was highly salient.
- Matej Hoffmann’s talk about the need to truly embody AI, and about how much of the brain is tailored to the body (bats, for instance, don’t allocate much processing to vision), reinforced my idea that world models matter most: fundamentally all brains work the same way, they just use different sensors and bodies to tweak the environment.
- Jonathan Hurst gave a galvanizing, practical talk about making robots in the workplace human-centric, with practical insights into Digit, the robot his team is building.
- Rika Antonova’s talk on differentiable simulation was fascinating because it would enable learning from simulation, i.e. your inner world, and updating that model as you go: the central piece of intelligence.
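To make the differentiable-simulation idea concrete, here is a toy sketch of my own (not from the talk, and every name and number in it is made up for illustration): a tiny physics simulator written over dual numbers, so the mismatch between simulated and observed trajectories can be differentiated with respect to a physical parameter (a drag coefficient) and the parameter recovered by gradient descent — exactly the "update your inner world as you go" loop, in miniature.

```python
# Toy differentiable simulation: forward-mode autodiff via dual numbers
# lets us take gradients *through* a physics simulator, then fit a
# physical parameter (drag) to observed data by gradient descent.
# Illustrative sketch only; all values here are invented.

class Dual:
    """A dual number val + dot*eps: carries a value and its derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __sub__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val - o.val, self.dot - o.dot)
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def simulate(drag, steps=50, dt=0.1):
    """Particle slowing under linear drag; returns positions over time."""
    x, v = Dual(0.0), Dual(1.0)
    positions = []
    for _ in range(steps):
        v = v - drag * v * dt  # drag decelerates the particle
        x = x + v * dt
        positions.append(x)
    return positions

def loss(drag, observed):
    """Squared error between simulated and observed positions."""
    total = Dual(0.0)
    for sim, obs in zip(simulate(drag), observed):
        diff = sim - obs
        total = total + diff * diff
    return total

# "Observed" trajectory generated with the true (hidden) drag value.
true_drag = 0.3
observed = [p.val for p in simulate(Dual(true_drag))]

# Gradient descent: seed the dual part to differentiate w.r.t. drag.
drag_est = 0.05
for _ in range(500):
    grad = loss(Dual(drag_est, 1.0), observed).dot
    drag_est -= 1e-4 * grad

print(f"recovered drag: {drag_est:.3f}")
```

Because gradients flow through the simulator itself, the "inner world" (the simulator's parameters) is corrected directly by the mismatch with experience, rather than by black-box trial and error.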