Imagine you are a baby. From birth, you are put into a dark space. You can see a strange world around you: lights above you, passing by. Grey ground beneath you. Boxes that rise up on either side. A washed-out ceiling overhead.
You are told that there are lines you need to stay between and box objects on wheels you need to stay away from. There are strange, arbitrary rules about red symbols on top of sticks and colored lights in the air.
You have to adjust your speed, whatever that is, based on these signals. You don't really know if that's right or what it all means, but you try to memorize and find patterns in all the examples of the right behavior; you want to please your parents and do the right thing.
You get it wrong though, because you just don’t know what the world is. What are those lights? What are those lines? The boxes? The boxes on wheels? The grey ground? The blue above?
That baby is a self-driving car neural net. The sad fact is that it will never be intelligent, because it will never encounter enough variation to learn generalizations about the world. It won't realize that it is, itself, a car. That there are people. That people are fragile. That the car can hit people. It has a limited perspective because it is blinded by its creators, confined to the spotlight they shine for it, searching for the truth of reality by looking at only a tiny patch of it.