Robotics Transforms Digital Algorithms into Physical Algorithms
Sensors feed data from the physical world into the digital world. For example, light can be digitized via camera sensors, audio via microphones, and force via torque or pressure sensors. Most importantly, thoughts can be digitized via the keyboard and mouse - think writing or programming - and in doing so, thoughts become data.
Why input physical data? Because computers transform, distribute, and output that data at scale in useful ways. Code, audio, images, and any other input are transformed into web interfaces, predictions, iPhone applications, and virtual worlds.
There's just one problem: computing outputs so far have been constrained to the screen. Insights can only be read from a screen. Virtual worlds must be experienced through a screen. Web applications and iPhone apps live within screens. All data is read and interpreted by a human, via a screen, before the physical world can be changed.
Robotics allows computers to break free of the screen. Sorting a linked list becomes sorting packages in a warehouse. Pathfinding algorithms become driving Sally from San Francisco to Menlo Park. Procedurally generating worlds becomes 3D printing and assembling buildings.
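The mapping is literal: the same search code that finds a route through a data structure can plan a route through streets. As a toy sketch (the grid, the cell coordinates, and the `shortest_path` helper are invented for illustration, not taken from any real navigation stack), here is a breadth-first search that a robot could use to plan a path across a map of free and blocked cells:

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search over a grid of 0 (free) and 1 (blocked) cells.

    Returns the list of cells from start to goal, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}  # maps each visited cell to its predecessor
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # Walk the chain of predecessors back to the start.
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # goal is walled off

# On a screen, the result is a printed list of coordinates; on a robot,
# each step becomes a motor command.
route = shortest_path([[0, 0, 0],
                       [1, 1, 0],
                       [0, 0, 0]], (0, 0), (2, 0))
```

Whether the output drives pixels or wheels, the algorithm is identical; robotics only changes what consumes the result.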
Robotics will leverage all of the progress computers have made in the digital space and bring it to the physical space. The knowledge automation we've seen so far will pale in comparison to the physical automation that's yet to come.