From the New York Times: researchers implanted tiny sensors directly into two monkeys’ brains and wired the sensors to a robotic arm. The monkeys were able to learn to control the arm and feed themselves with it using just their thoughts. Quite a stunning example of brain-machine interface technology. Of course, this is nowhere near ready for prime time (human use), since it still requires a brain implant and a wire through the skull and scalp, but as a proof of concept for what is possible, it's pretty heady stuff. (Sorry, couldn’t resist the pun.)
I’ve written before on my blog and the Foundry Group blog about our view that human-computer interaction (HCI) will undergo a substantial evolution in the coming years, as increased computational power and new sensors and input devices allow us to move beyond the mouse-and-windows UI paradigm. While this particular example points to a future beyond the typical VC investment time horizon, there is ample opportunity in new interfaces and applications that don't require a direct neural interface. And it shows we are in for a wild ride, and for drastically increased intimacy with our machines.