Back in the mid-1970s I decided that the type of computer I really wanted was one implanted in my body and attached to my central nervous system, so that I could interact with it via nerve impulses back and forth. I wouldn’t be able to use it immediately, as I would first have to learn to emit signals along certain nerves — ones that, instead of being connected to various physical muscles, were connected to my computer, replacing input devices like a keyboard or pointing device. In the other direction, I would learn to process the signals emitted by that computer, just as I process the vision and sound delivered by my eyes and ears. In time it would happen without thinking, just as I don’t have to think about which nerves to activate in order to turn my head to the left, or how to interpret the signals from my eyes in order to form an image.
I suppose the computer-to-brain direction could be called a sixth sense. (But it wouldn’t allow me to see dead people.)
The “SixthSense” wearable computer recently developed at MIT introduces a new method of practical interaction that doesn’t use a conventional physical screen. It’s not an implant, but it’s a big step along the way. Watch the video of the TED 2009 presentation here. 8 minutes 42 seconds of mindblowingness.