The Apple Tablet, which it seems at this point Apple will reveal to the world on Wednesday, will very likely be a respectable accomplishment in the engineering of computing hardware. It will be light, with good battery life, a touch-sensitive display, and a nice big screen – it will probably also have impressive processing speed and power for its form and size. It will surely also keep a lot of laps very warm!
But what’s really impressive about Yet Another Computer? Sure it will be beautifully designed, but what’s it going to change about the way people use computers?
Computers are inherently limited by their input capabilities. I would argue that the reason the iPhone has been so innovative is simply that it incorporates more types of sensors and allows for different kinds of measurement and input. The early iPhone had an accelerometer, GPS, and a still camera all in one – at the time this was impressive because the greater computing power of the platform allowed for applications that turned the data from these inputs into even more interesting data. Social networking via mobile device finally became possible.
The next iPhone introduced a compass, video recording, and Voice Control – and all of these were also very interesting additions. Augmented reality applications now entered the fray, thanks to the compass. Streaming video and live broadcasting applications are now commonplace. Dictation applications are also starting to pop up (voice translation applications can’t be too far behind).
So my guess is that the new Apple tablet will only be truly innovative if it incorporates more sensor inputs. I don’t think this is actually too likely, because I’m not sure there are many other input devices that are “mainstream” at this point. Maybe infrared sensors, in the sense that the Wii uses them to detect motion and distance from Wiimotes? But I’m not seeing that just yet. Perhaps a gyroscope of some sort, to improve accuracy compared to an accelerometer? Both are possible, but neither really opens up new opportunities for innovative ways to use personal computing devices.
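For the curious, the reason a gyroscope improves on an accelerometer is worth a quick sketch: accelerometers give an absolute but noisy tilt reference, while gyroscopes give smooth angular rates that slowly drift. A standard way to get the best of both is a complementary filter that blends them. Here is a minimal one-axis sketch in Python – the readings, time step, and blend factor are all made-up values for illustration, not anything from a real device:

```python
# Minimal complementary-filter sketch: fuse a noisy accelerometer tilt
# estimate with a drifting gyroscope rate. All numbers are hypothetical;
# a real device needs calibration and full 3-D orientation math.

def complementary_filter(accel_angles, gyro_rates, dt, alpha=0.98):
    """Blend gyro integration (smooth but drifts) with accel tilt (noisy but absolute)."""
    angle = accel_angles[0]  # initialize from the absolute reference
    estimates = [angle]
    for acc, rate in zip(accel_angles[1:], gyro_rates[1:]):
        gyro_angle = angle + rate * dt                   # integrate angular rate
        angle = alpha * gyro_angle + (1 - alpha) * acc   # nudge toward accel to cancel drift
        estimates.append(angle)
    return estimates

# Hypothetical readings: device held steady at roughly 10 degrees of tilt.
accel = [10.0, 9.5, 10.4, 10.1]   # noisy tilt measurements (degrees)
gyro = [0.0, 0.1, -0.1, 0.0]      # near-zero angular rates (deg/s)
estimates = complementary_filter(accel, gyro, dt=0.01)
```

The point of the blend factor is that the gyro dominates moment to moment (smooth motion tracking) while the accelerometer quietly corrects long-term drift – which is roughly why the two sensors together beat either one alone.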
Here’s a short list of additional types of inputs and innovative ways they could be used:
1) EEG/neural interfaces – biofeedback to determine my mood; an iPod that plays sad or happy music to match my feelings when appropriate. On/Off control by thought, change volume by thinking “louder” and “softer”, the ultimate remote control. Agents that suggest things to do based on my current mental state. (“Go to bed, you’re tired.”)
2) Temperature sensors – computer wakes up when I enter the room and the temperature rises. Automation of heating and cooling and lighting systems.
3) Biosensors – detect my internal/external temperature, my heart rate, blood pressure. Health applications are obvious. But how about using these to detect moods? Or lies, for that matter? Who doesn’t want to carry around a personal lie detector in their pocket that buzzes them when the person across from them changes their tone of voice, captures their microexpression on camera, and notices their skin temperature rising slightly? EMG (electromyography) to detect when my muscles are flexing, so the device can react to a twitch or flicking movement of a hand.
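The pocket lie detector above is really just multi-signal fusion: no single physiological cue is reliable, but several weak cues combined into one score might be. A toy sketch of that idea in Python – the signal names, weights, and threshold are entirely invented for illustration, and real biosignal processing is vastly more involved:

```python
# Toy multi-signal fusion for the hypothetical "pocket lie detector".
# Signal names, weights, and threshold are invented for illustration.

WEIGHTS = {
    "voice_pitch_shift": 0.40,  # change in vocal tone, normalized to 0-1
    "microexpression": 0.35,    # camera-detected facial cue, 0-1
    "skin_temp_rise": 0.25,     # slight skin temperature increase, 0-1
}

def deception_score(signals):
    """Weighted sum of normalized cues; higher means more suspicious."""
    return sum(WEIGHTS[name] * value for name, value in signals.items())

def should_buzz(signals, threshold=0.6):
    """Buzz the wearer when the combined score crosses the threshold."""
    return deception_score(signals) >= threshold

calm = {"voice_pitch_shift": 0.1, "microexpression": 0.0, "skin_temp_rise": 0.2}
tense = {"voice_pitch_shift": 0.8, "microexpression": 0.7, "skin_temp_rise": 0.6}
```

Under these made-up weights, the calm profile scores well below the threshold and the tense one well above it – the interesting part is that the tablet would finally have enough sensors to feed such a score in the first place.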
All of this doesn’t even approach the topic of innovative outputs. Projectors pop immediately to mind, and projection technology is just peeking over the horizon now. Personal projectors are not far off. And painting an image onto a wall with a projector, then feeding it via camera into an augmented reality app for further enhancement, could create some really interesting positive feedback loops.
I don’t believe that the new Apple tablet will incorporate any of what I’ve described above. But this is the path that future innovation needs to take – keep adding more input devices to collect more types of data – as the iPhone’s own evolution clearly demonstrates. I do think it’s likely that the tablet will have gesture and handwriting recognition, in the form of pen computing. It would be ridiculous for Apple to release a tablet without it, and there was some previous evidence of work at the OS and API levels showing recognition of handwritten Chinese and Japanese characters. I think it’s also possible that there might be increased haptic sensitivity – better measurement of pressure on the screen surface, maybe even temperature measurement, and possibly vibrational feedback. A front-facing camera might also be a no-brainer; dual cameras are better than one or none.
The bottom line is this: the future of computing is nigh, and we are all going to become Batman. And it’s going to be awesome.