It has been 136 years since the first commercially successful typewriter was launched, by E Remington & Sons. Various machines for writing had been around for years; an early patent dates from 1714. But the Remington had a keyboard laid out in the now-familiar QWERTY format, designed to stop the typebars that carried the characters getting tangled as the typist got faster.
For the first time, a machine enabled a person to type faster than they could write by hand, and the device rapidly became a part of everyday life. So when the personal computer came along in the 1970s, the keyboard was the natural means of inputting data; apart from the mouse, which reached the mainstream in the 80s, the dominant way of talking to our computer is still a device launched in the year Napoleon III died.
Today, though, we do very different things with computers. We watch TV on them, play games, make phone calls and remix music on them. The poor old keyboard is starting to look a bit past its sell-by date.
Now, we can touch, stroke, talk to, wave at and even think at our computers to get them to do what we want. This is changing our relationship with technology and fuelling the pervasiveness of computing as it is unshackled from the cumbersome keyboard.
Multi-touch devices such as the iPhone have started to wean us off our keyboards - although, while 100% of creative directors have one, just 2% of UK humans do, so we tend to exaggerate their ubiquity slightly.
Nonetheless, touchscreen interfaces appear on handsets from all manufacturers, and touch is built into Microsoft's new operating system, Windows 7. Right now, touchscreens are available on high-end laptops and monitors from several manufacturers, but within two years they could be as common as the built-in camera above the screen.
Even this starts to look dated compared with Emotiv's $300 Epoc headset. Its built-in sensors detect brain activity, while a gyroscope tracks head movements; both can be used to control the user's computer. Expected initially to find favour with gamers, the headset makes gaming more immersive: frown and your avatar frowns; smile, shake your head, nod or blink and your little friend follows suit.
It can also detect cognitive control actions - where you 'think' something to happen. You can push, pull, lift, drop and rotate items you are looking at, the blurb claims, in six axes - three more than the usual three we are lumbered with.
Next year, perhaps. But the input method that has changed the gaming world in recent years has been gesture.
Pioneered in gaming by Nintendo's Wii, the accelerometer has become the way to tell your device what to do - turn your phone on its side to display pictures in landscape, shake your MP3 player to skip to the next song.
Next year, Microsoft is expected to move the technology on further. Project Natal comprises two cameras that sit on top of your TV, connected to an Xbox. The cameras can recognise individuals, letting you change channels or alter the volume by waving at the screen from your seat on the sofa.
What this will do for family fights over what to watch remains to be seen. But if you saw Professor Beattie's persuasive presentation at the recent Marketing Society Conference, on the capacity of our gestures to give away our true feelings while our words deceive, you will see that bringing gesture control to computers could have intriguing consequences.
Will our computers know what we really mean, despite what we say? Will they be able to detect lies, and carry on selling us that car when we have said we can't afford it? And will they tell us?
Andrew Walmsley is co-founder of i-level
30 seconds on body language
It is estimated that humans send out up to 700,000 different body signals. Research suggests that the face is capable of more than 250,000 expressions, while 5,000 distinct hand gestures have been identified.
Some researchers in this field believe that non-verbal signals may account for as much as 80% of all of our communication, although the consensus puts the figure at about 55%.
When a person folds their arms in an amicable situation, they may be thinking about what is being discussed, but in a confrontational scenario it may be an expression of opposition.
Communication through body movement is known as kinesics, a term coined in the 1950s by anthropologist Dr Ray L Birdwhistell.
When we are excited or afraid, our pupils dilate due to the release of adrenaline.
The brain tells the body how to position itself. Body language is subconscious, working to keep outward appearance in balance with inner feelings.