Most of us still use a PC with a QWERTY keyboard and mouse, yet the first IBM PC arrived in 1981, QWERTY dates back to the 1860s, and Douglas Engelbart demonstrated the mouse in 1968.

But things are speeding up. The iPhone was the first touchscreen device to capture the imagination of buyers and now the squeeze, pinch and expand metaphor is well established across phones, tablets and screens.

Microsoft's Surface PC uses a combination of touch and gesture to control a computer that looks more like an electronic coffee table. Bump is an app that lets iPhone and Android handset users share contact information by 'bumping' their phones against each other, triggering a sensor-based response that synchronises encrypted data.

Most intriguing of all, perhaps, has been the success of the Nintendo Wii and Microsoft Xbox Kinect, which use cameras, sensors and microphones to make gesture recognition a physical form of input.

But what does this mean for business?

Quite a lot, as it turns out. At Logica, Danny Wootton, our UK Innovation Director, has been sponsoring a project in which one of our developers has connected a Kinect sensor to a Windows PC so that gesture can be applied to business applications.

So, for example, you can wave to change slides in PowerPoint, or rotate a map of the globe and zoom in on countries to discover local information.
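As a rough illustration of the idea (and emphatically not Logica's actual implementation), a slide-change gesture can be reduced to a simple test over a window of tracked hand positions from the sensor's skeleton stream. The sample data, thresholds and function names below are all assumptions for the sketch:

```python
# Sketch: decide whether a window of a hand's horizontal positions
# (in metres, e.g. from a depth sensor's skeleton stream) forms a
# swipe that should advance or rewind a slide deck.

SWIPE_DISTANCE = 0.35  # minimum horizontal travel in metres (assumed)
MAX_WOBBLE = 0.10      # tolerated backtracking mid-gesture (assumed)

def detect_swipe(hand_x_samples):
    """Return 'next', 'previous' or None for a window of x positions."""
    if len(hand_x_samples) < 2:
        return None
    travel = hand_x_samples[-1] - hand_x_samples[0]
    # Reject noisy traces that reverse direction too much mid-gesture.
    lo, hi = min(hand_x_samples), max(hand_x_samples)
    if (hi - lo) - abs(travel) > MAX_WOBBLE:
        return None
    if travel >= SWIPE_DISTANCE:
        return "next"       # left-to-right wave: next slide
    if travel <= -SWIPE_DISTANCE:
        return "previous"   # right-to-left wave: previous slide
    return None

# A steady 0.4 m rightward wave maps to 'next slide'.
print(detect_swipe([0.00, 0.10, 0.22, 0.31, 0.40]))
```

In a real application the recognised gesture would then be forwarded to the target program, for instance by synthesising the keystroke PowerPoint already uses to advance a slide.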

We foresee this deep, visceral interactivity leading to some amazing, movie-like scenarios.

Think of security scanning that would let staff search through baggage without personal risk, employees practising first aid on virtual patients, sophisticated robotics being physically controlled, or voyaging through the body to examine diseased organs.

Some of these might appear far-fetched, but gesture as part of day-to-day business might not be more than a couple of years away as applications become gesture-enabled and the technology improves to the point where even finger and eye movement can be precisely tracked.

As Danny says, the only possible downside is that health and safety rules will need to be tweaked to deal with the dangers of employees waving their arms and legs around!

And more forms of input are coming. The demand for hands-free input will surely lead to the eventual success of speech recognition and natural language processing as a means of controlling and dictating to systems rather than just accessing them.

Work on projects such as the Eurofighter Typhoon jet and the French Puma helicopter, which have used speech recognition, has helped the technology mature.

But having speech embedded and working in a mass-market product will be the 'Eureka' moment. Similarly, handwriting recognition is overdue a breakthrough app. Once cursive script can be instantly converted to ASCII text, a whole new computer usage model will appear.

Some pundits imagine computers that are more like cars that we 'drive' using other parts of our bodies. We might use a wheel to control the system and pedals to accelerate forward in simulations, or use gloves to add a tactile experience while goggles enhance a 3D effect.

At this point, the business computer blurs the line between work tool, arcade game and certain scenes from the film Minority Report.

In the future we may go even further: neurological computing will see our brains control what we can make systems do. There will be ethical concerns, and we might learn more about the mind than we gain in useful applications, but when we reach the stage of the brain-computer interface we may feel that we really have come a long way in terms of input.

Craig Boundy is UK CEO of Logica

Pic: SMI Eye Tracking cc2.0