“Look at those cavemen go.” When Mr Bowie sang this, I’m not sure it was a positive comment, but I’m severely limited in my ability to undertake such literary criticism. I wrote my last essay when I was 16, in stark contrast to some of my university friends who could produce a convincing argument that a RAID array controller manual was a commentary on the struggles of the individual in modern society. Actually, when I hear about those cavemen, it strikes me that they were rather impressive.

So put aside your day-to-day concerns – that virtualisation project or new firewall can wait a few minutes. Step back and let’s consider those cavemen and their defining quality: their information. After all, information is the ‘I’ in ‘CIO’.

I live in a Suffolk village; nothing happened here for 1000 years or so before it all hotted up. The church tower was used to look out for Vikings – gatecrashers who would turn up, have a party, take the booze and leave with the cute women. Even this only happened once every 50 years or so, so information flow was very limited. OK, when the Vikings did arrive, it did require real-time data processing.

Not a lot changed for another 500 years before a major informational leap forward with newspapers, practical post and the railway, which meant that for the first time people moved more than 10 miles from their birthplace. Even in my mum’s 1940s rural Ireland, there were so few phones in the town that communications revolved around boys fetching the relevant person when a call came in.

By the 1960s, the local accountant got 10 letters a day and some phone calls. That same person now gets 200 emails, works with people all over the world and is on the phone anywhere, anytime.

If we plotted the amount of information being processed by our Suffolk dweller over history, it would look like the curve you get when you drop a match into a can of petrol – not much for a long time, then an explosion.

As the information flow, or Input-Output (I/O), has grown by orders of magnitude, the ability of our brains to produce output has kept up. This tells us that the human being throughout history has been I/O-bound, not processor-bound.

As we fix this I/O blockage with the internet, videoconferencing, mobile communications and so on, the brain gets more and more done. In fact we can now network those brain processors into much more powerful, multiprocessor brain systems as modern collaboration systems allow distributed teams to act as one.

These changes, however, are not without serious effects. The productivity of the best land tiller in medieval Suffolk would not have been so different to the worst, whereas the ability of the most creative, skilled and experienced people is now magnified by the idea I/O explosion, making them, in a productivity sense, many times more valuable than the least skilled. This will lead to further questions for society, faced with the digital genius who creates value worth tens of millions alongside the disadvantaged underclass from a sink estate. These questions will only get more complex for us to address.

When I first started out, I worked with hardware. To design a chip, you sent off for a data sheet. Four weeks later it arrived and you started work. Now not only is the data sheet instantaneous but I can find 20 people online who can tell me their experiences of the device.

So are we becoming processor-bound? Is there now so much idea I/O that we are snowed under by email, tweets and instant messages? If so, it's time for the I/O network to become intelligent, filtering out the idea packets we do not need to process. Some of these packets are obvious, spam for instance, but I have been surprised by how many of the idea lumps I receive simply waste my processor time. In a recent crisis of throughput, I did the usual thing of dragging my entire inbox to the trash, the theory being that anything important would reappear. In fact, only about half did.

For the future of IT, this is a key question. Until now, IT has moved, stored and routed content without ever having to understand its meaning; that was left to the human processors. If those human processors are running out of capacity to handle everything arriving at their input buffers, we are forced into the next era of IT, one in which the systems themselves must become intelligent, understanding content well enough to filter, process and route what we need by virtue of its meaning. IT can no longer treat content like tins of peaches to be moved around – the tins will have to be opened and tasted.
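To make the tin-opening idea concrete, here is a minimal sketch, purely illustrative rather than any particular product's approach, of a filter that scores incoming messages against the topics a reader cares about and quietly drops the rest. Every function name, threshold and sample message below is invented for this example.

```python
import math
import re
from collections import Counter


def bag_of_words(text: str) -> Counter:
    """Split text into lowercase word counts (a crude stand-in for 'meaning')."""
    return Counter(re.findall(r"[a-z']+", text.lower()))


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def route_by_meaning(messages, interests, threshold=0.2):
    """Keep only messages that look relevant to the reader's interests."""
    interest_vec = bag_of_words(" ".join(interests))
    return [m for m in messages if cosine(bag_of_words(m), interest_vec) >= threshold]


if __name__ == "__main__":
    inbox = [
        "Quarterly figures for the Suffolk office are attached",
        "You have won a prize, click here to claim it",
        "Minutes from the virtualisation project steering meeting",
    ]
    interests = ["virtualisation project", "quarterly figures", "Suffolk office"]
    for msg in route_by_meaning(inbox, interests):
        print(msg)
```

Run as written, the prize scam scores zero against the reader's interests and never reaches the input buffer, while the two work messages get through; real systems would of course need something far richer than word counts to judge meaning.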

As for Life On Mars, if it is there, it’s microbial, so to catch up with the cavemen those microbes have a long way to go.
Next month, Bohemian Rhapsody and its role in business process management.

About the author:

Mike Lynch is the founder and CEO of UK software company Autonomy.