In the good old days, so the story goes, managing infrastructure could not have been simpler.

Infrastructure amounted to whatever was in the datacentre; the network was the cabling from there to the six dumb terminals; and storage lived inside the cabinet. Ah, les beaux jours! Of course, this picture is far from accurate, and rose seems to have become the de facto tint of these memories. If everything was so rosy, why did we have the minicomputer revolution of the 1970s?

Organisations were fed up with the lack of responsiveness, the expense of mainframe cycles and the general rigidity. Infrastructure back then may have been easier to manage because it was geographically tightly bound, but it was not good enough for commercial data processing. Admittedly, the picture is now more complicated. Companies report an endless struggle to manage their networks effectively, especially in the areas of security and infrastructure management.

In recent years, increased demands for effective access controls on data have highlighted the difficulty of tightening security without crippling the business. Instead of the IBM diktat we now have the Microsoft equivalent: increased dependence on Active Directory and standardisation on Windows desktops have brought new challenges.

End of the line

One response might be to say enough is enough – this fleet of diverse devices, connections and security breakpoints is simply too much to manage.

Indeed, that has been the approach of a number of organisations in this year’s MIS 100. In early 2006 energy and exploration giant BP went public with a bold plan to take 18,000 of its 85,000 laptops off the LAN, claiming it would make the business more secure because a firewall gives organisations a false sense of security. BP is a member, along with other UK industrial heavyweights such as ICI and Rolls-Royce, of a user group called, tellingly, the Jericho Forum. The body is convinced networks can only be secured through greater ‘deperimeterisation’ and open standards. In this view, better infrastructure management and security come from a ‘less is more’, decentralised approach.

This has also been a solution flagged by Gartner. Last year, at the firm’s midsized enterprise summit, Tom Austin suggested that in the face of the coming explosion in wi-fi devices, 3G-enabled laptops and customised desktops, the intelligent IT manager’s response should be to let users take some of the strain, since there is going to be too much to manage and secure. “To avoid drowning in management overhead a new policy is needed: don’t touch,” he told delegates. In this scenario the central IT function devolves budget so that business units and users can buy and manage their own kit, so long as it conforms to some loose form of central standard.

This puts the security onus on individuals, with the ultimate aim of a self-managing network. I suspect this will be a concept too far for many IT leaders, and how it would play in the more conservative boardrooms heaven only knows.

So what’s on the agenda today is the management of complexity. One option is to give it to someone else to look after, which is why God invented outsourcing and managed services. But if you don’t wish to go down that route – perhaps you can’t see what specific savings it would deliver, or your network is too core an asset to be put outside your control – then the best advice seems to be: the more central guidance on offer, the better.

An example here is the successful network refresh at Hampshire County Council, whose new thin-client topology is generating cost savings of £4 million a year – half of which come from that improved infrastructure. There was investment made up front to get there, says the council, but a simplified architecture seems to be the best way to improve efficiency. A bit like the good old days, you say? Stop showing your age.