There has been an interesting evolution in the way that software is deployed in enterprises.

When I started as a database administrator in the 1980s, everything was on the mainframe. This gave a high degree of central control and was efficient in many ways, but proprietary hardware and operating systems meant high costs.

Minicomputers offered dramatically reduced capital costs and boosted creativity among software developers. The dawn of the PC era meant that significant applications were now deployed directly on to desktop devices, or in client/server configurations. This greatly lowered the entry costs for software developers but also gave considerable headaches to those having to manage corporate software deployments. Getting even a simple update deployed became a major exercise, users installed all sorts of goodies on their PCs, and a fleet of PCs of varying ages made it hard to keep operating system versions in line, complicating application deployment still further.

Companies tried to reel in the chaos by rolling out ‘standard desktops’ as pre-configured bundles of operating system and approved applications that had been tested to work together. But the process of rolling out such a package to a corporate environment was a major project in itself, and end users inevitably chafed at having their freedom restricted, wishing to deploy the latest applications quicker than their colleagues could test them.

Web-based applications have further reduced the barriers to entry for software developers but have brought a new diversity of operating environments.

Letting the application genie out of the mainframe bottle also complicated life for software developers, as has the profusion of rival operating systems. As an enterprise software developer you have to worry about getting your software to work properly on different versions of Microsoft and Unix operating systems, on a range of database platforms (Oracle, DB2, SQL Server, MySQL and so on) and a range of application and web servers (Apache, Tomcat, WebSphere, JBoss). The sheer number of permutations of these makes comprehensive testing, even using automated testing suites, very difficult.
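The arithmetic behind that permutation problem is simple but sobering. A minimal sketch (the platform lists below are illustrative, not any vendor's actual support matrix):

```python
from itertools import product

# Hypothetical support matrix: four entries in each category.
os_versions = ["Windows Server", "Linux", "Solaris", "AIX"]
databases = ["Oracle", "DB2", "SQL Server", "MySQL"]
app_servers = ["Apache", "Tomcat", "WebSphere", "JBoss"]

# Every combination of OS, database and app server is a distinct
# environment that would, in principle, need testing.
combinations = list(product(os_versions, databases, app_servers))
print(len(combinations))  # 4 * 4 * 4 = 64 environments
```

Add a few supported versions of each product and the count multiplies again, which is why even automated test suites struggle to cover the full matrix.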

This complexity slows everyone down. Software developers have to spend more time testing while corporate IT staff are constantly battling to maintain a reasonably standardised technical environment in the face of the demands from their customers for a work computer that's as up-to-date as their home PC.

However, the worm is turning with the advent of cloud computing. Pioneered by companies with vast datacentres that are only partially used outside of peak times (such as the run-up to Christmas), the idea is to offer a hosted deployment environment over the internet via a web browser. Because no software is deployed on end-user PCs, a standardised environment exists.

This represents an opportunity both for end user organisations and software developers. The software developer knows exactly which operating systems, databases and application servers are used in the cloud configuration, while end user companies essentially devolve to someone else the problems of maintaining a secure and available infrastructure.

I feel that this has the potential to be a dramatic shift in the way that applications are delivered to enterprises. A great part of the success of early hosted applications, for instance, was their ease of deployment, the software sitting as it did in the cloud.

There is a significant change in economics too. Instead of perpetual up-front licences, the model is usually a rental one, reducing capital costs. Companies have the prospect of avoiding capital expenditure on servers and can start to contemplate reducing, or even eliminating, their existing IT operations environment. Clearly a great deal of trust is required in such a move. At present, if there is a problem with the corporate application there is someone to shout at down the corridor, but in the cloud world you are one step removed. Some early cloud adopters found this out recently: in October the presciently named Danger, the Microsoft subsidiary that provides the software platform and hardware behind T-Mobile's Sidekick phone, managed to lose user data through an apparent failure to take proper backups.

Even if the data is eventually recovered, this is a wake-up call to those who have an entirely rose-tinted view of cloud computing.

But despite such early glitches, I believe that the sheer economics of cloud computing are compelling and that all but the most conservative corporations will begin to explore greater use of cloud computing facilities in the coming years. This has the potential to change the face of corporate computing, taking it full circle to something akin to that of the centralised environment of the mainframe that I grew up on.