In his recent CIO piece, "Private Clouds on Parade", Mike Altendorf wrote about the distinction between private and public cloud computing models. Both use the internet to deliver applications, but private clouds are run by IT departments for use by their own customers only, whereas public clouds are usually run by third-party providers for a range of customers.

In the private sector, part of the benefit of cloud computing is the ability to take advantage of multiple suppliers who provide utility applications that seamlessly scale, powered by virtualised datacentre resources delivered dynamically over the internet.

But how relevant is cloud computing to the public sector? After all, the public sector has traditionally sought to impose its own procurement and architectural models rather than taking advantage of what already exists in the marketplace. It also imposes its own silo functions onto third-party providers rather than modelling services around available commodity offerings, or around overall public service outcomes and citizen needs. None of this, on the face of it, is a good fit with the potential of cloud computing.

Despite this, we're already seeing some encouraging take-up of cloud services within the public sector, notably in universities, which are increasingly closing down their in-house email infrastructure and delivering the service instead through third-party cloud providers such as Google. Benefits include reduced utility infrastructure costs and the redeployment of universities' limited resources into more relevant areas of investment. This new approach also helps to foster a flexible, competitive marketplace of utility suppliers, all keen to secure the business.

Clouded judgement

But it's unclear whether these benefits can scale more widely in the public sector unless there are also changes in its approach to governance, architecture and procurement. The wrong approach to cloud computing could reduce the already limited competition in the UK IT marketplace, further consolidating power in the hands of privileged, elite IT suppliers.

Given that the public sector apparently accounts for some 55 per cent of all UK IT spend, its behaviour has a profound impact on the dynamics of the wider market. It's therefore important that public-sector IT leadership not only obtains the best deal for the UK taxpayer, but also helps ensure that the market dynamics and international competitiveness of a vital UK industrial sector are retained and enhanced.

Cloud computing presents a potential lever of reform to help deliver just such a competitive marketplace to the public sector, helping to jump-start a move away from the limited number of suppliers who currently run the majority of its IT services. The restoration of a functioning market is long overdue: just 11 IT companies control over 80 per cent of the public-sector market, with the top supplier rumoured to hold around 60 per cent market share on its own. Key to achieving improvements will be an effective governance regime underpinned by interoperability and open standards, including the use of openly published reference codes for data formats and interfaces with which all suppliers will need to comply.
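
To make that last point concrete, a minimal sketch of what an openly published data-format and interface definition might look like is given below, written here as a hypothetical TypeScript contract. Every name and field in it is an illustrative assumption, not drawn from any actual government standard:

```typescript
// Hypothetical sketch of an openly published data contract.
// All identifiers below are illustrative assumptions, not any
// real government specification; a genuine standard would be
// defined and versioned by an open governance body.

/** A versioned, supplier-neutral record format. */
interface CitizenServiceRecord {
  schemaVersion: "1.0";             // each published revision is versioned
  recordId: string;                 // globally unique identifier, e.g. a UUID
  serviceCode: string;              // code from an openly published service register
  createdAt: string;                // ISO 8601 timestamp, e.g. "2009-07-01T09:00:00Z"
  payload: Record<string, unknown>; // service-specific data, itself schema-governed
}

/** The minimal interface any compliant supplier would expose. */
interface ServiceProvider {
  fetchRecord(recordId: string): Promise<CitizenServiceRecord>;
  submitRecord(record: CitizenServiceRecord): Promise<void>;
}
```

The point of such a published contract is that any supplier, large or small, can implement and validate against it, which lowers the barrier to switching providers and keeps the market contestable.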

So is this likely to happen? The current signs are not promising. The Cabinet Office is currently leading work in this area whose approach to cloud computing has the explicit aim of just six or seven companies providing IT services to the public sector. It's hard to see how this sits with the innovation agenda and the government's policy commitment to support UK SMEs, rather than letting power consolidate further in the hands of a group of hand-picked large suppliers.

The 'G-Cloud' (Government Cloud) is defined as "a private cloud for the public sector that provides the services and benefits of public cloud offerings". Yet public cloud offerings are global, 24x7, and delivered over the internet by a range of competing providers. The usual requirements of the UK public sector, meanwhile, are for UK-based datacentres, compliant with UK security regulations, operating over protected networks and delivering heavily customised silo services.

In reality, the current concept of the G-Cloud seems more akin to an old-fashioned datacentre consolidation exercise. Far from taking advantage of the new approaches now possible through utility computing models, it could further reduce market competition, as well as challenging the resilience and diversity of the UK's critical information infrastructure. A cloud is not a cloud if it is constrained to the degree likely under the current G-Cloud proposals.

The G-Cloud approach seems a bit like the government saying it wants to take advantage of commodity electricity suppliers, only then to go to the market with its own definition of voltages and plug standards. Layered on top of this come further stipulations that providers to the public sector must build and run their electricity plant and distribution systems according to a bespoke government design rather than those that exist in the marketplace, with just a handful of suppliers then given exclusive supply contracts. This seems to be driven by the mistaken assumption that demand aggregation produces the best deal for the UK government, without regard to its consequential impact on the supply side.

The G-Cloud proposals illustrate the inverted approach to IT in Whitehall, which starts with a low-level technical solution (cloud computing/datacentre rationalisation) abstracted from the business, and imposes it without clear business drivers or an overall strategic vision.

The recent Digital Britain report appears to confirm that the G-Cloud concept is a bottom-up idea adrift without a business case when it states that "provided that the business case [for the government cloud] can be properly developed, the adoption of the G-Cloud will be a priority for Government investment". It's hard to think of many businesses in the private sector that would build a cart with square wheels and then look around wondering what's happened to the horse.

Neither is this the first attempt at consolidating government datacentres. An ambitious £83m project called True North previously aimed to rationalise government requirements into a few professionally run datacentres. The plan, as now, was to begin with a few key cross-government services, such as the Government Gateway and Directgov (then called UKOnline) and then expand outwards.

The result was one of many public-sector IT procurement disasters to have been buried, apparently ending up with an out-of-court settlement when the main supplier sued the government. Some of the thinking and current slides doing the rounds look eerily similar to that earlier, failed approach.

Model reforms

Taking our heads out of the clouds for a moment, perhaps the wrong question is being asked. This should not be about how to force cloud computing into a distorted implementation. The real issue is to understand the IT capabilities the public sector requires, and then to go to the market to procure that capability, rather than once again constraining and specifying to the market how best to deliver a desired outcome technically. Government also needs to be willing to accept an 80 per cent fit if the service can be delivered cheaply and quickly from standard services sourced from multiple viable vendors.

Such a change in approach will require reform of the prevailing model of IT governance in Whitehall, integrating it properly with policymaking; this in turn will help lead to more effective models of architecture and procurement.

The idea of a G-Cloud could re-invigorate the supply of IT services to the public sector and provide a long-awaited and much-needed market correction. But unless public-sector IT is rooted in the needs of public policy there is every prospect that the current thinking around G-Cloud will go the way of its predecessor, True North, and end up down south.

About the author:

Jerry Fishenden is a director of the Centre for Technology Policy Research and a visiting senior fellow at the LSE.