Datacentres began to grow in importance during the dotcom boom when organisations began to realise the true potential of the internet. But unlike many of the headline-grabbing online upstarts of that time, datacentres are still here and here to stay.

Regardless of scale, datacentres share common issues: increasing pressure to improve their green credentials for one thing, and a need for CIOs to understand what they have and what they are working with for another.

Datacentre consolidation, whether through acquisition or necessity, has also had an impact, to the extent that when firms are pressed for the exact details of their server performance and footprint, they are not always able to provide them. Not because the information is not there, but because they do not know how to access it.

Datacentre infrastructure management (DCIM) tools are an important part of the CIO’s armoury and, when used correctly, can boost server performance, reduce energy consumption and improve an enterprise’s bottom line.

Much of the talk surrounding datacentres today concentrates on the cloud and virtualisation, the less tangible sides of the architecture. But there are other elements that can be overlooked as firms try to do more with their server estate, or indeed less with it, and it is here that DCIM tools come into their own.

Although virtualisation and the cloud turn servers into utilities, DCIM looks at them as part of a larger structure – building blocks, if you like, within a much bigger whole.

DCIM tools and software give a much needed and very finely detailed view across a datacentre, analysing everything from the way that power is being consumed to performance issues and problem hotspots. They can be used to notify users of current performance metrics, but also to predict what impact any changes could make, and can help in identifying where power or systems might be better used.

As well as providing this key information, DCIM can be used to drive down costs, particularly now that rising power prices are an issue for CFOs. Being able to act on this information also helps firms meet their corporate social responsibility objectives.

The number of vendors offering DCIM services, which are pegged as somewhere between datacentre management and facilities management, is predictably large.

In a report released at the end of last year, research firm IDC produced a list of the top 15 players, those firms that it sees as leading the market and offering the best products for enterprises.

The make-up of the list, which includes 1E, Arch Rock, CA, Cirba, Emerson Electric, HP, IBM, Modius, nLyte, PowerAssure, Raritan, Rittal, Sentilla, Viridity and VMware, shows just how complex the choice is, since it is made up of acknowledged enterprise firms such as HP and IBM as well as other, newer, server players such as virtualisation darling VMware.

At the most basic level, DCIM services should meet at least some of the following organisational requirements: asset discovery, analysis of power consumption and trends, and real-time measurement and reporting of capacity.
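To make the power-trend requirement concrete, here is a minimal sketch of the kind of analysis a DCIM product automates. The rack name and readings are hypothetical; real tools discover assets and collect this telemetry themselves.

```python
# Minimal sketch of DCIM-style power-trend analysis. All names and
# readings are illustrative assumptions, not from any real product.

def power_trend(readings):
    """Return average draw (watts) and whether consumption is rising,
    comparing the mean of the newer half of readings to the older half."""
    half = len(readings) // 2
    older, newer = readings[:half], readings[half:]
    avg = sum(readings) / len(readings)
    rising = sum(newer) / len(newer) > sum(older) / len(older)
    return avg, rising

# Hypothetical hourly readings for one rack, in watts
rack_readings = [4200, 4300, 4250, 4500, 4600, 4700]
avg_watts, rising = power_trend(rack_readings)
print(f"average draw: {avg_watts:.0f} W, rising: {rising}")
# → average draw: 4425 W, rising: True
```

A real deployment would of course work from thousands of sensor feeds rather than a hand-typed list, but the underlying question – what is the draw, and which way is it heading – is the same.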

Katherine Broderick, the IDC senior research analyst responsible for the report, said that the market was in no danger of diminishing, thanks to its worth to CIOs as a cost-saving opportunity, and would create an impressive market in its own right.

“DCIM will grow to shape datacentre facilities and IT operations for years to come,” says Broderick.

“DCIM’s combined software and services revenue will grow from $179.4m (£110m) in 2010 to reach $557.7m (£342.1m) by 2015.”

Analyst firm Gartner backs up the IDC findings, saying in a report last year that an increased push for standards around datacentres and their energy use would make the need to cut power consumption and waste more pressing. The report estimated that using DCIM tools could cut power consumption by around 20 per cent.

According to Gartner, adopting the technology will require some changes, predictably in the way that IT and facilities teams monitor their systems, but the firm said the experience would be rewarding in a number of ways.

The author of the Gartner report, David Cappuccio, told CIO that DCIM had evolved from an early by-product of the green IT agenda into a far more sophisticated and mature proposition.

“DCIM is an offshoot of the green IT initiative and originally was designed to do basic energy monitoring, reporting and management at the datacentre level. This was really not done in the past, or if it was, it was rudimentary,” Cappuccio says.

“Facilities teams would monitor the power income, air conditioning and the like, but nobody monitored server consumption, storage or networks from an efficiency point of view – only from a performance viewpoint.”

Efficiency drive
The benefits of DCIM were immediate, according to the Gartner analyst, who notes that for decades all server admins have been able to tell the CIO is whether a system was alive or dead. Surprisingly, he adds, this was often enough information.

“Think of it this way – for the past 40 years systems management tools have fundamentally told us two things: is the system up, or is the system down. And that’s all IT cared about. Availability was the key mantra – 99.96 per cent uptime for everything was the focus, yet nobody said ‘Oh, and make it efficient too’,” Cappuccio explains, adding that as well as making the CIO’s job easier, the benefits would also reach into other departments, including finance.

“DCIM tools are focused on efficiency and consumption – essentially improving the compute-per-kilowatt ratio,” he says.

“When a 10,000-foot datacentre can easily consume $2.3m (£1.4m) in energy in a single year, CFOs are starting to ask why, and what IT can do about it.

“Without monitoring you can’t identify the problem and solve it.”
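Cappuccio’s headline figure is easy to sanity-check with back-of-envelope arithmetic. The tariff and average draw below are illustrative assumptions, not figures from the article, but they show how quickly a facility’s bill reaches millions.

```python
# Back-of-envelope sketch of an annual datacentre energy bill.
# The tariff ($0.10/kWh) and average draw (2,600 kW) are
# illustrative assumptions chosen to land near Cappuccio's figure.

HOURS_PER_YEAR = 24 * 365  # 8,760

def annual_energy_cost(avg_draw_kw, tariff_per_kwh):
    """Annual cost of a constant average electrical draw."""
    return avg_draw_kw * HOURS_PER_YEAR * tariff_per_kwh

cost = annual_energy_cost(2600, 0.10)
print(f"${cost:,.0f} per year")  # → $2,277,600 per year
```

In other words, a facility averaging around 2.6MW of draw at a fairly ordinary tariff lands close to the $2.3m a year Cappuccio cites – which is exactly why CFOs are asking questions.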

By making the best of what they already have, firms can start to worry less about where they are wasting money in a datacentre and concentrate on making it run better where it is. This should help solve the perennial problem of the average server running at only around 15 per cent capacity.

“The whole focus is going to change towards increased compute density per square foot in order to defer capital spend on that new datacentre, so understanding consumption can highlight where workloads should most optimally be placed, and how best to allocate resources,” adds Cappuccio, who explained that CIOs and CFOs could expect considerably smaller year-end server hardware and datacentre bills, while IT departments would find themselves dealing with far less ferocious beasts.

“Cost savings are really pretty surprising. Used appropriately we have seen companies improve overall energy efficiency in datacentres by between 15 and 30 per cent. Improved efficiency also implies improved usage on equipment (higher and more consistent utilisation levels) and thus longer timelines between new installs,” says Cappuccio.

Physical concerns
Gartner recommends that firms evaluate the technologies early, with a view to rolling them out more widely at a later date, and in its paper, DCIM: Going Beyond IT, says that CIOs should work closely with their facilities teams on any projects to make sure they are getting the best-performing systems they can.

Working with facilities may seem an obvious point, but it’s one that could easily be overlooked, according to Colin Richardson, CEO at datacentre specialist On365. The finer parts of the datacentre, which include the way that it is set out, must be taken into consideration.

“The old adage is still true that what you can’t measure, you can’t manage,” says Richardson. “Firms invest in a lot of datacentre technology but do not consider the physical infrastructure.”

Not doing this, he warns, means that organisations may miss out on many potential benefits, such as an improved use of energy, better corporate social responsibility ratings and much better measurement.

“Firms must be more agile,” he adds. “They demand better management and control and with this you get power savings. We know of firms that can save as much as $1m a year on power alone.”

Datacentres are of course about delivering services to the business, and they should not be a drain on a firm’s resources. As Richardson points out, datacentres take a lot of money to build and a lot to manage. In such a situation, he adds, “Costs are increasing and these are very expensive animals. If you can make any efficiencies you can save money.”

“Virtualisation gives you the ability to look into a server and to move around its resources; DCIM does the same, but at the physical layer,” Richardson says.

That last point is interesting, because according to Gartner’s Cappuccio, without DCIM even virtualisation becomes a moot point: how can you get the best from a server when you do not know whether that server is running to its full capabilities?

“Most companies today are virtualising a lot — some say 60 to 70 per cent, but there are two underlying issues that pop up all the time. First is that even virtualised servers are not running at decent utilisation levels on average — we’re hearing 25 per cent is the norm. [This] means there is still a lot of wasted performance out there,” says Cappuccio.

“DCIM tools can monitor server consumption and performance to help highlight this,” he explains.

“Second is the issue of virtual sprawl, whereby it becomes so easy to spin off another image for a customer (for example, for a quick test/dev project) that the number of dead images is growing rapidly. Some of the DCIM tools can now monitor virtual instances and based on consumption trends they can highlight productive versus non-productive instances as well.”
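The sprawl detection Cappuccio describes can be sketched very simply: flag any virtual machine whose utilisation never rises above a trivial level across a monitoring window. The threshold, VM names and samples below are all illustrative assumptions.

```python
# Sketch of the "dead image" detection Cappuccio describes: flag
# VMs whose utilisation stayed negligible over a monitoring window.
# Threshold, VM names and samples are illustrative assumptions.

IDLE_THRESHOLD = 0.05  # flag VMs that never exceed 5% utilisation

def non_productive(vm_samples, threshold=IDLE_THRESHOLD):
    """Return names of VMs whose peak utilisation stayed under threshold."""
    return [name for name, samples in vm_samples.items()
            if max(samples) < threshold]

# Hypothetical week of daily utilisation samples (0.0-1.0) per VM
samples = {
    "web-prod-01": [0.40, 0.55, 0.38, 0.61, 0.47, 0.52, 0.44],
    "testdev-old": [0.01, 0.02, 0.01, 0.01, 0.02, 0.01, 0.01],
    "batch-night": [0.02, 0.03, 0.90, 0.02, 0.02, 0.85, 0.02],
}
print(non_productive(samples))  # → ['testdev-old']
```

Note the use of peak rather than average utilisation: a nightly batch VM that idles most of the day would be wrongly flagged by a simple average, which is one reason the DCIM tools Cappuccio mentions lean on consumption trends rather than single numbers.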

Monitoring, or measurement, then, is key to the DCIM proposition, and should be part of any plans to save power or cut datacentre costs. It allows organisations to accurately measure their datacentre performance and analyse what they find.

This puts the IT team in a much better position to report on performance, but also to intelligently respond to problems such as server overloads or potential issues caused by power hotspots.
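Hotspot detection of this kind often comes down to comparing sensor readings against a ceiling. A minimal sketch, assuming hypothetical rack IDs and readings, and using the 27°C figure commonly cited (from ASHRAE’s recommended envelope) as the inlet ceiling:

```python
# Illustrative hotspot check: report racks whose inlet temperature
# exceeds a recommended ceiling. Rack IDs and readings are
# hypothetical; 27C is the commonly cited ASHRAE recommended maximum.

INLET_CEILING_C = 27.0

def hotspots(inlet_temps_c, ceiling=INLET_CEILING_C):
    """Return racks whose inlet temperature exceeds the ceiling."""
    return {rack: t for rack, t in inlet_temps_c.items() if t > ceiling}

readings = {"rack-a1": 22.5, "rack-a2": 29.1, "rack-b1": 24.0}
print(hotspots(readings))  # → {'rack-a2': 29.1}
```

A production DCIM suite layers trending and prediction on top of checks like this, so that a rack is flagged before it breaches the ceiling rather than after.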

Plugging the gaps
The remedies for power wastage sound obvious, like taping up the gaps between rack servers to prevent air leakage or running cold air through a server room. Yet such common-sense cures have been lost somewhere along the way, and DCIM can help bring them back into focus.

“People put a lot of thought into their datacentres without thinking about the power,” says On365’s Richardson.

“Profiling workloads and temperatures means that failure scenarios can be tested or spotted. As can unauthorised access through monitoring tools.”

On365 recently completed work at managed networking company Telstra International, part of the global telecoms organisation. After implementing DCIM tools, Telstra was able to cut its yearly energy spend by 15 per cent.

The impact of these savings was recognised by the industry and the energy efficiency project at Telstra’s managed hosting centre walked away with the Environmental Project of the Year gong at the Data Centre Solutions Awards ceremony in May.

Kevin Sell, head of technical facilities at Telstra, says that the benefits of the project extended well beyond a reduction in power spend.

“We have benefited from not only a reduction in power costs, but also a more energy-efficient and environmentally friendly setup,” he explains.

“The project is helping Telstra to support its compliance reporting targets such as the Carbon Reduction Commitment (CRC) and identify key areas for further infrastructure efficiency improvements in the future.”