Amid the continued maturation of cloud computing and increasing competition between tablet manufacturers, 2011 is expected to see greater use of multi-terabyte datasets for business intelligence and analytics, otherwise known as "big data".
According to statistics from IDC, data use is expected to grow by as much as 44 times, amounting to some 35.2 zettabytes (ZB; a billion terabytes) globally. At the same time, the size of individual datasets has also climbed, increasing the need for processing power to analyse and make sense of them.
Storage giant EMC notes that up to 1000 of its customers currently utilise more than a petabyte of data on its arrays, a figure expected to grow to 100,000 by 2020. Some customers will also begin to utilise a thousand times that amount - an exabyte or more - within the next two years.
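For readers keeping track of the scale, the decimal storage units the figures above rely on relate as follows. This is a minimal illustrative sketch (the variable names are our own, not from any vendor):

```python
# Decimal (SI) storage units, as used in the article's figures.
TB = 10**12        # terabyte, in bytes
PB = 10**3 * TB    # petabyte = 1,000 TB
EB = 10**3 * PB    # exabyte  = 1,000 PB ("a thousand times" a petabyte)
ZB = 10**3 * EB    # zettabyte = 1,000 EB

# A zettabyte is a billion terabytes, as noted above.
assert ZB == 10**9 * TB

# IDC's projected 35.2 ZB, expressed in terabytes.
print(f"35.2 ZB = {35.2 * ZB / TB:.1e} TB")  # prints 3.5e+10 TB
```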
"Every time we've estimated the growth rates, we've been wrong, and we've always been wrong in the wrong direction," president of EMC's unified storage division, Rich Napolitano, said.
"It's always growing faster. Use cases expand; it's been true for 30 years and the data types are richer and richer."
According to Melbourne IT CTO, Glenn Gore, the big data phenomenon has been on the radar for some 18 months already.
"A lot of our current innovation and research is around analytics of these big datasets and how you manage them, and how you get meaningful information from what has traditionally been too big to view or handle in traditional technology," he said.
The hosting company has seen various companies in the areas of health, geospatial imagery and digital media begin to build datasets that extend into multiple terabytes, often taking more than a day to analyse, split over several systems.
While current hardware is certainly capable, Gore said the awareness and commercial models needed to handle such datasets were not yet receiving proper consideration.
Some analysts have pinned continued data growth on disproportionate increases in video traffic over the internet. But according to Ideas International storage analyst, Christian Ober, the rising feed of data from smart devices, such as new electricity meters, will likely contribute more substantially to these huge datasets.
"It's about having a sea of sensors out there, having real-time data coming through to be analysed," he said.
"The wind is brewing around [big data]. My personal opinion is that the drive behind this is... probably from some of the very large US-based organisations - like Google - in terms of wanting to go through and get the compute working with the storage in a highly optimised way."
Vendors are vying for the prize too. IBM's US$1.7 billion acquisition of Netezza in September last year came amid EMC's own acquisitions of Isilon, for US$2.25 billion, and Greenplum, for an undisclosed sum, between July and November 2010.
Oracle too has begun to build its optimised data warehousing and analysis capabilities, in hopes of securing contracts as soon as companies voice their requirements.
"We're already talking with a lot of companies in Australia and in other markets," president of EMC's Asia Pacific and Japan region, Steve Leonard, said.
"The expectation for us is, as we do a good job of communicating [big data] to the market, and as the market says 'we can do that', that's going to be one of those sort of growth slopes."
The vendor also hopes the new version of its flash caching technology will push the use of big data among large enterprises. The feature, available since December last year, was beta tested by Melbourne IT and some 60 others, including two global banks which, according to some at EMC, clamoured to get in on the trial.
At the other end of the scale, small storage arrays are also likely to see increased use throughout the year, with EMC's recently announced VNXe entry-level products taking on similar offerings from long-time rival, NetApp.
Though primarily targeted at small business users unable to afford larger arrays, Computerworld Australia understands representatives from the likes of Woolworths, Macquarie Infrastructure, Telstra and Melbourne IT expressed interest in utilising the smaller arrays for remote offices and areas with lighter on-site storage requirements.