This article was originally published on Technology.Info.
It’s no secret that bloated data centre infrastructures cost the organisations that run them small fortunes in power and cooling, yet IT teams are still expected to oversize kit in order to stay ahead of business demands for processing power and storage.
In recent years, this conundrum has prompted a wide range of IT suppliers to launch data centre optimisation services that promise to help customers streamline infrastructure, slash energy bills and, at the same time, continue to deliver on end-user expectations. These suppliers include, for example, Cisco, Fujitsu, Oracle and IBM.
The services provided typically incorporate an assessment of the current environment, along with a host of suggestions as to where improvements might be made: by investing in converged infrastructure, for example, or using virtualisation to consolidate many small servers into a smaller number of larger ones. Much depends, naturally, on the technology specialisations of the company delivering the service - and what kit its sales teams are under pressure to sell that quarter.
At the same time, the problem has also enabled a newer category of software - data centre infrastructure management, or DCIM - to find favour with data centre owners and operators. Originally conceived as basic registers of data centre assets - a task that many data centre managers still perform using humble Excel spreadsheets - DCIM products are evolving rapidly, increasingly incorporating sensor data and starting to integrate IT and facility management disciplines to provide centralised monitoring, management and capacity planning for critical systems.
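To make that evolution concrete, the sketch below models, in Python, the kind of data a DCIM tool centralises: a register of assets (the job once done in Excel) enriched with sensor readings that support capacity planning and hot-spot detection. All names here (`Asset`, `AssetRegister`, `hot_spots` and so on) are invented for illustration and do not correspond to any vendor's product.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """One rack-mounted device, as a hypothetical DCIM register might record it."""
    name: str
    rack: str
    power_draw_w: float    # measured draw from a power sensor, in watts
    temperature_c: float   # inlet temperature from a thermal sensor

@dataclass
class AssetRegister:
    """Minimal stand-in for the spreadsheet-style asset register the article mentions."""
    assets: list = field(default_factory=list)

    def add(self, asset: Asset) -> None:
        self.assets.append(asset)

    def rack_power(self, rack: str) -> float:
        """Total measured draw for one rack - the raw input to capacity planning."""
        return sum(a.power_draw_w for a in self.assets if a.rack == rack)

    def hot_spots(self, threshold_c: float = 27.0) -> list:
        """Names of assets whose inlet temperature exceeds a chosen threshold."""
        return [a.name for a in self.assets if a.temperature_c > threshold_c]

register = AssetRegister()
register.add(Asset("web-01", "R1", 350.0, 24.5))
register.add(Asset("db-01", "R1", 480.0, 29.1))
print(register.rack_power("R1"))   # 830.0
print(register.hot_spots())        # ['db-01']
```

Real DCIM suites layer live sensor feeds, visualisation and facilities data on top of a model like this; the point of the sketch is simply that once assets and readings live in one structured store rather than a spreadsheet, queries such as "how much headroom does rack R1 have?" become trivial.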
Right now, however, the market for these tools is vastly overcrowded: many vendors are unprofitable, and while there are more than a dozen VC-backed suppliers, almost half of revenue goes to a few top earners. Leaders in the market, say analysts at IDC, include CA Technologies, Emerson Network Power and Schneider Electric, while ‘major players’ include Cormant, FieldView Solutions, iTRACS, Nlyte Software, Panduit and Raritan.
As data centres continue to evolve, managers must address a growing list of pain points, according to IDC analyst Richard Villars. “Data centre facilities and IT executives are dealing with delays in application rollouts, disrupted service to customers, unplanned spending for patches, an inability to roll out new products or services, and unplanned downtime,” he says. DCIM can help, he adds, “by providing consistent and complete information about data centre infrastructure.”
It’s an interesting market to watch, with plenty of announcements and activity among vendors trying to grab their slice of customer attention and budget. In August, for example, Schneider Electric released a new module for its StruxureWare DCIM suite that uses Intel’s Virtual Gateway technology for remote access to servers without the need for additional hardware. Emerson’s latest release of its Trellis DCIM product, announced in early September, incorporates new visualisation capabilities that make it easier to establish what impact an individual device’s failure might have on wider operations. And HP’s OneView product, while largely designed to manage compute and storage capacity, is starting to tackle some DCIM functions, using visualisation technologies to help IT staff identify ‘hot spots’ of higher temperature in data centres.
DCIM is the hottest topic in today’s data centre sector, according to analysts at DatacenterDynamics. Conservative estimates put the market, worth around $430 million in 2012, at some $1.2 billion by 2015. “This competitive landscape shows a rapidly growing and expanding supplier base needs to reach a knowledge-hungry global audience,” they say.