Today the energy requirements of hyperscale and colocation data centers have attracted much public attention, often amid concerns that the strain they place on electrical generation presents another challenge to efforts to drive decarbonization.
However, an issue growing rapidly in importance is the efficiency, and energy demand, of smaller data centers at the edge of the network. Though not nearly as visible as its larger counterparts, edge computing accounts for an ever-increasing share of the data processing on which the digital economy depends. Industry analyst Gartner, for example, estimates that by 2025, 75 percent of enterprise data will be created and processed at the edge, while IDC predicts the worldwide edge computing market will reach $250.6 billion by 2024, a compound annual growth rate (CAGR) of 12.5 percent between 2019 and 2024.
There are several factors driving the proliferation of data and its consumption at the edge. First among them is the demand for low-latency applications, including digital streaming from film, TV and music platforms. Secondly, the rise of IoT-connected devices, artificial intelligence (AI) and machine learning is driving digital transformation in almost every industry. Many organizations are designing new experiences, reimagining business processes, and creating new products and digital services that rely on innovative technologies to underpin them. The result is more data being created and shared across the network, which in turn slows transmission and download speeds.
Thirdly, the emergence of 5G technology is beginning to transform the sophistication of business- and mission-critical digital services across multiple industries. However, the short transmission range over which 5G delivers maximum benefit means that ever more data generation and consumption will inevitably be driven to the edge. As such, Mobile Edge Clouds (MECs) will require a huge increase in the number of localized edge data centers needed to support them.
The installed base of 3G and 4G cell towers, for example, stands at approximately 5 million, with associated base stations. Schneider Electric, a member of a group formed by the World Economic Forum to investigate the implications of widespread 5G adoption, expects three times as many additional 5G clusters enabled by MECs, with around 7.5 million new micro data centers installed by 2025. Traditional telco networks were never designed for 5G, which means the estimated global investment in the buildout of associated 5G infrastructure will be around €1.3 trillion.
Today, various analyses suggest that data centers represent 1-2 percent of global electricity usage, and that by 2030 as much as 3,000 TWh of energy will be used by IT. The implications of energy consumption at the edge are considerable, meaning that managing the carbon impact effectively must begin with system design.
At the edge, deploying 100,000 data centers, each consuming 10kW of power, would require 1,000MW for the IT load alone. Assuming a moderate power usage effectiveness (PUE) ratio of 1.5, these systems would also emit the equivalent of 800,000 tons of CO2 annually.
However, if each edge facility were standardized and designed for a PUE of 1.1, CO2 emissions would fall by roughly 25 percent, to 600,000 tons annually. Clearly, there is a need to apply the same due diligence to reducing power consumption at the edge as has long been applied to larger data centers.
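As a back-of-the-envelope sanity check, the arithmetic behind these figures can be sketched as follows. The article does not state a grid emissions factor, so the sketch derives one from the stated 800,000-ton figure at PUE 1.5; treat that factor as an assumption rather than a published value.

```python
# Rough check of the edge-fleet energy figures above.
# The emissions factor is inferred from the article's own 800k-ton
# figure at PUE 1.5, so it is an assumption, not a quoted number.

SITES = 100_000          # number of edge data centers
IT_LOAD_KW = 10          # IT load per site, kW
HOURS_PER_YEAR = 8_760

it_power_mw = SITES * IT_LOAD_KW / 1_000          # 1,000 MW of IT load

def annual_emissions_tons(pue, tons_per_mwh):
    """Total facility energy (IT load x PUE, run all year) times a factor."""
    annual_mwh = it_power_mw * pue * HOURS_PER_YEAR
    return annual_mwh * tons_per_mwh

# Emissions factor implied by 800,000 tons at PUE 1.5 (~0.061 t CO2/MWh):
factor = 800_000 / (it_power_mw * 1.5 * HOURS_PER_YEAR)

base = annual_emissions_tons(1.5, factor)        # 800,000 tons by construction
improved = annual_emissions_tons(1.1, factor)    # ~587,000 tons
saving = 1 - improved / base                     # = 1 - 1.1/1.5, ~26.7%

print(f"IT load: {it_power_mw:.0f} MW")
print(f"PUE 1.1 emissions: {improved:,.0f} tons ({saving:.1%} lower)")
```

Note that the exact ratio 1.1/1.5 gives about 587,000 tons and a 26.7 percent saving, which the article rounds to 600,000 tons and 25 percent.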
Given the sheer scale of micro data center deployments, not least to form the basis of MECs, the design of this infrastructure must treat energy efficiency as a priority. A corollary of the high volume is that installation speed and time to market are essential. Consequently, there is a clear benefit in producing pre-integrated systems in which standardization, modularity, performance and sustainability form fundamental components.
Designing for efficiency
Building greater energy efficiency into infrastructure that will be rolled out in such high volumes will have a multiplier effect, delivering not just lower costs but greater energy savings and reduced carbon emissions.
From the outset, energy-efficient technologies should be part of the basic micro data center design. These may include lithium-ion (Li-ion) uninterruptible power supplies (UPS), which offer increased reliability, a longer and more efficient lifecycle, more charge and discharge cycles, and greater energy savings.
Liquid cooling also offers great potential for energy saving at the edge. Chassis-level immersion cooling can reduce energy expenditure while removing the need for electrically powered components such as fans, which can be prone to failure. For edge applications, especially those powering 5G MECs, which are often located in urbanized areas, there is the added benefit of eliminating the noise pollution caused by air cooling.
AI software is an enabler
Applications deployed at the edge are also mission-critical, meaning that the ability to ensure uptime and reliability is essential. Due to their distributed nature, one of the biggest challenges is managing such a large volume of sites, where it is not possible to place permanent, on-site personnel. As such, remote monitoring software is crucial, not just from an energy efficiency perspective, but to enable greater real-time visibility.
Such next-generation data center infrastructure management (DCIM) platforms offer the ability to monitor multiple edge facilities from any location. Should an outage or failure be experienced, operators can quickly despatch service personnel to perform maintenance or repairs as necessary.
Voltage, system temperature, and network traffic, for example, can also be monitored and managed, offering both optimal performance and insight into energy usage. Machine learning and artificial intelligence (AI) capabilities can also offer data-driven insight into how the equipment is functioning and offer proactive recommendations on how to improve its performance.
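As an illustration only, the kind of threshold-based rule such monitoring applies can be sketched as below. Every field name, site identifier, and limit here is hypothetical; this is not the API of any real DCIM product.

```python
# Minimal sketch of threshold-based alerting over edge-site telemetry.
# All field names, site IDs, and limits are hypothetical illustrations,
# not part of any real DCIM platform's API.
from dataclasses import dataclass

@dataclass
class Reading:
    site_id: str
    voltage_v: float       # UPS output voltage
    temperature_c: float   # rack inlet temperature
    traffic_mbps: float    # network throughput

# Example bands; a real deployment would tune these per site and metric.
LIMITS = {"voltage_v": (210.0, 250.0), "temperature_c": (10.0, 32.0)}

def check(reading: Reading) -> list[str]:
    """Return an alert message for each metric outside its allowed band."""
    alerts = []
    for metric, (low, high) in LIMITS.items():
        value = getattr(reading, metric)
        if not low <= value <= high:
            alerts.append(
                f"{reading.site_id}: {metric}={value} outside [{low}, {high}]"
            )
    return alerts

# An over-temperature reading at a hypothetical site triggers one alert:
print(check(Reading("edge-042", voltage_v=238.0,
                    temperature_c=35.5, traffic_mbps=120.0)))
```

In practice this logic runs centrally against telemetry streamed from thousands of sites, so operators see the alert and dispatch personnel without anyone being on-site.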
With dependency on mission-critical infrastructure continuing to increase at a dramatic rate, and as many as 7.5 million new micro data centers expected to be operational globally by 2025, there is no doubt that energy efficiency and sustainability must become critical factors in the quest to roll out 5G. Operators cannot adopt the approach they once took with legacy data centers and learn to become more efficient as they go.
While energy management software remains critical, it is the design of these systems which offers end-users a truly practical means of managing energy demands at the edge. It requires standardization, modularity, resilience, performance and efficiency to form the foundational building blocks of edge computing infrastructure.
Further, by considering adaptive technologies, and embracing a culture of continuous innovation, operators can harness the benefits of transformative applications and services powered by 5G. What’s more, by choosing a more sustainable approach to edge computing they can play their part in the quest to drive to net-zero carbon emissions, and that will benefit us all.
Marc Garner, VP, Secure Power Division, Schneider Electric