Corridors of power

07 February 2019

The IoT, 5G, ‘Industry 4.0’… As demand for data continues to skyrocket, RAHIEL NASIR wonders whether the energy efficiency challenges for data centre operators are set to worsen, especially as we move into the hyperscale era.

The environmental impact of data centres has been well documented over the years. But are they actually getting ‘greener’?

Next Generation Data claims it was Europe’s first data centre to source all its power from renewables.

“We’re seeing a growing divide between increasingly larger hyperscale centres supporting the latest cloud technologies, and much smaller Edge locations where the picture is much less clear,” says Dr. Stu Redshaw, CTO at data centre energy optimisation specialist EkkoSense. “It’s fair to say that, given the huge volume of IT loads being processed, the larger centres have less of an environmental impact than their predecessors.”

Lex Coors would likely support this view. He’s the chief data centre technology and engineering officer at Netherlands-based Interxion, one of the world’s largest data centre operators, which currently runs 50 facilities in Europe. “If we look at the hyperscale companies in the US, there is a push towards establishing off-grid and greener sustainable energy sources, such as wind and solar. In Europe, sustainable energy sources are lower in cost than traditional ‘grey’ power sources. Due to the rise in popularity of sustainable energy sources, the cost per kWh has been going up, but the trend towards sustainability remains. As a result, data centres are increasingly committed to getting ‘greener’ when it comes to improved energy efficiency and the use of alternative energy sources.”

Coors claims Interxion has two unique design examples where the company has tapped nature, rather than the grid, to offset energy expenditures. 

“In Copenhagen, we’ve taken advantage of the local subterranean geology to build two reservoirs. There, water is naturally cooled by the Earth and pumped from one reservoir to the other through a heat exchanger in the data centre. At our centre in Stockholm, a seawater cooling system pumps cold seawater through the HVAC system but unlike typical water-cooling systems, Interxion runs the same seawater through multiple data centres, saving even more energy and reducing energy costs by 80 per cent at the campus.”

Rik Williams, data centre operations manager at Node4, agrees that the use of such adiabatic cooling solutions to deal with higher operating temperatures represents one of the biggest moves in data centre infrastructure. He says: “For hyperscale and cloud providers, the use of more energy efficient servers and storage is improving the services that can be delivered for a set amount of power. So overall, businesses are now getting a lot more performance per kWh.”

However, according to data centre design and build firm Secure I.T. Environments, while there are many specifications that look for efficiencies at maximum design load, it is also important to look at the greater potential of efficiencies at part load conditions. “[This is] hugely overlooked in the industry,” says company owner Chris Wellfair. “Our experience tells us that data centres never get to their maximum design load.”

Wellfair adds that he is also seeing a growing desire amongst clients to source more efficient products. “The more eco-friendly customers have a greater appetite to spend more capex to achieve greater opex savings, based on a robust business payback model. So yes, data centres are getting greener as clients either spend on more enhanced products or, by default, as manufacturers are forced to innovate and become more efficient to maintain their market share.”

SPIE UK works with data centres across Europe to help them manage their operations and energy efficiency, with a claimed emphasis on green technology. The company’s data centre director Peter Westwood says industry surveys indicate a rise in the use of green service providers over the last three years, approaching a 10 per cent improvement. He also points out that while free cooling solutions have increased, many existing centres continue with more traditional cooling solutions.

However, Westwood says the key issue in the coming years is that data centres are likely to increase global energy consumption by a factor of four. 

“With global population expansion leading to more people going online, and more nations striving for more technology-based societies, it is inevitable that global power consumption will increase by up to 300 per cent by 2024.

“Data centres already consume about three per cent of global power and the global exchange of data is more than doubling every four years. With this, data centres as we currently know them could reach more than 20 per cent of global power usage by 2025.”

Westwood goes on to warn that if current trends continue, data centres could account for 14 per cent of global emissions by 2040.
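For readers who want to see the arithmetic behind such projections, the sketch below runs the compound growth implied by Westwood’s figures – data centres at roughly three per cent of global power today, with demand “more than doubling every four years”. It is a minimal illustration, not a forecast: the base year, share and candidate doubling periods are assumptions chosen purely to show how sensitive the outcome is to the growth rate.

```python
# Compound-growth sketch using the figures quoted above. Illustrative
# arithmetic only -- the inputs are assumptions, not measurements.

def projected_share(base_share: float, years: float, doubling_period: float) -> float:
    """Share of global power after `years`, doubling every `doubling_period` years."""
    return base_share * 2 ** (years / doubling_period)

base_share = 0.03                        # ~3% of global power today
for doubling_period in (2.0, 3.0, 4.0):  # "more than doubling" => shorter periods
    share_2025 = projected_share(base_share, years=6, doubling_period=doubling_period)
    print(f"doubling every {doubling_period:g} years -> ~{share_2025:.0%} by 2025")
```

On these assumptions, only the fastest-growth case approaches the 20 per cent figure quoted above; the point is simply that such projections hinge entirely on the assumed doubling period.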

 

The greening of the Fourth Industrial Revolution

So as we head into the so-called ‘Fourth Industrial Revolution’ (a.k.a. ‘Industry 4.0’), fuelled by tech innovations that will be predicated on 5G, the Internet of Things, machine learning, etc., how will data centre operators continue to keep their cool?

“Whilst the management of the data centre gets harder as the resources increase, the actual basics of achieving efficiency remain the same,” says Wellfair. “Equally, one of the key characteristics of 5G, IoT and other technologies, is that they are designed to be more efficient than the generation before. In the case of IoT, these devices are all about energy efficiency, as many operate in isolated locations, or their physical size is so restricted that power consumption has to be minimised.”

EkkoSense’s Redshaw also sounds a note of optimism when he says that, if anything, anticipated additional load demands on data centres should help to increase the need for centres to become more energy efficient. He reckons thermal optimisation has a key role to play here, particularly as making existing data centre cooling more efficient will help unlock additional capacity without the requirement for expensive new cooling infrastructure.

Redshaw also expects the shift towards hyperscale to ease the overall energy efficiency problem thanks to increased cloud and data centre optimisation. Node4’s Williams appears to share this view when he says that it’s “relatively easy” to be energy efficient in large data centres hosting hyperscale cloud platforms because you can site them in optimal locations (if latency is not overly critical).

Steven Carlini, VP innovation and data centre at Schneider Electric, adds his voice to this aspect of the discussion when he says that one trend the company has witnessed is a desire among hyperscale operators to move medium voltage power closer to the IT load. “Technically that makes sense; there is less copper wiring needed and circuit breakers can be smaller, but the problem is that even with medium voltages you are talking about 10kV and it is extremely dangerous to place such equipment in close proximity to people, so there are safety concerns that must be addressed.”

He says another technology beginning to gain traction as a means of reducing IT power consumption is liquid cooling. “This is not a particularly new technology and it has been deployed in HPC (high-performance computing) environments for some time. But it provides great efficiency benefits, as water and other liquids are more effective cooling media than air. However, there is a trade-off to be made in terms of maintenance and serviceability. Hyperscale data centre operators have become extremely efficient at servicing IT equipment and replacing malfunctioning parts with the minimum of downtime, but having servers immersed in liquid complicates that process. 

“Nonetheless, as procedures become more efficient over time we expect that liquid-cooling of data centres, especially large hyperscale facilities, will become a major part of the drive towards greater efficiency.”

Interestingly, the growing size of data centres is not an issue when it comes to future cooling needs. Both EkkoSense and Node4 believe it is Edge computing that will actually present a greater problem.

Redshaw says: “The energy efficiency challenge will actually evidence itself at the other end of the spectrum – the edge of the IT estate – as processing power divides into small modules at the point of use. This is not only difficult to monitor from a thermal and power perspective but is also complex to manage as part of an end-to-end data centre estate.”

Williams concurs: “The challenge is for services using edge data centres to minimise latency and bandwidth, as we may begin to see more small scale and inefficient data centres in urban areas which can reduce the overall energy efficiency of the solution.”

 

Is PUE enough?

When measuring a data centre’s energy efficiency, does the oft-used PUE (power usage effectiveness) rating provide the best gauge?

“PUE is of course still a useful metric to determine the overall electrical efficiency of a data centre, but it does little to address the issue of overall power consumption,” says Carlini. “Other metrics have been tried, such as CUE (carbon usage effectiveness) and WUE (water usage effectiveness), but they have not proved as popular as PUE, which remains the most effective way to benchmark data centres of any size against comparable installations.”
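For reference, all three ratios are simple quotients over the energy drawn by the IT equipment itself. The sketch below shows the standard calculations; the meter readings are invented purely for illustration.

```python
# PUE, CUE and WUE as simple annual ratios over the IT load.
# All figures below are hypothetical, for illustration only.

def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """Power usage effectiveness: total facility energy / IT energy (>= 1.0)."""
    return total_facility_kwh / it_kwh

def cue(total_co2_kg: float, it_kwh: float) -> float:
    """Carbon usage effectiveness: facility CO2 emissions per IT kWh."""
    return total_co2_kg / it_kwh

def wue(water_litres: float, it_kwh: float) -> float:
    """Water usage effectiveness: site water use per IT kWh."""
    return water_litres / it_kwh

it_kwh = 8_000_000  # hypothetical annual IT energy
print(f"PUE: {pue(10_400_000, it_kwh):.2f}")           # -> 1.30
print(f"CUE: {cue(2_340_000, it_kwh):.2f} kgCO2/kWh")  # -> 0.29
print(f"WUE: {wue(14_000_000, it_kwh):.2f} L/kWh")     # -> 1.75
```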

“You probably need to avoid getting sucked into expensive and over-complex DCIM investments.”

Dr. Stu Redshaw,
CTO,
EkkoSense

Many of the commentators we spoke to agreed that PUE is not a measure for sustainability. For instance, Redshaw says it’s never enough just to calculate a PUE score and consider yourself ‘green’ if you’ve achieved an impressive PUE rating. “What’s more relevant is whether you actually have a strategy to improve and reduce your PUE rating. If that’s in place, you should be well on your way to continuously improving your data centre performance across all metrics, including PUE.”

Furthermore, others, such as Williams, point out that the “classic PUE” will need to be adjusted to take better account of the energy saved by using water with adiabatic cooling, and to factor in onsite generation through solar and wind power.

Wellfair also supports this view: “Unfortunately, the industry has only developed PUE as a metric to measure data centre efficiency, and the diversity of cooling products can skew this figure to the advantage of some providers. 

“PUE is only a ‘point in time’ measurement. Therefore, caution should be exercised when comparing claims on tenders that are very low. Secure I.T. Environments will often offer competitive, energy-leading solutions for projects, but with an option for an enhanced and more energy efficient solution. This way a client can decide to spend more to ensure greener credentials, or pick the more common solution accepted in the industry.

“In providing this, we offer an ‘annualised’ PUE figure which shows an average of winter, summer and two seasons of mixed conditions. This demonstrates an open and honest approach for the whole year’s performance.”
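Wellfair doesn’t spell out the exact method, but a minimal sketch of how such an annualised figure might be derived is shown below: energy-weighting four seasonal meter readings rather than quoting a single best-case snapshot. All numbers are invented.

```python
# Annualised PUE from seasonal readings: divide total annual facility
# energy by total annual IT energy, rather than averaging snapshots.
# All meter readings below are hypothetical.

seasonal = [
    # (season, facility_kwh, it_kwh)
    ("winter", 2_300_000, 2_000_000),
    ("spring", 2_450_000, 2_000_000),
    ("summer", 2_700_000, 2_000_000),
    ("autumn", 2_450_000, 2_000_000),
]

total_facility = sum(f for _, f, _ in seasonal)
total_it = sum(it for _, _, it in seasonal)
print(f"annualised PUE: {total_facility / total_it:.2f}")  # -> 1.24

for season, facility, it in seasonal:
    print(f"  {season}: {facility / it:.2f}")  # seasonal 'point in time' PUEs
```

Note how the summer snapshot (1.35) and the winter one (1.15) bracket the annualised figure – exactly the spread a single ‘point in time’ number can hide.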

“We expect that liquid-cooling of data centres will become a major part of the drive towards greater efficiency.”

Steven Carlini,
VP innovation and data centre IT division,
CTO office, Schneider Electric

 

Interxion’s Coors also points out that regulations vary considerably from region to region. For example, he says that in the Netherlands, and more specifically the Amsterdam area, regulations are fairly rigid. “Government organisations only give a permit for the construction and operation of data centres once it can be proven that the design has a maximum PUE of 1.2. In other countries, the rules are not as strict.”

Meanwhile, Next Generation Data (NGD) claims it was Europe’s first data centre to source all its power from renewables. This is said to support what’s described as the company’s “industry leading” PUE credentials – in 2014, NGD announced what was said at the time to be the first PUE rating of 1.0. In addition to this, commercial director Simon Bearne says NGD also has BSI ISO 14001 and UK Government Climate Change Agreement (CCA) certifications, making it exempt from carbon taxes. “Being 100 per cent renewable used to be unusual. But we’ve reached a point where the preference is for renewable energy, and it’s more of a problem to a customer if they can’t get it. We haven’t given customers a choice of anything but renewable energy, and given the levels of power consumed in data centres, we think it’s the right route to have taken.”

 

If you can’t stand the heat…

Going forward, what are the solutions and products data centre managers need in order to stay on top of energy efficiency challenges?

In a word, it’s all about cooling.

Coors says: “When it comes to staying on top of energy efficiency challenges, it is important to take into consideration the different design approaches available for cooling systems. The most expensive solution is chilled water (PUE of approximately 1.18) while the lowest cost solution is direct air (PUE of approximately 1.1). Choosing between one over the other is typically dependent on a mix of customer demands regarding price and sustainability requirements.”
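To put that design gap in context, the sketch below estimates the annual energy cost difference between the two design PUEs Coors quotes. The 1MW IT load and the flat tariff are assumptions chosen only to make the arithmetic concrete.

```python
# Annual energy cost for a hypothetical 1 MW IT load at the two design
# PUEs quoted above. The load and tariff are assumptions.

IT_LOAD_KW = 1_000      # hypothetical 1 MW of IT load, running 24/7
HOURS_PER_YEAR = 8_760
TARIFF_PER_KWH = 0.10   # assumed flat tariff, currency units per kWh

def annual_cost(pue: float) -> float:
    """Total facility energy cost per year at the given design PUE."""
    return IT_LOAD_KW * HOURS_PER_YEAR * pue * TARIFF_PER_KWH

chilled_water = annual_cost(1.18)  # -> 1,033,680
direct_air = annual_cost(1.10)     # ->   963,600
print(f"chilled water (PUE 1.18): {chilled_water:,.0f}")
print(f"direct air    (PUE 1.10): {direct_air:,.0f}")
print(f"gap per year:             {chilled_water - direct_air:,.0f}")
```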

“Cooling and real-time energy monitoring and management are critical to data centre resilience and uptime as well as in determining a facility’s overall PUE,” says Bearne. “With cooling typically accounting for 40 per cent or more of a data centre’s total energy bill, the more that can be done to optimise and reduce cooling the better from cost, environmental and legislative perspectives.”

“Our experience tells us that data centres never get to their maximum design load.”

Chris Wellfair,
Secure I.T. Environments



On the subject of regulations, one particularly significant development last month was the announcement that ASHRAE’s TC 9.9 data centre environmental guidelines have now been incorporated into the EU’s regulation for servers and data storage products. As Redshaw explains, this means that for European firms, ASHRAE’s recommendations are now defined within a regulation and can no longer be considered mere guidelines. He says: “This EU regulation will start taking effect in March 2020, with final implementation on 1 January 2023. So over the next few years, we’ll see the data centre energy efficiency guidelines established by ASHRAE TC 9.9 become the de facto standard for EU data centres. With today’s inevitable Brexit uncertainties, it’s perhaps reassuring that there are now common data centre environmental guidelines across both Europe and the US.”

NGD’s Bearne continues by saying that when it comes to cooling, there are various options and alternatives available: “Some [are] only within the grasp of modern purpose-built rather than legacy facilities, including the harnessing of climatically cooler locations that favour direct air and evaporative techniques; installing intelligent predictive cooling systems; using water, liquid or nano-cooling technologies; as well as prerequisite aisle containment techniques.

“Faced with these challenges, best practice dictates that data centre and facilities professionals will increasingly need to apply real-time Big Data analysis and monitoring techniques for optimising cooling system plant and maintaining appropriate operating temperatures for IT assets, and all without fear of compromising performance and uptime.”

He adds that central to this, in order to maximise overall data centre energy efficiencies and PUE, are integrated energy monitoring and management platforms capable of integrating the building management system, PDUs and SCADA. “An advanced system will save many thousands of pounds through reduced power costs and by minimising the environmental impact while helping to ensure maximum uptime through predictive maintenance.”

This is arguably where specialists such as EkkoSense come in. “You need to have a network of sensors to monitor how your data centre is actually performing – ideally in real-time,” advises Redshaw. “A software solution that allows you to visualise this performance, and then improve the real-time management of all your data centre’s cooling, power and space aspects is invaluable.

“And you also need access to data centre optimisation expertise to make sure you’re making the right decisions when it comes to maximising performance. Get all this right and you’ll be well on your way to staying on top of your energy efficiency challenges.”
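Redshaw doesn’t prescribe a particular toolchain, but the sketch below gives a flavour of the kind of real-time thermal monitoring he describes: poll rack inlet sensors and flag anything outside the ASHRAE-recommended 18–27°C inlet envelope. The sensor feed is mocked with random values; a real deployment would read from the BMS or SNMP/Modbus gateways instead.

```python
# Minimal real-time thermal monitoring sketch (Python 3.9+). The sensor
# feed is mocked; swap read_inlet_temps() for real BMS/SNMP readings.

import random
import time

ASHRAE_RECOMMENDED = (18.0, 27.0)  # recommended inlet range, deg C

def read_inlet_temps() -> dict[str, float]:
    """Stand-in for a real sensor network; returns rack -> inlet temp."""
    return {f"rack-{i:02d}": random.uniform(16.0, 30.0) for i in range(1, 6)}

def check_envelope(temps: dict[str, float]) -> list[str]:
    """Return an alert string for every rack outside the recommended range."""
    low, high = ASHRAE_RECOMMENDED
    return [f"{rack}: {temp:.1f}C outside {low}-{high}C"
            for rack, temp in temps.items() if not low <= temp <= high]

if __name__ == "__main__":
    for _ in range(3):          # three polling cycles for the demo
        for alert in check_envelope(read_inlet_temps()):
            print("ALERT", alert)
        time.sleep(1)           # a real system would poll continuously
```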

SPIE’s Westwood agrees that the cooling system is one of the key areas that can make a data centre much more efficient. But he also points out that there are many other areas of design that need consideration, together with new technologies such as immersed liquid cooling, Open Compute Project initiatives, solar farms, power cells, and software systems to manage efficiency through data centre infrastructure management. “Data centre concepts are changing, and the boundaries of current technology are being stretched. To stay on top of the energy challenge is both a short- and long-term consideration, where creative and competent engineers will be the key ingredient in the data centre manager’s toolkit.”

Secure I.T. Environments’ Wellfair also says that while there is currently great emphasis on renewable energy and more efficient IT equipment and cooling systems, the most important element will be the increased use of analytical tools and cloud-based data centre management software. He reckons that as hyperscale data centres become larger and contain ever greater numbers of products and components, manual operation and management of such facilities will become impractical, requiring greater use of analytics, automation and machine learning to ensure efficiency.

“Fortunately, the cost of embedding sensors within critical infrastructure has come down, resulting in much greater numbers of metering points which generate higher volumes of data,” says Wellfair. “Furthermore, management tools have improved greatly both in terms of capability and ease of use. Early DCIM tools were complex to install and maintain; today’s cloud-based systems are much more flexible, allowing the user to monitor the data centre, or a number of distributed data centres across the network efficiently, in real-time.”

Carlini also admits that whilst traditional DCIM is still an excellent tool for managing on-premise data centres – citing an example where a DCIM deployment at a UK university yielded an energy saving worth £125,000 per annum – cloud-based Data Centre Management as a Service (DMaaS) systems enable today’s businesses to gain greater visibility into a portfolio of facilities across an entire network. “This is an advanced software management solution that will only be viable through the cloud, and as hyperscale facilities continue to grow in number with more edge solutions being deployed to support them, DMaaS becomes a far superior choice to deliver a tangible ROI for today’s operators.”

A different view of data centre energy management: Secure I.T. Environments says despite renewable energy and more efficient cooling systems, the most important element will be greater use of analytical tools and cloud-based management software.

 

Redshaw is clearly no great fan of data centre infrastructure management systems. When asked what are the pitfalls to avoid when it comes to choosing cooling solutions, he says: “You probably need to avoid getting sucked into expensive and over-complex DCIM investments. Far too often, these approaches tend to be much too complex and expensive for most data centres.

“Similarly, I wouldn’t rush into CFD (computational fluid dynamics) projects, as these often just result in less than objective airflow recommendations that don’t necessarily leave you any further forward in your efficiency journey.”

According to Redshaw, IT managers need to remember that what they are trying to achieve is the elimination of thermal risk and running their data centres efficiently.

“Where there’s any uncertainty, organisations typically resort to over-cooling their data centres, and in efficiency terms, that’s an expensive and wasteful approach. We’re convinced that effective thermal optimisation means you really don’t need to do this; you’ll save around 25 per cent of your data centre cooling costs if you get it right.”
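Combining Redshaw’s claim with Bearne’s earlier figure of cooling at roughly 40 per cent of the total energy bill gives a rough sense of scale; the bill below is a hypothetical figure used only to make the percentages concrete.

```python
# Rough scale of the saving described above. Hypothetical bill; the two
# percentages come from Bearne and Redshaw respectively.

total_bill = 1_000_000   # assumed annual energy bill
cooling_share = 0.40     # cooling ~40% of the bill (per Bearne)
cooling_saving = 0.25    # ~25% off cooling costs (per Redshaw)

saving = total_bill * cooling_share * cooling_saving
print(f"saving: {saving:,.0f} (~{saving / total_bill:.0%} of the total bill)")
```

In other words, on these figures, getting thermal optimisation right is worth something like a tenth of the entire energy bill.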

For Node4’s Williams, one of the biggest pitfalls is rushing for the most ‘efficient’ solution with a view to saving opex on power without understanding any of the other ‘hidden’ costs, such as water consumption and treatment. 

And on the subject of water, when it comes to choosing energy efficient solutions, Coors advises against selecting a design that cannot work without water for adiabatic or open water tower cooling. “Droughts around the world, or the potential for water to be banned or extremely expensive, means that all data centres should be designed so that they can function in the absence of water if needed. 

“Another pitfall to avoid is designing a data centre with direct outside air for cooling, built around city centres. These designs typically have the potential for gaseous contamination – something that should be avoided at all costs.”

 

…stay in the kitchen 

“Data centres as we currently know them could reach more than 20 per cent of global power usage by 2025.”

Peter Westwood,
Data centre director,
SPIE UK

 

In conclusion, SPIE’s Westwood says the data centre industry remains extremely active, with hyperscale operators driving PUE down to between 1.0 and 1.2 with new developments and greener designs utilising large solar farms, perfect climates and new IT technologies. However, he says these new technologies will take time to demonstrate benefits for other operators and legacy sites with older technologies.

“Data centre owners and operators have often been keen to develop new or improved energy and engineered solutions and to maximise the efficiency of floor space. As such, creative engineering with the ability to provide robust proof of concept is essential.”

But in any event, data centre energy efficiency needn’t be seen as a cost that data centres have to bear. As Redshaw states, when done right, monitoring, managing and maximising data centre performance leads directly to reduced data centre cooling costs and increased capacity. “At the same time, organisations will benefit from the reduced risk that comes from having a thermally optimised data centre. What’s not to like?”