Could virtual reality provide the answer to avoiding performance problems in data centres and to supporting the next revolution at the Edge? Louise Frampton recently visited the UK headquarters of Future Facilities and ‘strapped’ into the 3D world of data centre simulation to explore the potential of the technology.
The use of virtual reality technology could change the way data centre operators design and manage their facilities, and train staff to avoid human error. However, according to Future Facilities chief operating officer Jon Leppard, the technology could “come into its own” as we see the next stage of the data centre revolution unfold.
Future Facilities has developed an interactive virtual reality platform that allows users to observe the effects of change to the data centre environment, which has the potential to help improve performance. The company has been pioneering simulation tools, used by data centre professionals to improve thermal management and reduce energy costs, since 2004, and this latest development builds on this expertise.
Simulation using the company’s 6SigmaDCX platform has already helped high-profile operators eliminate hotspots, improve efficiency and increase computing capacity at their data centres. For example, Dell identified tactical and containment changes at its 15,480 square-foot data centre in Texas that improved PUE from 1.86 to 1.77, reducing chiller power consumption by 12% and overall power consumption by 5%, with potential annual savings of $100K. Capacity per cabinet could also be increased from 2.7kW to 3.2kW, aiding expansion.
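The overall power saving follows directly from the PUE figures. As a sanity check – assuming, purely for illustration, a fixed IT load (the 1,000kW figure below is hypothetical, not from Dell) – the arithmetic can be sketched as:

```python
# PUE (Power Usage Effectiveness) = total facility power / IT equipment power.
# Illustrative only: the 1,000 kW IT load is a hypothetical figure, not Dell's.
it_load_kw = 1000.0

total_before = it_load_kw * 1.86  # facility power at PUE 1.86
total_after = it_load_kw * 1.77   # facility power at PUE 1.77

saving = (total_before - total_after) / total_before
print(f"Overall power reduction: {saving:.1%}")  # prints "4.8%" – the ~5% cited
```

Because PUE is a ratio, the percentage saving at constant IT load is independent of the assumed load: (1.86 − 1.77) / 1.86 ≈ 4.8%.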
When CBRE’s global finance customer wanted to improve energy efficiency, it used Future Facilities’ Virtual Facility – identifying improvements to save the bank an estimated $10m-plus through combined efficiency and capacity gains in a single data centre. Cisco has also used Virtual Facility analysis to achieve a 30% reduction in power required for cooling, as well as cost savings of $200,000 per year through an increase in chilled water set point.
Having identified the potential of virtual reality to take simulation to the next level, Future Facilities has built a proof-of-concept platform that allows users to explore data centre design in a safe offline environment, troubleshoot existing sites and run ‘what-if’ scenarios to support changes in infrastructure.
The first time you use the virtual reality program, you are struck by the immersiveness of the experience – it feels very different from seeing an image on screen. You can ‘walk’ through aisles of three-dimensional racks and choose which direction to go in, while viewing the assets and crucial information such as air flows. It certainly feels like the dawn of a new era. After experiencing the program for himself, Scott Payton, technical director of Global Data Centre Engineering, described the addition of virtual reality to the 6SigmaDCX simulation platform as doing for “engineering simulation what the flight simulator did for the aviation industry”.
Future Facilities product manager Mark Fenton explains that the next stage of development will be to make the platform more interactive, so that, when you are immersed in the virtual facility, you can ‘touch’ devices, look at what applications are running, decommission equipment, and select from a menu of what is going to be installed. Rather than passively walking around the virtual environment, you will be able to interact and make changes live in this environment.
The next stage will be augmented reality, where the computer-generated image is superimposed on a user’s view of the real world – in this case their data centre. Operators will be able to see live data with visible air flows, while they walk the floor, to help them understand why they have an issue. “This is the final frontier of where we want to go… It will be a new way of interacting with engineering,” comments Fenton.
So how will virtual reality change the way data centres are designed and managed in the future? The technology has a variety of potential uses, depending on the main goal of the business. Virtual reality makes it possible for designers to give clients a virtual tour of their proposals; colocation operators can show customers a new cage layout and how it will operate; while operational sites can be optimised to improve performance, reliability and costs. Owner operators experiencing hot spots can use the technology to understand why they are having cooling problems, for example, or simulate the deployment of a new piece of hardware to see the impact on their data centre, or anticipate possible outcomes when performing maintenance.
Overlaying simulation and DCIM data enables greater understanding of data centre performance and it can be used for site assessment, analysis and training to reduce human errors and failures. Failure scenarios can also be run to establish how a data centre will cope with a specific cooling or power problem.
Data centres may also want to improve their efficiency profile, or look at the potential of raising the temperature of the facility. Using the technology, operators can trial the scenario in the virtual world, with the peace of mind of avoiding any actual risk.
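The kind of question at stake – how much thermal headroom a rack loses when supply temperature is raised or cooling degrades – can be framed with a simple steady-state heat balance, which full CFD simulation then refines. A minimal sketch, using textbook air properties and example figures that are my own assumptions, not values from Future Facilities’ tools:

```python
# Back-of-envelope heat balance: air temperature rise across a rack,
# dT = Q / (m_dot * cp). Constants and example figures are illustrative.
RHO_AIR = 1.2             # air density, kg/m^3 (approx., near 20 C)
CP_AIR = 1005.0           # specific heat of air, J/(kg*K)
CFM_TO_M3S = 0.000471947  # cubic feet per minute -> cubic metres per second

def rack_delta_t(power_w: float, airflow_cfm: float) -> float:
    """Steady-state air temperature rise (K) from rack inlet to exhaust."""
    m_dot = airflow_cfm * CFM_TO_M3S * RHO_AIR  # mass flow, kg/s
    return power_w / (m_dot * CP_AIR)

# A hypothetical 5 kW rack breathing 800 CFM:
print(round(rack_delta_t(5000, 800), 1))  # prints 11.0 (kelvin rise)
```

A lumped estimate like this shows why raising the room temperature narrows the margin before exhaust air becomes a problem, but only CFD captures recirculation and localised hotspots – the invisible behaviour that simulation and virtual reality aim to make visible.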
“Virtual reality is an exciting area for owner operators – rather than having to walk the physical site they can ‘strap in’ to virtual reality, overlay the simulation, integrate data from DCIM tools and bring everything together in one place,” comments Leppard.
“We can take the data centre operator on a virtual tour: show them the racks, the performance, tell them from the point of view of the environment how it is going to operate, where the access is going to be and the cooling equipment. The colocation provider may have one room segregated into cages. If they sell some high-performance computing in one corner, they can see how it will impact on the other neighbouring cages. They can use it as a marketing/pre-sales tool, as well as for engineering…It takes them on a full sales journey,” Fenton explains.
“Huge hyperscalers, such as Google and Facebook, may do less day-to-day management, but they will undertake big projects – for example, they may decide to retire half of a room and bring in new hardware, so they will use the tools to design and lay it out, to see what the performance is going to be like. If they lose a server because something overheats, the fact that someone can’t poke or like for a few seconds isn’t the end of the world, so they are more interested in efficiencies, while a bank will be looking for as close to 100% resilience as possible,” he continues.
The use of computational fluid dynamics (CFD) and engineering simulation, in general, has been steadily growing but the market is changing, according to Leppard: “Three or four years ago, around 75% of our business was in design, but today around half of the business is with owner operators using simulation in-house. Most new data centres have used CFD simulation. Although the percentage of live sites using it on a day-to-day basis is still relatively small, it is growing significantly.
“People are no longer using the technology as a band aid to solve a problem. Rather than IT stating: ‘You have two weeks to install this’, and operators having no idea what the impact of a configuration change is going to be, people are getting wiser and using it to predict what is going to happen, to avoid problems.
“We are seeing a trend towards operational planning – mission critical facilities in the banking, insurance and government sectors are using simulation on a more regular basis, instead of just troubleshooting or using it for energy efficiency trending.
“You wouldn’t buy a suit without trying it on first, yet we, in this industry, seem to think it is fair game. However, there are tools that give you an opportunity to ‘try it on’ first, without actually flicking on the switch and waiting for something to happen.”
The number of enterprise sites has also been falling; businesses such as Coca-Cola and Deutsche Bank have been moving away from owning data centres and into colocation space.
This, according to Leppard, is further driving the need for virtual reality. “Colocation centres base their business on reliability and cannot afford to get it wrong. They need simulation to ensure the decisions they make will not affect their core business. It is their reputation on the line and colocation providers want to distinguish themselves as the best,” Leppard explains.
The technology will also be crucial to supporting the next wave of change in the sector, he claims: “With the future of data centres requiring closer proximity to people and devices, we are seeing large hyperscale data centres supporting thousands of discrete edge sites.
“The next stage is being driven by IoT – the vast amounts of data produced from our phones, cars, watches and other devices will need to be transported back to hyperscale facilities and this is where the birth of small edge sites will emerge. It won’t be possible to rely on a large-scale hub in the US, as there will be too much of a delay. In the future, there will be a box on every street corner.”
This will still be architected around large facilities, but they will be underpinned by hundreds or thousands of little edge sites, he believes. The ability to ‘transport’ staff to these remote sites via virtual reality will therefore be a major driver for the technology in the future.
“Ultimately, simulation isn’t voodoo,” comments Leppard. “You can see power; you can see space, but you can’t see cooling. We are trying to find a way to communicate how cooling works and why it may fail. They say a picture paints a thousand words, but virtual reality will give you 10,000 words – you just need to decide which ones you are going to read.
“Making the technology easy for the lay person to use is crucial – all the user wants to know is ‘should I, or shouldn’t I?’, ‘where should it go?’, ‘yes or no?’ If we can achieve this, we have done our job.”