Microsoft submerges data centre in ocean


Wednesday, 03 February, 2016


Microsoft placed a 17,000 kg container on the ocean floor and remotely monitored it for three months from the company’s campus in Washington. Not something you’d typically hear concerning the computing giant’s activities, but the endeavour was all part of the company’s vision to manufacture and operate an underwater data centre.

Project Natick, the name of Microsoft’s research project, involved the deployment of a 3 × 2 m cylindrical vessel, containing a data centre with computing power equivalent to that of 300 desktop PCs, about one kilometre off the Pacific coast of the United States from August to November 2015. The project, the company said, reflects Microsoft’s ongoing quest for cloud data centre solutions that offer rapid provisioning, lower costs and high responsiveness, and are more environmentally sustainable.

It is hoped the knowledge gained from the vessel’s three months underwater could help make future data centres more sustainable while speeding data transmission and cloud deployment, perhaps even enabling a future in which seabed data centres are commonplace around the world.

While the technology to submerge sealed vessels with computers inside isn’t new, the Microsoft researchers believe this is the first time a data centre has been installed beneath the ocean’s surface. Deploying a data centre underwater is thought to solve several problems: it opens up a new power source, greatly reduces cooling costs, closes the distance to connected populations, and makes data centres easier and faster to set up.

Ben Cutler, the project manager who led the team behind the experiment, said his team applied science and engineering to the concept of submerging a data centre. A big challenge involved people, since people keep data centres running. Deploying people underwater, however, is a logistical problem: they take up space; they need oxygen, light, food and water, as well as a reasonably comfortable environment to work in; and at the end of the day they need to go home.

So the team moved to the idea of a ‘lights out’ facility: a simple place to house the data centre, compact and self-sustaining. The team chose a round container. “Nature attacks edges and sharp angles, and it’s the best shape for resisting pressure,” Cutler said. That led the team towards working out how to make a data centre that didn’t need constant, hands-on supervision.

The prototype test vessel was launched only about one kilometre offshore, so it was reportedly able to be linked into an existing electrical grid. However, being in the water also made it possible to draw computing power from the hydrokinetic energy of waves or tides. This could allow data centres to work independently of existing energy sources, located closer to coastal cities and powered by renewable ocean energy.

Cutler added that half of the world’s population lives within 120 miles of the sea, which is part of what makes an underwater data centre so appealing: placing data centres closer to those populations reduces latency and consequently speeds data transmission.
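
To put the latency point in rough numbers: light in optical fibre travels at about two-thirds of its vacuum speed, so shortening the path by 120 miles (roughly 193 km) shaves a couple of milliseconds off every round trip. The sketch below is a back-of-the-envelope illustration, not a Project Natick calculation; the fibre-speed figure is a common rule of thumb and ignores routing and processing delays.

```python
# Back-of-the-envelope propagation delay saved by shortening the fibre path.
# Assumption: signals in optical fibre travel at roughly 2/3 the speed of
# light in vacuum; real latency also includes routing and server time.

SPEED_OF_LIGHT_KM_S = 299_792
FIBRE_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3  # ~200,000 km/s

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds over the given distance."""
    return 2 * distance_km / FIBRE_SPEED_KM_S * 1000

# 120 miles is roughly 193 km -- the distance Cutler cites.
print(f"{round_trip_ms(193):.2f} ms")  # about 1.93 ms per round trip
```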

This project also shows it is possible to deploy data centres faster: building the vessel that housed the experimental data centre reportedly took only 90 days. While data centres built on land must be tailored to building regulations and to varying environments and terrains, it is thought these underwater containers could be mass produced for the very similar conditions found underwater. Another key advantage is that it gets colder the deeper you go beneath the surface.

Cooling is an important aspect of data centre operation: running chiller plants to keep the computers inside from overheating can be rather costly. The cold environment of the deep sea automatically makes such data centres less costly to cool and more energy efficient.
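
A rough way to quantify this is power usage effectiveness (PUE), the ratio of a facility’s total power draw to the power its IT equipment uses. The figures below are illustrative assumptions for the sake of the arithmetic, not measurements from Project Natick.

```python
# Illustrative cooling-overhead comparison using PUE
# (PUE = total facility power / IT equipment power; 1.0 is ideal).
# All values below are assumptions chosen for illustration only.

IT_LOAD_KW = 100          # hypothetical IT load
PUE_CHILLER_PLANT = 1.5   # assumed conventional facility with chillers
PUE_SEAWATER = 1.1        # assumed vessel cooled by the surrounding ocean

overhead_chiller = IT_LOAD_KW * (PUE_CHILLER_PLANT - 1)  # 50 kW of overhead
overhead_seawater = IT_LOAD_KW * (PUE_SEAWATER - 1)      # ~10 kW of overhead

print(f"Overhead power saved: {overhead_chiller - overhead_seawater:.0f} kW")
```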

Remotely monitoring the vessel using cameras and sensors, the team recorded data on temperature, humidity, the amount of power being used by the system and the speed of the current.
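
A minimal sketch of what such a telemetry loop might look like is below; the sensor names and the read_sensor stub are hypothetical, as Microsoft has not published its monitoring code.

```python
import csv
import random
import time

# Hypothetical stand-in for querying a hardware sensor; a real deployment
# would read instruments over a bus or network link, not random values.
def read_sensor(name: str) -> float:
    return random.uniform(0.0, 100.0)

SENSORS = ["temperature_c", "humidity_pct", "power_kw", "current_speed_m_s"]

# Append one timestamped row of readings per interval to a CSV log.
with open("natick_telemetry.csv", "a", newline="") as log:
    writer = csv.writer(log)
    for _ in range(3):  # a few sample intervals; a real system runs continuously
        writer.writerow([time.time()] + [read_sensor(s) for s in SENSORS])
        time.sleep(1)
```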

The team is still analysing data from the experiment, but so far, the results are promising. They are reportedly now planning the project’s next phase, which could include a vessel four times the size of the original container with around 20 times the compute power. The team is also evaluating test sites for the vessel, which could be in the water for at least a year, deployed with a renewable ocean energy source.

Peter Lee, corporate vice president of Microsoft Research NExT, said the project has given them plenty of data to analyse.

“We’re learning how to reconfigure firmware and drivers for disk drives, to get longer life out of them,” said Lee.

“We’re managing power, learning more about using less. These lessons will translate to better ways to operate our data centres. Even if we never do this on a bigger scale, we’re learning so many lessons.”

Top image: Project Natick vessel being deployed. Second image: Installing server rack into Project Natick vessel. Images © Microsoft.
