Cloud computing is far from going under; it’s growing at a staggering rate, with SaaS expected to be worth £47.5bn by 2018 and infrastructure and platform services expected to be worth £30.5bn by the same year, so it’s no wonder that data centres are so large these days. In 2013, data centres in the US alone consumed energy equivalent to the output of 34 large coal-fired power plants, and data centres account for roughly 2% of total greenhouse gas emissions (about the same as the aviation industry).

As a result, the large companies that provide these services (Google, Amazon, Microsoft, Apple, and others) are increasingly on the lookout for ways to reduce their energy costs whilst also reducing their environmental footprint.

Microsoft’s Project Natick looks to build on these efforts, but will it make a big splash or will it just be a drop in the ocean?

Project Natick Overview

Microsoft manage more than 100 data centres across the globe delivering their various cloud services. Like any other large cloud provider, they are on the lookout for ways to improve efficiency and minimise environmental impact.

Project Natick was born in 2014, after a Microsoft employee named Sean James (who formerly served on a US submarine) submitted an internal paper proposing the idea. By August 2015 Microsoft had deployed their first submerged data centre. The ‘data centre’ was made up of a single server rack in a cylindrical vessel filled with pressurised nitrogen, weighing roughly 17 tonnes in total, and was lowered to a depth of 30ft just off the coast of California. It was fitted with various sensors so the engineers at Microsoft could monitor metrics such as temperature, humidity, and pressure. It was retrieved in November 2015 after serving just over 100 days ‘at sea’ and is currently being analysed and refitted.

So what are the main reasons that Microsoft have pursued this trial?

Cost

As we have already mentioned, the biggest cost of running data centres lies in the energy they consume, and one of the biggest contributors is the need to keep the machines cool. By harnessing a naturally cold environment, companies can reduce their reliance on expensive air conditioning systems. For example, Facebook have a data centre in northern Sweden where winter temperatures average -20°C, and Google have a data centre in Finland which uses a sea water cooling system.

Microsoft’s Project Natick trial used the naturally cold ocean to cool the machines within the vessel, effectively eliminating this cost.
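
To put a rough number on this kind of saving, one common yardstick is Power Usage Effectiveness (PUE): the ratio of a facility’s total power draw to the power consumed by the IT equipment alone. The sketch below is a back-of-the-envelope illustration, assuming a hypothetical load, PUE values, and electricity price rather than any figures published by Microsoft:

```python
# Rough illustration of cooling savings via PUE (Power Usage
# Effectiveness = total facility power / IT equipment power).
# All figures are illustrative assumptions, not published Natick data.

IT_LOAD_MW = 10          # assumed IT load of a mid-sized data centre
PUE_AIR_COOLED = 1.5     # assumed conventional air-cooled facility
PUE_SEA_COOLED = 1.1     # assumed ocean-cooled facility
PRICE_PER_MWH = 100      # assumed electricity price in £/MWh
HOURS_PER_YEAR = 24 * 365

def annual_cost(pue):
    """Total annual electricity cost for a given PUE."""
    total_mw = IT_LOAD_MW * pue
    return total_mw * HOURS_PER_YEAR * PRICE_PER_MWH

saving = annual_cost(PUE_AIR_COOLED) - annual_cost(PUE_SEA_COOLED)
print(f"Estimated annual saving: £{saving:,.0f}")
# With these assumptions: 10 MW x 0.4 x 8,760 h x £100/MWh ≈ £3.5m/year
```

Even with rough assumptions like these, trimming the cooling overhead quickly adds up to millions per year for a sizeable facility.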

Speed

Another benefit of Microsoft’s underwater data centre concept is the speed with which it can be deployed. Microsoft say that if the capsules could be mass-produced, set-up could take just 90 days. Not only does this help with cost, it could also be hugely valuable in specific situations: for example, in disaster-stricken areas (earthquakes, floods, storms), in places that currently have no internet access, or in places that see an influx of people in a short space of time (the Olympics, the World Cup).

Environment

Although the Microsoft trial vessel was actually connected to land for the power needed to run its servers, Microsoft envision future deployments being powered by renewable energy sources, like wind, tidal, or wave.

What’s more, the capsules used to house the servers are designed to be reused: each remains underwater for 5 years, is brought back up to have its servers replaced, and is then sunk again (although Microsoft predict a slowing of Moore’s law, so this 5-year cycle could eventually be lengthened as the need for upgrades diminishes). Each capsule lasts 20 years in total, after which it is recycled and a new one deployed.

What about the effects the vessel had on the ocean? Microsoft say that no additional heating from the capsule was measurable in the water beyond a few inches from its surface, and that sea life quickly adapted to the vessel’s presence. However, this trial involved only a single server rack and vessel, and people may have concerns about the impact on marine life of a more scaled-up version.

Latency

Latency is the time it takes data to travel from source to destination: the longer the journey, the higher the latency. Microsoft say 50% of the earth’s population lives near the coast, and that having data centres closer to users would reduce latency and thus improve speeds and user experience.
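
To get a feel for the scale involved: signals in optical fibre travel at roughly two-thirds the speed of light in a vacuum, around 200,000 km per second, so propagation delay grows directly with distance. The sketch below is purely illustrative, with assumed distances rather than any Natick measurements:

```python
# Back-of-the-envelope propagation delay over optical fibre.
# Signals in fibre travel at roughly 2/3 the vacuum speed of light;
# the distances below are illustrative assumptions, not Natick figures.

FIBRE_SPEED_KM_PER_MS = 200_000 / 1_000   # ~200 km per millisecond

def round_trip_ms(distance_km):
    """Approximate round-trip propagation time, ignoring routing and queueing."""
    return 2 * distance_km / FIBRE_SPEED_KM_PER_MS

for km in (100, 1_000, 5_000):
    print(f"{km:>5} km away -> ~{round_trip_ms(km):.1f} ms round trip")
# 100 km -> ~1.0 ms, 1,000 km -> ~10.0 ms, 5,000 km -> ~50.0 ms
```

Real-world latency also includes routing, queueing, and server processing time, but the propagation component alone shows why siting data centres close to coastal populations can make a noticeable difference.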

Sink or Swim?

Conclusive results have yet to be released by Microsoft, so at this stage it is hard to judge whether Project Natick has been a complete success. Although the trial seems to bring positives in terms of latency and environmental impact, it will be the more practical elements that determine its advancement: for example, the state of the equipment after spending so long in the sea (salt water is corrosive, after all!), and the implications (and costs) if anything were to go wrong whilst the servers are underwater (conventional land-based data centres are easily accessible to engineers should something need fixing or replacing).

However, any attempt to speed up cloud computing whilst simultaneously benefiting the environment can only be a good thing. More and more people are migrating to the cloud, so it is admirable that Microsoft have a sustainable solution in the making; they plan to launch a new trial next year.

Who knows, maybe the Office 365 spreadsheet you are working on in a few years’ time will actually be hosted under the sea.

Further Reading

  • Google are using machine learning to optimise energy savings and improve data centre performance, and have committed to 35% renewable energy use
  • Facebook’s Luleå data centre is 100% powered by renewable energy
  • All of Apple’s data centres use 100% renewable energy sources
  • Greenpeace’s ‘Clicking Clean’ report on how each company scores in relation to creating a ‘green internet’
  • A new material is being studied that could replace silicon and make data centres more efficient