The past is never dead.
Two years ago, Microsoft sank an entire data center off the coast of Scotland; it held 864 servers and around 27.6 petabytes of storage, 117 feet beneath the surface. Now, two years later, Microsoft says this experiment showed that data centers can survive underwater.
You may be wondering why anyone would drop an entire data center to the bottom of the ocean; isn’t that a little strange? According to the hypothesis formulated by Microsoft’s Project Natick team, placing servers underwater can result in more reliable and energy-efficient data centers.
Data centers on land face several issues – humidity, temperature fluctuations, corrosion, and so on. In a sealed underwater enclosure, by contrast, the environment is highly controlled as far as temperature and corrosion are concerned. Microsoft believes such data centers can be deployed without worrying about scarce real estate along the coast, and they can give nearby coastal areas better access to cloud-based resources.
Microsoft has repeatedly stressed the benefits of underwater servers. It reports that the underwater data center had one-eighth the failure rate of a comparable land-based data center. That’s a great sign, because a low failure rate is essential here: it’s extremely challenging to service a busted server when you know it’s sealed in an airtight container at the bottom of the ocean.
This isn’t the first time Microsoft has explored underwater possibilities. In 2015, the company submerged a data center off the coast of California for a couple of months just to find out whether computers could even survive such a trip. The goal was to evaluate how practical the idea was, to justify further rounds of trials, and to check whether it had any real-world application.