Data centers should get greener and smaller

The data center of the future has gotten quite dirty. A whole lot of muck, in fact. So much that a worker unpacks a high-pressure cleaner and hoses the data center down. The shower leaves the data storage inside unfazed. It is waterproof, after all.

“Natick” is the name of the data center of the future – at least if Microsoft’s engineers are to be believed. Two years ago, the software company sank the white steel tank, packed with 864 servers, into the North Sea. This summer it brought Natick back to the surface, covered in algae, small crabs and sea anemones. Inside, the data center was unimpressed by all of that. It did, Microsoft says, exactly what it was supposed to do: compute, and do so more economically and efficiently than its counterparts on land.

The software company is not alone in these research efforts. Other digital companies are also working to make their data centers fit for the future. They are no longer supposed to be cost drivers, energy guzzlers and bottlenecks; instead they are to become smaller, greener and more decentralized. And that is urgently needed if the dream of a thoroughly digital world is not to come to an abrupt end.

After all, nothing works these days without data centers: the mostly inconspicuous buildings are the backbone of the Internet. This is where the images that people upload to Instagram are stored. This is where Netflix videos are streamed from. Google searches are processed here, as are Amazon orders and the battles of the computer game “Fortnite”. More and more computing happens here – for all those processes that have been outsourced to the nebulous “cloud”, that global, invisible computer. In short: without the estimated eight million data centers around the world, no smartphone would work and no digital information would reach its recipient.

But data centers are also huge power guzzlers. In 2018, the centers around the world drew a good 200 terawatt hours from the grid – around one percent of global electricity consumption. That is the conclusion Eric Masanet of Northwestern University reaches in a recent study that the engineer and his colleagues published in the journal Science. It could have been worse: since 2010, the global computing load in the centers has increased sixfold, Internet traffic by a factor of ten, and storage capacity even by a factor of 25. Nevertheless, according to Masanet, power consumption has risen by only six percent in the same period. Economical processors and ever better utilized servers have made this possible.
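A rough calculation makes clear what these figures imply. The two growth factors below are simply the ones quoted from the study; the result is only a back-of-the-envelope estimate of how much more efficient each unit of computation has become.

```python
# Back-of-the-envelope estimate from the figures above (2010 to 2018):
# computing load up sixfold, total power consumption up only six percent.

compute_growth = 6.0    # factor by which the global computing load grew
power_growth = 1.06     # factor by which electricity consumption grew

energy_per_unit_of_compute = power_growth / compute_growth
print(f"Energy per unit of computation: {energy_per_unit_of_compute:.2f} of the 2010 level")
# -> about 0.18, i.e. each computation needs roughly 80 percent less energy than in 2010
```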

But can it go on like this? In 2023, the number of Internet users worldwide will exceed the five billion mark for the first time, estimates the network equipment provider Cisco. Two thirds of all people will then be able to store their cat pictures online. The amount of data will also grow rapidly, by around 60 percent per year, as the consulting firm IDC has determined. By the middle of the decade it could reach 175 zettabytes, or 175 quadrillion megabytes. IDC estimates that almost half of that data will live in the cloud – and thus in data centers. Can all of this be managed without energy consumption going through the roof?

Not with today’s technology: so far, data centers have been rather inhospitable places. Anyone who gets the chance to visit such a data temple stands amid rows of tall cabinets. Each one is stacked to the top with servers, none of them bigger than a half-height cutlery drawer from the kitchen. Diodes flash; it is loud, cold, dry and drafty.

Servers run hot as soon as they do complex calculations – much like the old laptop when it plays videos for hours

There is a reason for the uninviting environment: servers run hot as soon as they do complex calculations – much like the old laptop at home on the sofa when it has to play videos for hours. To dissipate the heat, data centers have so far mostly blown cold air into the server room through holes in the floor. The airflow is directed over the processors and extracted again at the ceiling – by then significantly warmer from the waste heat of the computers. The principle works quite well for server cabinets drawing up to 20,000 watts.

In the future, however, engineers expect 100,000 watts per cabinet or more. Dissipating such amounts of heat with air alone would be extremely inefficient and expensive. The power requirement for cooling, which in today’s data centers is between ten and 20 percent of the total energy consumption, would increase massively. Therefore, the centers are increasingly switching to water-cooled systems.
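A rough estimate illustrates why air alone becomes impractical at that scale. The temperature rise of 15 degrees between supply and exhaust air is an assumption chosen for illustration; the specific heat and density of air are standard textbook values, not figures from the article.

```python
# Rough estimate: how much air must flow past a single 100,000-watt cabinet
# to carry its heat away, assuming the air warms by 15 degrees on the way through.

heat_load_w = 100_000   # expected future power draw per cabinet, in watts
cp_air = 1005           # specific heat of air, J/(kg*K)
rho_air = 1.2           # density of air, kg/m^3
delta_t = 15            # assumed warming of the air, in kelvin (illustrative)

mass_flow = heat_load_w / (cp_air * delta_t)   # kg of air per second
volume_flow = mass_flow / rho_air              # cubic meters of air per second

print(f"{mass_flow:.1f} kg/s, about {volume_flow:.1f} m^3 of air per second")
# -> roughly 6.6 kg/s, or 5.5 cubic meters of air every second, for one cabinet
```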

Microsoft is taking a different approach. Instead of pumping water through its servers, the software company wants to put its servers into the water. In the white computing cylinder called Natick, sunk in June 2018 off the Scottish Orkney Islands at a depth of 35 meters, fresh water from an internal, closed cooling circuit is fed past the processors. The water heats up and flows on through pipes to a heat exchanger, which transfers the energy to the sea – with no water bill to fear. A similar system is used to keep the inside of submarines cool.

The biggest concern in the run-up to Natick was that algae or other sea creatures would settle on the cooling fins of the 14-meter-long steel tank and impede the heat exchange, says project manager Ben Cutler on the Microsoft website. The engineers therefore experimented with various coatings; even sound and ultraviolet light were discussed as ways to drive off marine life. In the end, an alloy of copper and nickel prevailed. The material conducts heat well and at the same time resists the growth of marine organisms, but it is somewhat more expensive than conventional heat exchangers.

Fears that the surrounding water would heat up noticeably from the power of the submerged data center – 240 kilowatts, after all – apparently did not materialize. A few meters from the steel cylinder, the measured temperatures were only a few thousandths of a degree Celsius higher than before, writes project manager Cutler in the trade journal IEEE Spectrum. However, the measurement data have not yet been published in independent scientific journals. It is also unclear what effects huge server farms, composed of many individual computing cylinders, would have on the marine environment.

For Stockholm’s municipal utilities, however, it cannot get hot enough. The Swedes are going in the opposite direction: they want to use the waste heat from data centers to heat homes. Water from the cooling systems, at up to 85 degrees Celsius, is fed into the city’s existing district heating network. According to the Stockholm engineers, ten megawatts of power are enough to heat 20,000 apartments. For comparison: a modern large data center, such as those operated by Facebook, reaches 120 megawatts. By 2035, ten percent of the city of Stockholm is to be heated with waste heat from data centers.
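The figures above can be checked with simple arithmetic. The sketch below uses only the numbers quoted in the article and, purely for illustration, scales the same ratio up to a 120-megawatt facility.

```python
# Simple arithmetic with the Stockholm figures quoted above.

waste_heat_w = 10_000_000   # ten megawatts of waste heat
apartments = 20_000         # apartments that this is said to heat

per_apartment_w = waste_heat_w / apartments
print(f"Average heating power per apartment: {per_apartment_w:.0f} W")   # -> 500 W

# Purely illustrative: scaling the same ratio to a 120-megawatt data center.
large_dc_w = 120_000_000
print(f"Apartments heatable at the same ratio: {large_dc_w / per_apartment_w:,.0f}")  # -> 240,000
```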

Nordic countries are already very popular with data center operators: the climate is frosty, which keeps the cost of cooling down. The electricity is cheap (or, as in Sweden, heavily subsidized) and mostly comes from renewable sources. Facebook, for example, has built a huge data center in Luleå, Sweden, right next to a hydropower plant. The power for the Natick cylinder off the Orkney Islands also comes from wind, sun and waves. According to Microsoft, the experiment has shown that a data center can be run on a power mix that was previously considered “unreliable”.

Based on the weather forecast, the algorithm predicts the hours in which a particularly large amount of green electricity can be expected

Unreliable, but above all impractical: all the major digital corporations claim to rely on electricity from renewable sources. In most cases, however, the companies buy global green-power certificates, while the actual electricity comes from the nearest coal-fired power station. To become greener locally as well, Google has recently been experimenting with a new algorithm in its data centers: based on the weather forecast for the coming day, it predicts the hours in which a particularly large amount of renewable electricity can be expected and schedules non-urgent computing tasks into precisely those periods. As project manager Ana Radovanovic writes on the internet giant’s blog, these tasks include processing videos and training the company’s own translation software.
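Google has not published the scheduler itself, so the following is only a minimal sketch of the principle described above: rank the hours of the coming day by their forecast share of green electricity and place flexible jobs into the greenest ones. The function name, the 24 hourly values and the six-hour workload are invented for the example.

```python
# Minimal sketch of carbon-aware scheduling, assuming an hourly forecast of the share
# of renewable electricity for the coming day. Not Google's actual scheduler; all
# names and numbers are illustrative.

def plan_flexible_work(renewable_share_by_hour, hours_needed):
    """Pick the hours with the highest forecast share of green electricity."""
    ranked = sorted(range(len(renewable_share_by_hour)),
                    key=lambda h: renewable_share_by_hour[h],
                    reverse=True)
    return sorted(ranked[:hours_needed])

# Hypothetical forecast for the next 24 hours (fraction of renewables in the power mix).
forecast = [0.35, 0.33, 0.32, 0.31, 0.30, 0.32, 0.40, 0.52,
            0.63, 0.71, 0.78, 0.82, 0.85, 0.83, 0.79, 0.72,
            0.60, 0.48, 0.42, 0.40, 0.38, 0.37, 0.36, 0.35]

# Non-urgent jobs such as video processing or model training go into the greenest hours.
print(plan_flexible_work(forecast, hours_needed=6))
# -> [10, 11, 12, 13, 14, 15]: around midday, when solar output peaks in this example
```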

“The first results show that the climate-friendly load distribution works,” says Radovanovic, though without giving concrete figures on CO2 savings. In any case, there is potential: according to Google’s estimates, only about two thirds of the company’s computations have so far been done with green electricity. Artificial intelligence is also supposed to help adapt the cooling systems better to the predicted computing load. Three degrees less room temperature theoretically reduces energy costs by a quarter. In practice, Google claims to have cut electricity consumption by 30 percent this way.

The problem: up in the north, where it is cool and the electricity is clean, hardly anyone needs all those masses of data. The metropolitan areas are elsewhere. Over large distances, however, the latency increases – the term computer scientists use for the delay in retrieving information: if a data center is 100 kilometers away, a thousandth of a second passes before it can react to a click. If 5,000 kilometers lie between computer and server, 50 thousandths of a second pass. That is negligible when streaming a movie. But if word processing is moved from the local machine to the cloud, requiring constant interaction with the server, high latency becomes noticeable.
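The two latency figures follow directly from the speed at which signals travel through fiber-optic cables, roughly 200,000 kilometers per second, and from the fact that a click has to make the round trip to the server and back. A quick check of the arithmetic:

```python
# Reproducing the latency figures above: signals in optical fiber travel at roughly
# 200,000 km/s (about two thirds of the speed of light), and a click must make the
# round trip from user to server and back.

SIGNAL_SPEED_KM_PER_S = 200_000

def round_trip_ms(distance_km):
    return 2 * distance_km / SIGNAL_SPEED_KM_PER_S * 1000

print(f"{round_trip_ms(100):.0f} ms for a data center 100 km away")      # -> 1 ms
print(f"{round_trip_ms(5000):.0f} ms for a data center 5,000 km away")   # -> 50 ms
```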

The trend is therefore towards small, decentralized data centers right on the doorstep. Natick should also contribute to this. More than half of all people live less than 200 kilometers from a coast, according to Microsoft. Data centers sunk in the sea – efficient, quickly connected and without high property costs – could therefore represent a good alternative.

But only if a diver does not have to stop by for repairs every few days. Data centers like Natick, named after a town in the US state of Massachusetts, are therefore designed to run autonomously – for years, until the end of their planned lifespan. Off the Orkney Islands, that apparently worked well. According to Microsoft, some servers did stop working during the test run; overall, however, the failure rate was only one eighth that of a comparable land-based data center.

Project manager Cutler credits the dry nitrogen atmosphere in the hermetically sealed cylinder, which prevented corrosion, and the absence of temperature fluctuations. And he points out that no technicians shuffled through the data center, accidentally bumping into servers, yanking out cables or causing other chaos.

All of that, however, could also be had without grimy steel cylinders encrusted with algae and crustaceans: with autonomous, completely maintenance-free data centers on land. Microsoft, Google & Co. are already working on that, too.
