With data centers shutting down thanks to extreme temperatures, should we get used to outages?
One of Twitter's main data centers in California was reportedly knocked out of action by extreme heat earlier this month, raising questions about the resilience of tech infrastructure to climate change.
According to internal memos seen by CNN, soaring temperatures in Sacramento led to a shutdown that threatened service to users. Carrie Fernandez, the company's vice president of engineering, warned Twitter engineers that the company was in a “non-redundant state.” If similar outages were to occur at the company's Atlanta and Portland data centers, she wrote, “we may not be able to serve traffic to all Twitter's users.”
The memos say that Twitter has been handling the problem by halting all non-critical updates, such as deployments and releases to mobile platforms, other than those needed to maintain continuity of service or other “urgent operational needs.”
The story supports statements made by whistleblower Peiter Zatko, who recently told the US Congress that Twitter had no plan for recovering from simultaneous data center outages.
Twitter hasn't confirmed the reports. If true, however, it wouldn't be the only such incident during the recent global heat wave. Earlier this summer, for example, both Google and Oracle were forced to shut down data centers in the UK, causing widespread outages for their clients.
Indeed, according to research from the Uptime Institute, almost half of data center operators have experienced an extreme weather event that has threatened their continuous operation, with nearly one in ten saying service had been disrupted.
And other tech infrastructure has also been affected, with Verizon recently forced to switch to emergency generators and backup batteries at six of its mobile switching centers in order to keep its network running.
Of course, extreme heat is far from the only effect of climate change. High winds, flooding, fire, and sea level rise can all imperil critical infrastructure.
Firms are reacting in different ways. AT&T, for example, has worked with the US Department of Energy to develop a climate change analysis tool that projects flooding and winds in the Southeastern US over the next 30 years. The data is used to help plan the company's 5G network facilities.
Meanwhile, the National Oceanography Centre is working to determine where climate change is impacting subsea cable resilience and develop strategies to adapt.
Many data center operators are increasing their cooling capacity, while others are shifting their facilities to northern climates or even underwater. However, this can't be a universal solution. In many regions of the world – large parts of Africa, for example – temperatures are consistently high, and large volumes of water are unavailable.
There are mitigations: passive cooling, for example, which ensures hot and chilled air do not mix, as well as immersion liquid cooling, where servers sit in a rack filled with a coolant a thousand times more effective than air.
However, these solutions can only go so far. What seems likely is that infrastructure developers will build in more redundancy – extra backup data centers and so forth – but only to the degree they see as absolutely necessary.
According to the Uptime Institute survey, more than a third of respondents said that their management has yet to formally assess the vulnerability of data centers to climate change. And as the planet heats up and extreme weather events become more frequent, the industry will remain one step behind. It seems likely that outages, whether at data centers or telecoms infrastructure, will become a far more frequent occurrence in the future.