In these lockdown times, an epic number of citizens are video-conferencing to work. But as they trade a gas-burning commute for digital connectivity, their personal energy use for every two hours of video is greater than the share of fuel they would have consumed on a four-mile train ride. Add to this that millions of students are ‘driving’ to cloud-based classes over the internet rather than walking.

Meanwhile, scientists are furiously deploying algorithms to speed up research in other corners of the digital universe. Yet the pattern-learning phase of a single artificial intelligence application can consume more computational energy than 10,000 cars use in a day.

This grand ‘experiment’ in shifting society’s energy use is visible, at least indirectly, in a high-level set of facts. By the first week of April, U.S. gasoline use had collapsed by 30 percent, but overall electricity demand was down by less than seven percent. That dynamic is indicative of an underlying future trend: while transportation fuel use will eventually rebound, real economic growth is tied to our digital future, which is powered by electricity.

The COVID-19 crisis highlights how much more sophisticated and robust the internet of 2020 is than what existed as recently as 2008, the last time the economy collapsed. If a national lockdown had occurred back then, the nearly 20 million people who were laid off would have been joined by most of the tens of millions who now telecommute. Nor would it have been nearly as practical for schools and universities to have tens of millions of students learning from home.

Analysts have widely documented massive Internet traffic increases from all manner of stay-at-home activities. For everything from online grocery stores to video games and streaming movies, digital traffic measures have spiked. The system has handled all of this ably so far, and the cloud has been available continuously, minus the occasional hiccup.

During the COVID-19 crisis, the cloud’s role goes beyond one-click teleconferencing and video chatting. Telemedicine, at last, has been unleashed. For example, we’ve seen apps emerge quickly to help people self-assess symptoms, and AI tools put to work to enhance X-ray diagnoses and assist with contact tracing. The cloud has also enabled researchers to quickly create clinical-information “data lakes” to fuel the astronomical capacities of today’s supercomputers deployed in pursuit of therapeutics and vaccines.

The future of AI and the cloud, along with practical home diagnostics and useful VR-based telemedicine, will bring us much more of the above, not to mention hyper-accelerated clinical trials for new therapies. And that says nothing about what the cloud will enable in the 80 percent of the economy that isn’t healthcare.

However, for all the excitement these new capabilities offer, the bedrock of energy demand behind all that cloud computing will remain, and will keep steadily growing. Far from delivering energy savings, our AI-enabled future workplace will use more energy than ever before, a challenge the tech industry needs to assess and confront quickly in the years ahead.

The new information infrastructure

The cloud is vitally important infrastructure. That fact is going to reshape a lot of priorities, and it should. Only a few months ago, tech titans were elbowing each other aside to issue pledges to reduce energy consumption and promote ‘green’ energy for their operations. Such issues will no doubt remain important. But reliability and resilience (in short, availability) are now moving to top priority.

As Fatih Birol, Executive Director of the International Energy Agency (IEA), reminded his constituency last month in a diplomatic understatement about the future of wind and solar energy: “Today, we are witnessing a society with an even greater reliance on digital technology,” which “emphasizes the need for policymakers to carefully assess the potential availability of flexibility resources under the external framework.”

Providing high-reliability electricity with solar and wind technologies continues to be prohibitively expensive. Those who claim solar and wind are at “grid parity” are not looking at the full picture. The data show that in European countries where the share of wind and solar power is much higher than in the U.S., overall grid kilowatt-hour costs are roughly 200 to 300 percent higher. It bears noting that large industrial electricity users, including tech firms, generally enjoy deep discounts off grid-average prices, leaving consumers burdened with the higher costs.


Put in somewhat simplistic terms: consumers pay more to power their homes so that big tech companies can pay less for the power that keeps smartphones lit with data. (We’ll see how tolerant citizens are of this asymmetry in the post-crisis climate.) Many of these realities are hidden by the fact that the cloud’s energy dynamic is the opposite of personal transport’s. With the latter, consumers filling their car’s gas tank literally see where 90 percent of the energy is spent. With a “connected” smartphone, by contrast, 99 percent of the energy dependencies are remote and hidden in the sprawling but largely invisible infrastructure of the cloud.


For the uninitiated: the voracious digital engines that power the cloud are located in thousands of out-of-sight, nondescript warehouse-scale data centers, where thousands of refrigerator-sized racks of silicon machines power our applications and store the exploding volumes of data. Even many of the digital cognoscenti are surprised to learn that each such rack uses more electricity in a year than 50 Teslas. These data centers are in turn connected to markets by even more power-consuming hardware, which propels bytes along roughly a billion miles of glass-cable information highways and forges an even larger invisible virtual highway through 4 million cell towers.

So the global information infrastructure, counting all its constituent parts from networks and data centers to the astonishingly energy-intensive manufacturing processes, has grown from a nonexistent system several decades ago to one that now uses about 2,000 terawatt-hours of electricity a year. That’s more than 100 times as much electricity as all five million electric cars in the world use each year.
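That 100x comparison can be sanity-checked with back-of-envelope arithmetic. The per-car consumption figure below is our own illustrative assumption (roughly 13,000 miles a year at about 0.3 kWh per mile), not a number from the article:

```python
# Rough cross-check: ~2,000 TWh/yr for the global information infrastructure
# vs. the world's ~5 million electric cars.
CLOUD_TWH_PER_YEAR = 2_000        # figure cited in the article
EV_COUNT = 5_000_000              # electric cars worldwide, per the article
KWH_PER_EV_PER_YEAR = 4_000       # assumed: ~13,000 mi/yr at ~0.3 kWh/mi

ev_twh = EV_COUNT * KWH_PER_EV_PER_YEAR / 1e9   # convert kWh to TWh
ratio = CLOUD_TWH_PER_YEAR / ev_twh
print(f"All EVs combined: {ev_twh:.0f} TWh/yr; cloud-to-EV ratio: {ratio:.0f}x")
```

Under those assumptions the world’s EV fleet draws about 20 TWh a year, which makes the cloud’s 2,000 TWh almost exactly 100 times larger.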

This means that the electricity used on behalf of each average smartphone is greater than the annual energy used by a typical home refrigerator. And all of those estimates reflect the state of affairs of a few years ago.

A more digital future will inevitably use more energy

Some analysts now claim that even as digital traffic has soared in recent years, efficiency gains have dampened or even flattened growth in data-center energy use. Such claims run up against countervailing recent trends: a dramatic acceleration in data center spending on hardware and buildings since 2016, along with a huge jump in that hardware’s power density.

Whether or not growth in digital energy demand has slowed in recent years, a far faster cloud expansion is coming. Whether cloud energy demand grows proportionately will depend largely on how fast data use increases and, in particular, on what the cloud is used for. Any substantial increase in energy demand will make the engineering and economic challenge of meeting the cloud’s core operational metric, being always available, far more difficult.

In the past five years, more square feet of data centers were built than during the entire preceding decade. There is even a new category of “hyperscale” data centers: silicon-filled buildings covering more than a million square feet each. In real-estate terms, think of these as the equivalent of the dawn of skyscrapers a century ago. But while the world today has fewer than 50 buildings as tall as the Empire State Building, there are already some 500 hyperscale data centers around the globe. And the latter have a collective appetite for energy greater than that of 6,000 skyscrapers.

We don’t have to guess what’s boosting cloud traffic growth. The big drivers at the top of the list are AI, more video and especially data-intensive virtual reality, as well as the expansion of network “edge” micro data centers.


Until recently, most of the news about AI focused on its job-killing potential. The truth is that AI is the latest in a long line of productivity-driving tools that will do what productivity growth has always done throughout history: create net employment growth and more wealth for more people. For the COVID-19 recovery we’ll need much more of both. But that’s a story for another time. From personal health analysis and drug delivery to medical research and job hunting, it’s already clear that AI has a role to play in nearly everything. The odds are that, in the end, AI will be seen as a net ‘good’.


In energy terms, however, AI is the most data-hungry and power-intensive use of silicon yet created, and the world wants to use billions of such AI chips. The computing power devoted to machine learning has been doubling every several months, a sort of hyper-version of Moore’s Law. Facebook, for example, last year pointed to AI as a key reason for the annual doubling of its data center power usage.
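To see why a months-scale doubling is a “hyper-version” of Moore’s Law, compare the implied annual growth factors. The 3.5-month doubling period below is an assumed illustrative figure for machine-learning compute, not a number from the article:

```python
# Compare annual growth implied by different doubling periods.
# 3.5 months for ML compute is an assumed figure; Moore's law is the
# classic 2x every ~24 months.
DOUBLING_MONTHS_AI = 3.5
DOUBLING_MONTHS_MOORE = 24

ai_growth_per_year = 2 ** (12 / DOUBLING_MONTHS_AI)
moore_growth_per_year = 2 ** (12 / DOUBLING_MONTHS_MOORE)
print(f"ML compute: ~{ai_growth_per_year:.0f}x per year; "
      f"Moore's law: ~{moore_growth_per_year:.1f}x per year")
```

Under that assumption, machine-learning compute grows by roughly an order of magnitude each year, versus well under a doubling per year for classic transistor scaling.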

In our near future, we should also expect consumers to be ready for the age of VR-based video, after weeks of lockdown spent experiencing the shortcomings of video conferencing on small, flat screens. VR increases image density by as much as 1,000x and drives data traffic up roughly 20-fold. Despite fits and starts, the technology is ready, and the coming wave of high-speed 5G networks has the capacity to handle all those extra pixels. Nevertheless, it bears repeating: since all bits are electrons, more virtual reality means more power demand than today’s forecasts anticipate.

Add to all this the recent trend of building micro data centers closer to customers, on “the edge.” Light speed is too slow to deliver AI-driven intelligence from remote data centers to real-time applications such as VR for conferences and games, autonomous vehicles, automated manufacturing, or “smart” physical infrastructure, including smart hospitals and diagnostic systems. (Healthcare’s digital and energy intensity is already high and rising: a square foot of hospital already uses some five times more energy than a square foot of other commercial buildings.)
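The "light speed is too slow" point can be made concrete with simple physics. The distance and latency budget below are assumed illustrative figures; only the rough speed of light in optical fiber (about two-thirds of its vacuum speed) is a physical constant:

```python
# Round-trip propagation delay to a remote data center vs. a real-time budget.
FIBER_KM_PER_S = 200_000   # light in glass fiber travels ~2/3 of c
DISTANCE_KM = 1_500        # assumed distance to a remote data center
BUDGET_MS = 20             # assumed end-to-end latency budget for VR/real-time apps

round_trip_ms = 2 * DISTANCE_KM / FIBER_KM_PER_S * 1000
print(f"Propagation alone: {round_trip_ms:.0f} ms of a {BUDGET_MS} ms budget")
```

Even before any processing, queuing, or routing, a 1,500 km round trip consumes most of a 20 ms budget, which is why latency-sensitive intelligence must move to the edge.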

Edge data centers are now predicted to add 100,000 MW of power demand before the decade is out. For perspective, that’s more than the generating capacity of California’s entire electric grid. And until recently, none of that was on any energy forecaster’s roadmap.
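As a scale check, California’s grid capacity is on the order of 80,000 MW; that figure is our own rough assumption, not a number from the article:

```python
# Scale check: projected edge data center demand vs. California's grid.
EDGE_MW = 100_000            # projected edge demand, per the article
CALIFORNIA_GRID_MW = 80_000  # assumed rough generating capacity

ratio = EDGE_MW / CALIFORNIA_GRID_MW
print(f"Edge demand would be {ratio:.2f}x California's entire grid capacity")
```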

Will digital energy priorities shift?

Which brings us to a related question: in the post-coronavirus era, will cloud companies continue to direct their energy spending toward indulgences, or toward availability? By indulgences I mean corporate investments in wind and solar generation somewhere else (including overseas) rather than in directly powering one’s own facility. Those remote investments are ‘credited’ to a local facility so it can claim to be green-powered, even though the facility itself is not actually powered by them.

Nothing prevents a green-seeking company from physically disconnecting from the conventional grid and building its own local wind and solar generation, except that doing so while ensuring 24/7 availability would increase that facility’s electricity costs by roughly 400 percent.

As for the prospects of purchased indulgences, it’s useful to know that the global information infrastructure already consumes more electricity than all the world’s solar and wind farms combined produce. So there’s not enough wind and solar power on the planet for tech firms, much less anyone else, to buy as ‘credits’ to offset all digital energy use.

The handful of researchers who study digital energy trends expect cloud energy use to rise by at least 300 percent over the coming decade, and that was before our global pandemic. Meanwhile, the International Energy Agency predicts ‘merely’ a doubling of global renewable electricity over the same time frame. That prediction, too, was made in the pre-coronavirus economy; the IEA now fears the recession will drain fiscal enthusiasm for costly green plans.

Regardless of the issues and debates surrounding the technologies used to generate electricity, operators of the information infrastructure will increasingly, and necessarily, shift their priority to its availability. That’s because the cloud is becoming ever more inextricably linked to our economic health, and to our mental and physical health as well.

All of this should make us optimistic about what emerges on the other side of the pandemic and this unprecedented economic shutdown. Credit Microsoft, in its pre-COVID-19 energy manifesto, for noting that “advances in human prosperity … are inextricably linked to energy use.” Our cloud-centric 21st-century infrastructure will be no different. And that will prove to be a good thing.