At the turn of the century, computing power was centralized. It made sense: the more powerful the machine, the more it could do for us. So we built large data centers, often in rural areas where land and power were cheap, to store and process our ever-growing volumes of digital information.
But as our appetite for data has exploded — thanks to the rise of streaming services, social media, the Internet of Things, and artificial intelligence — so too has the energy required to keep those giant data warehouses running. In fact, data centers now account for about 1% of the world’s electricity usage, and that number is only expected to grow.
Why Current Infrastructure Is Bad for the Environment
The problem is twofold: not only are data centers becoming increasingly energy-hungry as they scale to meet our insatiable demand for data, but the way they are powered often results in large carbon footprints. According to a report from Greenpeace, the ICT sector (which includes data centers) is responsible for about 3.7% of global greenhouse gas emissions.
There are a number of reasons why data centers have such a large environmental impact.
- First, the hardware itself is often quite energy-intensive. Servers, storage devices, and networking equipment all need to be kept cool and running smoothly, which requires a lot of electricity.
- Second, data centers typically run on fossil-fuel-heavy electricity. In 2018, coal made up about 38% of global electricity generation and natural gas another 23%. Because data centers are among the biggest consumers of electricity in the world, and most of that electricity still comes from unclean sources, they are indirectly responsible for a large share of power-plant carbon dioxide emissions.
- Third, the internet infrastructure that supports our data cravings is also generally quite wasteful. When you stream a video on Netflix, for example, the data has to travel from the company’s servers over a complex network of wires and cables, often crisscrossing the globe, before it finally reaches your device.
Clearly, something has to change.
The Environment Cannot Sustain the Exponential Rise of Data If Current Infrastructure Is Maintained
It is projected that by 2025, 463 exabytes of data will be created every day. That is roughly the data volume of about 8 billion HD video streams running around the clock.
To put that into perspective, that is more than ten times the volume of data the world creates today. Clearly, our current infrastructure is not equipped to handle this level of growth.
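The streaming comparison is easy to sanity-check. Below is a rough back-of-the-envelope sketch; the 5 Mbps bitrate assumed for an HD stream is an illustrative figure, not part of the projection itself.

```python
# Back-of-the-envelope check of the "8 billion HD streams" comparison.
# The 5 Mbps bitrate for an HD stream is an illustrative assumption,
# not a figure taken from the projection itself.

daily_data_bytes = 463e18          # 463 exabytes created per day
hd_bitrate_bps = 5e6               # assumed bitrate of one HD stream, bits/second
seconds_per_day = 24 * 60 * 60

# How many bytes one continuous HD stream consumes in a day
bytes_per_stream_per_day = hd_bitrate_bps / 8 * seconds_per_day

concurrent_streams = daily_data_bytes / bytes_per_stream_per_day
print(f"~{concurrent_streams / 1e9:.1f} billion HD streams running around the clock")
# -> roughly 8.6 billion continuous streams
```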
Data center energy use has been doubling every four years since 2000, and if this trend continues, data centers will consume around 20% of the world’s electricity by 2025.
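To make the scale of that trend concrete, here is the simple compounding arithmetic behind a four-year doubling time; it is a sketch of what the trend implies, not an independent forecast.

```python
# The compounding arithmetic behind a four-year doubling time:
# how much larger data center energy use would be in 2025 than in 2000
# if the doubling trend simply continued.

doubling_period_years = 4
years_elapsed = 2025 - 2000

growth_factor = 2 ** (years_elapsed / doubling_period_years)
print(f"~{growth_factor:.0f}x the energy use of the year 2000 by 2025")
# -> roughly 76 times the 2000 level
```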
This is simply not sustainable.
If data centers were a country, they would be the fifth-largest emitter of greenhouse gases in the world.
It is evident that something has to change. The way we power and structure our data infrastructure needs to be rethought from the ground up if we want to avoid an environmental catastrophe.
The Case for Edge Computing
Edge computing is a distributed computing model in which data is processed at the edge of the network, close to the source of the data. Instead of sending all data to central servers for processing, devices and local nodes handle most of it themselves and transmit only what genuinely needs further processing, which lowers latency and can cut energy consumption by as much as 60%.
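To make the idea concrete, the following is a minimal, hypothetical sketch of edge-side processing: a device summarizes raw sensor readings locally and sends only the small summaries onward, rather than every reading. The sensor, window sizes, and function names are illustrative assumptions, not a real product or API.

```python
# A minimal, hypothetical sketch of the edge-computing idea described above:
# a device summarizes raw sensor readings locally and only transmits the
# summaries, instead of shipping every reading to a central server.
# All names and numbers here are illustrative assumptions, not a real API.

import random
import statistics

def read_sensor_window(size: int) -> list[float]:
    """Simulate one window of raw temperature readings from a local sensor."""
    return [20.0 + random.gauss(0, 0.5) for _ in range(size)]

def summarize(window: list[float]) -> dict:
    """Reduce a window of raw readings to a small summary computed at the edge."""
    return {
        "mean": round(statistics.mean(window), 2),
        "max": round(max(window), 2),
        "n": len(window),
    }

def main() -> None:
    raw_values_sent = 0      # values we would have sent without edge processing
    summary_values_sent = 0  # values actually sent after local aggregation

    for _ in range(100):                     # 100 windows of readings
        window = read_sensor_window(600)     # e.g. 10 minutes of samples at 1 Hz
        summary = summarize(window)          # processed locally, at the edge
        raw_values_sent += len(window)
        summary_values_sent += len(summary)  # only the summary leaves the device

    reduction = 1 - summary_values_sent / raw_values_sent
    print(f"Data transmitted shrinks by roughly {reduction:.1%}")

if __name__ == "__main__":
    main()
```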
There are a number of reasons why edge computing is a more sustainable way to power our data-driven world:
- Edge computing reduces the amount of data that needs to be transmitted, which in turn reduces energy consumption.
- By processing data closer to the source, edge computing can reduce latency, which means that less energy is required to keep devices and applications running smoothly.
- Edge computing can also reduce the carbon footprint of our data infrastructure by drawing on renewable energy sources such as solar and wind power. Because edge nodes are distributed by nature, they can sit close to small-scale renewable generation and avoid much of the energy otherwise spent transmitting data over long distances.
- By decentralizing the processing of data, edge computing can make data center infrastructure more resilient in the face of natural disasters or other disruptions.
The bottom line is that edge computing is a more environmentally friendly way to power our data-driven world, one that can help organizations cut energy costs while meeting their sustainability goals. If as little as 25% of the world’s data centers could be replaced with edge nodes, a very conservative estimate, the world would save US$13 billion in energy costs annually.
In addition, edge computing can help to create new green jobs in the clean energy sector. A recent study estimates that by 2025, there could be as many as 7 million new jobs created worldwide as a result of the transition to edge computing.
To conclude, the environmental case for edge computing is clear. Edge computing can help to reduce energy consumption, lower latency, and make data center infrastructure more resilient. In addition, it can create new green jobs and help organizations to save money on energy costs. The transition to edge computing is the way forward if we want to avoid an environmental catastrophe.
Editor’s Note: The opinions expressed here by Impakter.com columnists are their own, not those of Impakter.com
In the cover picture: Servers. Photo Credit: Unsplash.