AI Requires Lots Of Power

Foley & Lardner


Artificial intelligence (AI) is revolutionizing everything from health care and finance to entertainment and daily conveniences. Behind the seamless user experience and groundbreaking innovations lies a critical component that often goes unnoticed: the immense power required for AI processing.

AI systems, especially those based on deep learning and neural networks, demand substantial computational resources. These systems analyze vast amounts of data to learn patterns, make predictions, and improve over time. This process, known as training, involves running complex algorithms across powerful hardware, which consumes significant energy.

A report from Goldman Sachs, linked in the original publication, describes the scale of AI's demand for power.

High-performance GPUs (Graphics Processing Units) and TPUs (Tensor Processing Units) are vital to AI processing. Unlike traditional CPUs, GPUs and TPUs are designed for parallel processing, making them ideal for the matrix operations central to AI algorithms. However, this advanced processing capability comes at a cost: these units consume substantial amounts of electricity to function effectively.
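To make the "matrix operations" concrete: at bottom, AI workloads are ordinary multiply-and-add arithmetic repeated at enormous scale. The minimal sketch below (plain Python for illustration only, not how production AI code is written) shows the matrix multiplication at the heart of a neural-network layer. Each output value is an independent dot product, which is exactly the kind of work GPUs and TPUs spread across thousands of cores in parallel.

```python
# Illustrative sketch only: the matrix multiplication central to AI
# workloads, written in plain Python. Each output element is an
# independent dot product -- GPUs and TPUs compute thousands of these
# in parallel, which is what makes them both fast and power-hungry.
def matmul(a, b):
    rows, inner, cols = len(a), len(b), len(b[0])
    return [[sum(a[i][k] * b[k][j] for k in range(inner))
             for j in range(cols)]
            for i in range(rows)]

inputs = [[1, 2, 3]]                # one sample with 3 features
weights = [[1, 2], [3, 4], [5, 6]]  # a 3-input, 2-output "layer"
print(matmul(inputs, weights))      # [[22, 28]]
```

A production model repeats operations like this billions of times per query on specialized hardware, which is where the electricity goes.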

Data centers, where AI models are often trained and deployed, house thousands of servers – each containing multiple GPUs and TPUs, all working in tandem. The energy consumption of these data centers is substantial. According to some estimates, data centers account for about 1% of global electricity demand, a figure that is expected to rise as AI adoption grows.

The need for power doesn't end with training. Inference, the process of using a trained AI model to make predictions or decisions, also requires significant computational resources, especially in real-time applications. As AI becomes more embedded in everyday devices—from smartphones to autonomous vehicles—the demand for efficient, low-power AI processing solutions becomes even more pressing.

Efforts are underway to mitigate the environmental impact of AI's power consumption. Innovations in hardware design, such as more energy-efficient chips, and advancements in AI algorithms that reduce computational requirements, are critical areas of research. Additionally, leveraging renewable energy sources for data centers can significantly reduce the carbon footprint of AI operations.

While AI continues to push the boundaries of what's possible, it's essential to recognize and address the power needs that underpin this technology. Sustainable and efficient energy solutions will be key to ensuring that the benefits of AI can be enjoyed without compromising our planet's health.

On average, a ChatGPT query needs nearly 10 times as much electricity to process as a Google search.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
