
Electricity consumption by AI is on the rise, and will keep rising.

Apr 7, 2023

AI is eating up electricity / by Investman

In the technology industry, probably no one is unfamiliar with ChatGPT, which marks the beginning of the age of AI.

From another angle, however, many people may not realize that AI comes with a major challenge: it consumes an enormous amount of energy, and that consumption is likely to grow as the technology advances.

Today, let's see just how much power AI really consumes.
Investman will tell you about it.
It would not be wrong to say that AI is simply upgraded computer software.

Even though every step of AI's work is more complicated than that of the computers we use today, it still relies on data processing and runs on electricity, no different from an ordinary computer.

With current technology, ordinary computers are not capable of handling AI workloads.

One of the main reasons is that “the amount of data AI uses to learn is enormous.”

Because the data fed in for learning and analysis is enormous, AI requires far more computing power.

Take ChatGPT, which we are using right now: behind it sits a supercomputer.

The key processing tool is the "graphics card" (GPU), and ChatGPT runs on more than 10,000 NVIDIA A100s.

These support learning from up to 570 gigabytes of text, or about 612,032 million characters.

To visualize this, it is comparable to roughly 564,000 copies of the entire Harry Potter series.
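As a rough sanity check on those figures, here is a minimal Python sketch of the arithmetic. It assumes one character per byte and treats a gigabyte as a binary gigabyte (1024³ bytes); the length of the Harry Potter series mentioned in the comments is a commonly cited approximation, not a figure from this article.

```python
# Rough arithmetic behind the "570 gigabytes of text" figure.
# Assumptions (not from the article): ~1 character per byte, and
# "gigabyte" meaning a binary gigabyte of 1024**3 bytes.

GB = 1024 ** 3                      # bytes per binary gigabyte
training_text_chars = 570 * GB      # ~1 character per byte

print(f"{training_text_chars / 1e6:,.0f} million characters")
# -> 612,033 million, matching the article's ~612,032 million figure

# Dividing by 564,000 "copies of the Harry Potter series" gives the
# implied size of one copy of the whole series:
per_copy = training_text_chars / 564_000
print(f"~{per_copy:,.0f} per copy of the series")
# -> ~1,085,165, close to the commonly cited ~1.08 million words in
#    all seven novels, so the comparison likely counts words rather
#    than characters
```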

No human being could ever read that much, but this is no obstacle for AI, which is how it can outperform the average person.

It is true that AI can do better, but in exchange it consumes far more electricity. ChatGPT, for example, is estimated to use as much as 1.2 gigawatt-hours of electricity.

That is equivalent to the electricity consumption of 120 American households over a full year.
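A quick back-of-the-envelope check of that equivalence is sketched in Python below. The roughly 10,700 kWh per household per year is an approximate U.S. average assumed here for illustration; it is not a number given in the article.

```python
# Does ~1.2 GWh really correspond to 120 U.S. households for a year?
# Assumption (not from the article): an average U.S. household uses
# roughly 10,700 kWh of electricity per year.

KWH_PER_HOUSEHOLD_PER_YEAR = 10_700
households = 120

total_kwh = households * KWH_PER_HOUSEHOLD_PER_YEAR
print(f"{total_kwh / 1e6:.2f} GWh")
# -> 1.28 GWh, in line with the ~1.2 gigawatt-hour estimate above
```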

In addition, some assessments estimate that AI used within an organization can account for as much as 10-15% of that organization's total electricity consumption.

Another consequence of AI’s use of large amounts of data is the need to store more data.

The best place to store data for AI work is the “data center”.

Currently, there are more than 8,000 data centers worldwide.
Of these, around 1,000 are large data centers, up from only about 400 in 2016.

This large number of data centers results in more electricity consumption, from processing and storage as well as from the cooling systems inside the data centers.
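One standard way to express the cooling and facility overhead mentioned above is PUE (Power Usage Effectiveness): the ratio of a data center's total electricity use to the electricity used by the IT equipment alone. The sketch below is purely illustrative; the IT load and PUE values are assumed numbers, not figures from the article.

```python
# Illustrative PUE (Power Usage Effectiveness) calculation.
# PUE = total facility power / IT equipment power.
# The values below are assumptions chosen for illustration only.

it_load_mw = 10.0   # servers, storage and networking
pue = 1.5           # a fairly typical industry-average PUE

total_facility_mw = it_load_mw * pue
overhead_mw = total_facility_mw - it_load_mw   # cooling, power distribution, losses

print(f"Total facility load: {total_facility_mw:.1f} MW")    # 15.0 MW
print(f"Cooling and other overhead: {overhead_mw:.1f} MW")   # 5.0 MW
```

The closer PUE is to 1.0, the less electricity goes to overhead such as cooling; the most efficient modern facilities get close to that, while older ones can sit well above 1.5.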

Data from the University of Pennsylvania in the United States indicates that in the future our computers will consume more electricity, driven both by greater processing power and by a wider scope of use.

In the past, computing accounted for only 1-2% of the world's total electricity, but in the future it may consume as much as 8-21%.

By this point, it should be clear that the electricity demand created by AI is a serious issue that cannot be overlooked.

The human brain may have limitations in processing and storing information, but it uses only about 40 watt-hours of energy a day, roughly the same as running a notebook computer for a single hour.

AI, by contrast, has enough computational power to change our world in many ways, but at the cost of huge amounts of energy, a cost that is only likely to grow.

The problem, therefore, is not just about how we develop AI or how much AI will change our world.

We also have to keep in mind the energy we must spend on this technology: Will there be enough? And how much will it affect the energy crisis on our planet in the future?
