Training ChatGPT takes as much water as a nuclear reactor

Artificial intelligence consumes too much fresh water. Or at least that is the claim of researchers at the University of California, Riverside and the University of Texas at Arlington in a newly published study entitled Making AI Less Thirsty. Its basic idea is to highlight that energy consumption, the aspect we usually focus on, is only part of the problem. Water is in fact used to cool the data centers that keep digital services running, and in particular to train generative language models such as OpenAI's. Too much water, according to the two American universities.

“The growing carbon footprint of artificial intelligence (AI) models, especially large ones like GPT-3 (which underpins ChatGPT, ed.) and GPT-4, has often come under scrutiny,” the study reads. “Unfortunately, the equally important and very large water footprint has not received the same attention.” This applies to technology in general and to most other productive sectors, where what is usually measured is direct or energy-related greenhouse gas emissions rather than water consumption, unless we are talking about agriculture.

Apparently a medium-length conversation with ChatGPT is roughly equivalent to drinking a bottle of water. Training GPT-3 in Microsoft's US data centers would instead have consumed, according to the estimates, about 700,000 liters: enough to produce 370 BMWs or 320 Teslas, the study reads. Consumption would have tripled had the training been done in Microsoft's Asian data centers, which are less cutting-edge and therefore less optimized. The outlet Gizmodo went further, noting that this is roughly the amount needed to fill the cooling tower of a nuclear reactor.

Beyond comparisons of questionable value (quite different sectors, such as energy production or car manufacturing, are being set against a digital service), the aspect the research considers worrying is the growing scarcity of water resources worldwide. The situation is aggravated by the constant expansion of data centers linked to cloud services, although their consumption remains very small compared to the waste caused, for example, by faults in distribution networks. In Italy, for instance, 42% of the water transported is lost.

It is a pity, then, that the study lacks a comparison with other commonly used digital applications, from social networks to e-mail. Nor is it accurate to claim that the problem has never been considered, given that cooling has a cost. Microsoft itself launched Project Natick in 2015, placing a small data center in the sea at a depth of 35 meters precisely to avoid having to cool it; a second phase ran from 2018 to 2020.

It is true, however, that Natick was an experiment, and that the other improvements made to traditional cooling systems are not enough to keep consumption constant given the exponentially growing demand for computing power, which, for generative AI alone, doubles every two to three months according to Nvidia, one of the most important companies supplying the infrastructure behind artificial intelligence. Google's servers, in the US alone, consumed 12.7 billion liters of water for cooling in 2021, about 90% of it potable. To the 700,000 liters for cooling during GPT-3's training one must then add another 2.8 million liters attributable to electricity consumption. Combining data center cooling with the water used to generate the power that feeds it, the total comes to 3.5 million liters in the US, or 4.9 million liters in Asia.
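The study's bookkeeping above is a simple back-of-the-envelope sum, which can be sketched as follows; the figures are the article's quoted estimates, and the helper function is purely illustrative:

```python
# Back-of-the-envelope water footprint for training GPT-3,
# using the estimates quoted in the article (all values in liters).

def training_water_footprint(cooling_liters: int, electricity_liters: int) -> int:
    """Total water footprint = on-site data center cooling plus the water
    consumed off-site to generate the electricity the training used."""
    return cooling_liters + electricity_liters

# US data centers: 700,000 L for cooling + 2.8 million L for electricity
total_us = training_water_footprint(700_000, 2_800_000)
print(total_us)  # 3500000, matching the study's 3.5 million liter US estimate
```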

Certainly, GPT aside, without a strategy to address the global water challenge nearly half of the world's population will face difficult conditions by 2030, and about one in four children will live in areas of high water stress by 2040. The research concludes that “when and where to train a large AI model is essential”: the timing and geographical location of a data center can be chosen so as to minimize consumption. A problem arises, however, with solar energy, which cloud infrastructure increasingly relies on, partly because it costs less: it is produced most abundantly on the hottest and sunniest days, the very days when servers need the most cooling. Energy efficiency and water efficiency come into conflict here, and the right balance will have to be found in light of rising temperatures caused by human activity.
