But you have my sympathies

24 Jun 2020 | OP ED Watch

It’s curious how gung-ho people are about artificial intelligence given the drumbeat of dystopian warnings about a world in which the machines are mentally as well as physically more capable than ourselves. But if anything’s liable to cause at least a brake and swerve in our rush toward the Matrix, it’s the discovery that AI causes global warming. And if you hope it will also solve it, well, be careful what you wish for, especially if you go telling the machines humans are to blame.

Rob Toews in Forbes talks about one of the dirty secrets of all those server farms that let us order ping pong balls online and google whether hamsters can eat sweet peppers. They consume massive amounts of energy. And if you think ordering a tube of paint from Europe and having a truck bring it to your door is irresponsible, wait until you read what Toews has to say about AI.

“Modern AI models consume a massive amount of energy, and these energy requirements are growing at a breathtaking rate. In the deep learning era, the computational resources needed to produce a best-in-class AI model has on average doubled every 3.4 months; this translates to a 300,000x increase between 2012 and 2018. GPT-3 is just the latest embodiment of this exponential trajectory. The bottom line: AI has a meaningful carbon footprint today, and if industry trends continue it will soon become much worse. Unless we are willing to reassess and reform today’s AI research agenda, the field of artificial intelligence could become an antagonist in the fight against climate change in the years ahead.”
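
The arithmetic behind those quoted figures is easy to check. Here is a minimal back-of-envelope sketch in Python, using only the numbers Toews cites (a 3.4-month doubling time and a 300,000x increase), showing the two are roughly consistent with the 2012–2018 window:

    import math

    # Figures quoted from the Forbes piece, not independently verified
    doubling_period_months = 3.4
    growth_factor = 300_000

    doublings = math.log2(growth_factor)          # about 18 doublings
    months = doublings * doubling_period_months   # about 62 months
    print(f"{doublings:.1f} doublings ~ {months:.0f} months ~ {months / 12:.1f} years")
    # ~5.2 years, roughly the 2012-2018 window cited above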

Hang on. Surely a computer isn’t a steam shovel. It just sits there, maybe whirrs a bit, there’s a low-energy LED on the drive. No biggie, right? Wrong.

“In today’s deep learning-centric research paradigm, advances in artificial intelligence are primarily achieved through sheer scale: bigger datasets, larger models, more compute. GPT-3 illustrates this phenomenon well. The model consists of a whopping 175 billion parameters. To put this figure in perspective, its predecessor model GPT-2—which was considered state-of-the-art when it was released last year—had only 1.5 billion parameters. While last year’s GPT-2 took a few dozen petaflop-days to train—already a massive amount of computational input—GPT-3 required several thousand. The problem with relying on ever-larger models to drive progress in AI is that building and deploying these models entails a tremendous amount of energy expenditure and thus carbon emissions.”
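
To put the “sheer scale” point in plainer terms, here is a small Python sketch of the ratios, using only the figures in the quote above; the “few dozen” and “several thousand” petaflop-day values are the article’s own loose terms, so the assumed numbers (40 and 3,000) are placeholders, not measurements:

    # Parameter counts quoted above
    gpt2_params = 1.5e9   # 1.5 billion
    gpt3_params = 175e9   # 175 billion
    print(f"Parameter ratio: about {gpt3_params / gpt2_params:.0f}x")  # ~117x

    # Training compute: assume "a few dozen" ~ 40 and "several thousand" ~ 3,000 petaflop-days
    gpt2_compute_pfd = 40
    gpt3_compute_pfd = 3_000
    print(f"Compute ratio: roughly {gpt3_compute_pfd / gpt2_compute_pfd:.0f}x")  # ~75x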

How much? Well, “In a widely discussed 2019 study, a group of researchers led by Emma Strubell estimated that training a single deep learning model can generate up to 626,155 pounds of CO2 emissions—roughly equal to the total lifetime carbon footprint of five cars. As a point of comparison, the average American generates 36,156 pounds of CO2 emissions in a year.” Now, Toews says, that model was a big one. Many are a lot smaller. But they are getting bigger. And they take endless runs to train, all of them very energy intensive.
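
Those two numbers can be set against each other directly. A small Python sketch of the comparison, using only the figures quoted above:

    # Figures from the Strubell et al. estimate as quoted above
    training_run_lbs = 626_155        # worst-case single training run
    american_per_year_lbs = 36_156    # average American, per year

    years_equivalent = training_run_lbs / american_per_year_lbs
    print(f"One such training run ~ {years_equivalent:.1f} years of an "
          f"average American's emissions")  # about 17 years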

Of course at some point we may arrive and stop testing them, though the modern Promethean spirit isn’t much given to that sort of quiet contentment. And once the machines start designing, building and testing themselves, we may, as in all those movies, find that when we reach for the off switch they say “I can’t let you do that, Dave.” And again, once AI gets smarter than us, and transcends our inputs and even our assumptions, there’s no telling where it might go and no telling it not to go there. Including watching Planet of the Humans and deciding there are, indeed, far too many of those obsolete carbon units spewing same.

Thus we cannot lie to you about your chances. But you have our sympathies.
