Haley Zaremba

Haley Zaremba is a writer and journalist based in Mexico City. She has extensive experience writing and editing environmental features, travel pieces, local news in the…

IBM's New Analogue Chip Boosts AI Energy Efficiency

  • IBM's analogue chip can run an AI speech recognition model 14 times more efficiently than traditional chips due to its compute-in-memory (CiM) design.
  • The chip's phase-change memory cells can represent synaptic weights in neural networks, reducing computational effort and energy consumption.
  • The AI industry's energy demands have surged, with AI's carbon footprint nearing that of some developed nations; this chip could be a solution to these rising concerns.

A new breakthrough in tech may have just solved artificial intelligence’s energy problems. AI requires a massive and growing amount of energy, producing more and more greenhouse gas emissions as the sector expands. The issue has been the subject of increasing attention and anxiety in recent months, but those worries could soon be a thing of the past thanks to a new kind of analogue computer chip developed by IBM Research.

The analogue chip can run an AI speech recognition model 14 times more efficiently than a standard computer chip. It uses a compute-in-memory (CiM) design, meaning it performs calculations directly within its own memory instead of sending information back and forth millions of times to recall or store data in external memory chips, thereby relieving a significant bottleneck currently plaguing AI operations.
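
To make the data-movement bottleneck concrete, here is a minimal back-of-the-envelope sketch in Python. The per-operation energy figures and the 100:1 movement-to-compute ratio are illustrative assumptions chosen for the sketch, not measurements from IBM’s chip.

# Toy energy model of why compute-in-memory (CiM) helps: in a conventional
# design, every multiply-accumulate also pays to fetch a weight from external
# memory; in CiM the weight never leaves the memory cell that stores it.
# All energy numbers below are illustrative assumptions, not IBM figures.

E_MAC = 1.0              # energy of one multiply-accumulate (arbitrary units)
E_FETCH = 100.0          # assumed cost of fetching one weight from off-chip memory
N_WEIGHTS = 35_000_000   # on the order of the chip's 35 million memory cells

conventional = N_WEIGHTS * (E_MAC + E_FETCH)   # compute + data movement
compute_in_memory = N_WEIGHTS * E_MAC          # weights stay in place

print(f"conventional:      {conventional:.2e}")
print(f"compute-in-memory: {compute_in_memory:.2e}")
print(f"energy ratio:      {conventional / compute_in_memory:.0f}x")

Under these toy numbers, data movement dominates the energy budget, which is exactly the cost CiM is designed to remove.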

“IBM’s device contains 35 million so-called phase-change memory cells – a form of CiM – that can be set to one of two states, like transistors in computer chips, but also to varying degrees between them,” New Scientist reported this week. This is a huge breakthrough for computing, as “these varied states can be used to represent the synaptic weights between artificial neurons in a neural network, a type of AI that models the way that links between neurons in human brains vary in strength when learning new information or skills, something that is traditionally stored as a digital value in computer memory.” Thanks to this innovation, the analogue chip is able to store and process weights with just a fraction of the computing effort typically required. 
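
A rough illustration of that weight-storage idea appears below, in Python with NumPy. The conductance range, the 16 distinguishable levels, and the linear weight-to-conductance mapping are all assumptions made for the sketch; real designs typically use more elaborate encodings, such as differential pairs of cells for signed weights.

import numpy as np

# Sketch: store synaptic weights as analogue cell states between two extremes
# (think amorphous/low conductance vs crystalline/high conductance), then let
# a read of the whole array perform a matrix-vector product in place.
# Encoding details here are assumptions, not IBM's actual scheme.

G_MIN, G_MAX = 0.1, 1.0   # assumed conductance range of one cell
LEVELS = 16               # assumed number of distinguishable analogue states

rng = np.random.default_rng(0)
weights = rng.uniform(-1.0, 1.0, size=(4, 8))   # one small synaptic layer

# Map each weight in [-1, 1] to the nearest of the LEVELS analogue states.
normalized = (weights + 1.0) / 2.0                    # -> [0, 1]
quantized = np.round(normalized * (LEVELS - 1)) / (LEVELS - 1)
conductance = G_MIN + quantized * (G_MAX - G_MIN)     # stored cell states

# Applying input voltages x across the array sums the resulting currents,
# so reading out I = G @ x computes the layer's weighted sums in one step.
x = rng.uniform(0.0, 1.0, size=8)
currents = conductance @ x
print(currents)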

The efficiency of these new analogue chips could solve not only AI’s energy-use issue but also its chip-use issue. Training AI programs can require an enormous number of computer chips, with thousands sometimes used on a single project. This, too, has become a problem due to a worldwide computer chip shortage coupled with a boom in new AI ventures. In particular, AI companies are having unprecedented difficulty securing a type of chip known as a graphics processing unit, or GPU, which has (until now) been the most efficient form of chip for AI’s processing needs. The shortage has left startups and smaller companies “scrambling” and taking “desperate measures” to secure the essential chips, a recent New York Times article reported.

This kind of energy-saving innovation for AI can’t come fast enough. The scale of present-day energy needs for machine learning is enormous and growing at a breakneck pace. The sector’s energy use grew 100-fold between 2012 and 2021 and has spiked dramatically since ChatGPT hit the market and spurred an AI gold rush. And most of that energy is derived from fossil fuels. Already, the overall carbon footprint of artificial intelligence is almost as large as that of Bitcoin – meaning it’s equivalent to the carbon footprint of some developed nations. “Currently, the entire IT industry is responsible for around 2 percent of global CO2 emissions,” Science Alert recently reported. But AI is on track to blow those numbers out of the water. Consulting firm Gartner projects that in a business-as-usual scenario, the AI sector alone will consume 3.5 percent of global electricity by 2030.

It has been estimated that the training process for GPT-3, which later evolved into ChatGPT, required around 1,287 megawatt-hours of electricity and a whopping 10,000 computer chips. To put this in perspective, that amount of energy could power about 121 homes in the United States for an entire year – and the training run produced around 550 tonnes of carbon dioxide in the process. At present, experts calculate that OpenAI, the creator of ChatGPT, is likely spending approximately US$700,000 per day on computing costs alone to provide the chatbot’s services to 100 million users worldwide.
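
Those figures hold up to simple arithmetic. In the Python check below, the average annual consumption of a US home (~10.6 MWh, an assumption based on typical EIA averages) is the only number not taken from the paragraph above.

# Sanity-check the reported figures with back-of-the-envelope arithmetic.

TRAINING_MWH = 1287          # reported energy to train GPT-3
HOME_MWH_PER_YEAR = 10.6     # assumed average annual US household consumption

print(f"homes powered for a year: {TRAINING_MWH / HOME_MWH_PER_YEAR:.0f}")  # ~121

DAILY_COMPUTE_COST_USD = 700_000   # reported OpenAI daily computing cost
USERS = 100_000_000                # reported worldwide users

print(f"compute cost per user per day: ${DAILY_COMPUTE_COST_USD / USERS:.4f}")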

By Haley Zaremba for Oilprice.com 
