project
Electrification

AI and the Entropy Economy

3 min read

AI is driving innovation at lightning speed, but the push for faster and easier must not overshadow opportunities to reduce carbon emissions while optimizing energy-efficient learning.

Entropy Infographic

Sustainable innovation is driving GE Vernova Advanced Research’s game-changing concept, the Entropy Economy. As energy consumption increases globally, we must connect how we optimize energy systems with how we optimize computational systems. Jointly optimizing the two by managing their shared entropy flow is an opportunity to significantly reduce carbon emissions and energy costs while maximizing energy-efficient learning.

Did you know? By 2030, 10% to 20% of energy used worldwide is expected to be consumed by computers, including large data centers and high-performance computing (HPC) facilities. As our appetite for digital services grows, so too do our opportunities to innovate and evolve.

Today’s energy systems and computational systems are optimized separately. The energy grid, for example, provides low-cost power around the world irrespective of how efficiently that energy is used. Likewise, the CERN-sponsored LHC Computing Grid optimizes use of computational capacity without regard to energy use. Waste heat capture is a missed opportunity, and there is much more we can do with reuse. Simply put, our world lacks an efficiency metric for AI.

The Entropy Economy takes a holistic and unique approach to addressing the predicted exponential rise in energy consumed by compute within the next decade. GE Vernova seeks to jointly optimize learning, energy efficiency, and disposition of waste heat through a combination of energy-aware machine learning (EAML), grid architectures, and distributed HPC infrastructure.

Optimizing entropy reduction in learning while addressing entropy flow lost to thermodynamic inefficiencies can lessen carbon production, increase energy efficiency, and ultimately help stabilize the grid.

Learn More 

Scott Evans presented “Causality and Green AI: Can Causal AI Help Solve the Climate Crisis?” at the Causal AI 2024 conference in San Francisco on June 4, 2024. Watch Scott's presentation


Scott Evans presented “The Entropy Economy and the Kolmogorov Learning Cycle” at the Symposium on Algorithmic Information Theory and Machine Learning, July 4-5, 2022 at the Alan Turing Institute in London, UK. Watch Scott's presentation

Project Impact

Energy-aware machine learning (EAML) – Key to executing the Entropy Economy is the development of EAML algorithms that enable tradeoffs between HPC throughput, energy consumed, and output quality. Future work will produce EAML algorithms capable of balancing energy loads, learning from needless entropy flow loss, and adjusting to create ideal energy profiles.
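To make the throughput/energy/quality tradeoff concrete, here is a minimal Python sketch (not GE Vernova's actual algorithm; all configuration names and numbers are hypothetical) of an energy-aware selection step: given several candidate training configurations, pick the highest-quality one that fits an energy budget.

```python
# Illustrative EAML-style tradeoff: choose the training configuration
# that maximizes output quality subject to an energy budget.
# All configurations and figures below are hypothetical.

from dataclasses import dataclass

@dataclass
class Config:
    name: str
    quality: float      # expected output quality (e.g., validation accuracy)
    energy_kwh: float   # expected energy consumed for the full run
    throughput: float   # samples/sec on the HPC system

def pick_config(configs, energy_budget_kwh):
    """Return the highest-quality configuration that fits the budget."""
    feasible = [c for c in configs if c.energy_kwh <= energy_budget_kwh]
    if not feasible:
        # Fall back to the most frugal option if nothing fits.
        return min(configs, key=lambda c: c.energy_kwh)
    return max(feasible, key=lambda c: c.quality)

configs = [
    Config("fp32-large", quality=0.95, energy_kwh=120.0, throughput=800),
    Config("fp16-large", quality=0.94, energy_kwh=70.0,  throughput=1400),
    Config("fp16-small", quality=0.90, energy_kwh=30.0,  throughput=2200),
]

best = pick_config(configs, energy_budget_kwh=80.0)
```

In this toy example the budget rules out the full-precision run, so the selector accepts a small quality loss in exchange for roughly 40% less energy, the kind of tradeoff EAML aims to make automatically.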

Grid architectures – The second focus of the Entropy Economy is to move information work machines – better known as data centers and HPCs – to where renewable, low-cost energy exists through an optimized compute/energy grid architecture. The work here shifts to a more holistic vision of the electric grid that seeks to jointly optimize the power source and information work systems.  
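One way to picture this compute/energy grid co-optimization is a placement step that routes an information workload to wherever renewable, low-cost power is available. The Python sketch below is a hypothetical illustration (region names, carbon intensities, and capacities are assumptions, not real data).

```python
# Hypothetical carbon-aware placement: route a workload to the grid
# region with the lowest carbon intensity that still has spare capacity.
# Region data and thresholds are illustrative assumptions.

regions = {
    "hydro-north": {"carbon_g_per_kwh": 20,  "spare_mw": 5.0},
    "wind-coast":  {"carbon_g_per_kwh": 45,  "spare_mw": 1.0},
    "mixed-south": {"carbon_g_per_kwh": 380, "spare_mw": 12.0},
}

def place_workload(regions, required_mw):
    """Pick the feasible region with the lowest carbon intensity."""
    feasible = {name: r for name, r in regions.items()
                if r["spare_mw"] >= required_mw}
    if not feasible:
        return None  # defer the job until capacity frees up
    return min(feasible, key=lambda name: feasible[name]["carbon_g_per_kwh"])

site = place_workload(regions, required_mw=2.0)
```

A real grid architecture would also weigh transmission constraints, latency, and price signals, but the core idea is the same: treat the data center as a movable, schedulable load.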

Distributed HPC infrastructure – This component considers the distribution of HPCs throughout the grid that could deliver smart loads by making tradeoffs enabled by the EAML algorithms. HPCs can optimize the use of recovered power while simultaneously achieving desired accuracy. Leveraging EAML algorithms can enable dynamic changes in numeric precision depending on the available recoverable power. This, in turn, addresses the challenge of meeting desired-accuracy constraints under a variable power supply.
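A minimal sketch of that precision/power tradeoff, assuming hypothetical accuracy and power figures (none of these numbers come from the project), might look like this in Python:

```python
# Illustrative dynamic-precision selection: lower numeric precision when
# recoverable power is scarce, while respecting an accuracy floor.
# The accuracy and power tables are hypothetical assumptions.

# Assumed expected accuracy for each precision level.
ACCURACY = {"fp64": 0.960, "fp32": 0.955, "fp16": 0.940, "int8": 0.900}
# Assumed power draw (kW) for each precision level.
POWER_KW = {"fp64": 40.0, "fp32": 22.0, "fp16": 12.0, "int8": 6.0}

def select_precision(available_kw, min_accuracy):
    """Choose the lowest-power precision that meets the accuracy floor
    and fits within the recoverable power currently available."""
    candidates = [p for p in ACCURACY
                  if ACCURACY[p] >= min_accuracy
                  and POWER_KW[p] <= available_kw]
    if not candidates:
        return None  # pause or migrate the job
    return min(candidates, key=lambda p: POWER_KW[p])

prec = select_precision(available_kw=15.0, min_accuracy=0.93)
```

When recoverable power is plentiful the same routine simply returns a higher precision, so accuracy constraints are met without over-drawing the grid.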

Executed in concert, these three Entropy Economy components are poised to balance the energy load throughout the grid while increasing supercomputer and data center AI capacity with the same or less overall carbon production.

“We talk about ‘using’ energy, but doesn’t one of the laws of nature say that energy can’t be created or destroyed? … When we ‘use up’ one kilojoule of energy, what we’re really doing is taking one kilojoule of energy in a form that has low entropy (for example, electricity), and converting it into an exactly equal amount of energy in another form, usually one that has much higher entropy (for example, hot air or hot water). When we’ve ‘used’ the energy, it’s still there; but we normally can’t ‘use’ the energy over and over again, because only low entropy energy is ‘useful’ to us... It’s a convenient but sloppy shorthand to talk about the energy rather than the entropy…” 

David MacKay, Sustainable Energy — without the hot air.

Publications

The Entropy Economy: A New Paradigm for Carbon Reduction and Energy Efficiency for the Age of AI

Using ML Training Computations for Grid Stability in 2050

Optimizing Emissions for Machine Learning Training

Deep Time Series Sketching and Its Applications on Industrial Time Series Clustering

Energy Efficient Streaming Time Series Classification with Attentive Power Iteration

The Entropy Economy and the Kolmogorov Learning Cycle: Leveraging the Intersection of Machine Learning and Algorithmic Information Theory to Jointly Optimize Energy and Learning

Project Lead

Scott Evans

Principal Scientist, AI-Machine Learning

Project Team

Tapan Shah

Senior Engineer

Hao Huang

Machine Learning Scientist

Alexander Duncan

Lead Engineer

Blake Rose

Edison Engineer