MSN Technology

Tech Solutions for a Smarter World

Can we make AI less power-hungry? These researchers are working on it.

Posted on March 24, 2025


In their tests, the team used a setup with NVIDIA's A100 and H100 GPUs, which are commonly used in data centers today, and measured how much energy they consumed while running different large language models (LLMs), diffusion models that produce images or videos based on text input, and many other types of AI systems.

The largest LLM on the leaderboard was Meta's Llama 3.1 405B, an open-source chat model with 405 billion parameters. It used 3,352.92 joules per request running on two H100 GPUs, which is about 0.93 watt-hours. The measurements also confirmed the improvement in hardware energy efficiency: Mixtral 8x22B was the largest LLM the team managed to run on both the Ampere and Hopper platforms. Running the model on two Ampere GPUs took noticeably more energy than running it on a single Hopper GPU, which needed just 0.15 watt-hours per request.
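The joules-to-watt-hours conversion behind these figures is straightforward (1 watt-hour = 3,600 joules); a minimal Python check using the numbers quoted above:

```python
# Convert a measured energy-per-request figure from joules to watt-hours.
# 1 watt-hour = 3600 joules.

def joules_to_wh(joules: float) -> float:
    """Return the energy in watt-hours for a value given in joules."""
    return joules / 3600.0

# Llama 3.1 405B: measured energy per request on two H100 GPUs.
llama_405b_joules = 3352.92
print(f"{joules_to_wh(llama_405b_joules):.2f} Wh")  # prints "0.93 Wh"
```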

What remains unknown, however, is the performance of proprietary models like GPT-4, Gemini, or Grok. The ML Energy Initiative team says it is very difficult for the research community to start solving the energy-efficiency problem when we do not even know exactly what we are facing. We can make estimates, but Chung insists that those need to be accompanied by error-bound analysis. Today we have nothing like that.

According to Chung and Chowdhury, the most important problem is the lack of transparency. "Companies like Google or OpenAI have no incentive to talk about power consumption. If anything, releasing the actual numbers would hurt them," Chowdhury said. "But people should understand what is actually happening, so maybe we should somehow nudge them into releasing some of those numbers."

Where the rubber meets the road

"Energy efficiency in data centers follows a trend similar to Moore's law," he said, adding that the power drawn by a rack, which in data centers holds between 10 and 14 Nvidia GPUs, is increasing, but performance is improving even faster.
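As a rough illustration of why per-rack power is climbing, the GPUs alone in a 10-to-14-GPU rack draw several kilowatts. The ~700 W per GPU used below is an assumed value (roughly a modern data-center GPU's rated power), not a figure from the article:

```python
# Back-of-the-envelope rack power, using the 10-14 GPUs-per-rack figure above.
# GPU_POWER_W is an assumption (approximately an H100 SXM's rated draw);
# the article itself gives no per-GPU power number.
GPU_POWER_W = 700

def rack_power_kw(num_gpus: int, gpu_power_w: float = GPU_POWER_W) -> float:
    """Power drawn by the GPUs in one rack, in kilowatts."""
    return num_gpus * gpu_power_w / 1000

for n in (10, 14):
    print(f"{n} GPUs: {rack_power_kw(n):.1f} kW (GPUs alone, excluding cooling)")
```

Even this lower bound, which ignores CPUs, networking, and cooling, puts a single rack at roughly 7 to 10 kW.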

