YSL News

Efficient AI Models: Computer Scientists Redefine Power Consumption

The creation of AI models has a significant impact on the environment, but it is often not taken into consideration. Researchers at the University of Copenhagen have put together a guide for designing AI models that are more energy-efficient while still maintaining high performance. They believe that the energy usage and environmental impact of a model should be a key factor in its design and training.

The development of AI models is an overlooked climate culprit. Computer scientists at the University of Copenhagen have created a recipe book for designing AI models that use much less energy without compromising performance. They argue that a model's energy consumption and carbon footprint should be a fixed criterion when designing and training AI models.

It is no secret that AI services such as Google Search, Siri, and ChatGPT require significant amounts of energy. Research suggests that by 2027, AI servers will use as much energy as Argentina or Sweden, and a single ChatGPT prompt can consume as much energy as forty mobile phone charges. Yet the research community and the industry have only begun to prioritize the development of AI models that are energy-efficient and thus more climate-friendly.

Computer science researchers at the University of Copenhagen point out that developers today concentrate on building AI models that are accurate, without regard for their energy efficiency. Assistant Professor Raghavendra Selvan likens this to calling a car effective just because it gets you to your destination quickly, without considering its fuel consumption. As a result, AI models are often inefficient in their energy use.

Selvan's research focuses on ways to reduce AI's carbon footprint. In this new study, he and computer science student Pedram Bakhtiarifard, together with a co-author, have shown that CO2e emissions can be reduced substantially without sacrificing a model's accuracy, by factoring climate costs into the design and training stages of AI models.

“By developing a model that prioritizes energy efficiency from the start, we can minimize the carbon footprint throughout the entire lifecycle of the model. This includes the energy-intensive training phase, which can last for weeks or months, as well as its practical application,” explained Selvan.

Guidelines for the AI industry

In their study, the researchers computed the energy required to train more than 400,000 convolutional neural network AI models, without actually training them. Convolutional neural networks are used for purposes such as analyzing medical imagery, translating language, and recognizing objects and faces, a function you may know from your smartphone's camera app. From these calculations, the researchers assembled a benchmark collection of AI models that consume less energy to solve a given task while maintaining similar performance.

The study shows that adjusting a model, or switching to a different type of model, can save 70-80% of the energy used during training and deployment, at the cost of only about a 1% drop in performance, an estimate the researchers consider conservative. They suggest that AI practitioners use the findings as a guide not only to how well different algorithms perform, but also to how energy-efficient they are. Because substituting one model design for another often yields the same results, practitioners can choose a model on the basis of both performance and energy consumption.

As Pedram Bakhtiarifard explains, choosing the right AI model from the beginning can yield large savings in both performance and energy, without having to train every candidate. In practice, many models are trained and discarded before the most suitable one for a given task is found, which makes the process highly energy-intensive. It is therefore more climate-friendly to select, from the outset, a model that does not consume excessive power during training.

The researchers emphasize that in fields such as self-driving cars or certain areas of medicine, model precision is critical for safety, and performance should not be compromised there. But this should not deter the field from striving for high energy efficiency elsewhere.
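The selection logic described above, keeping only models whose accuracy is close to the best candidate and then preferring the cheapest of those, can be sketched as a simple filter. The model names and figures below are invented for illustration and are not taken from the study's benchmark.

```python
# Illustrative sketch: among candidate models, keep those whose accuracy is
# within a small tolerance of the best, then rank them by training energy.
# All names and numbers here are made up for demonstration purposes.

def efficient_choices(models, tolerance=0.01):
    """Return models within `tolerance` of the best accuracy, cheapest first."""
    best_acc = max(m["accuracy"] for m in models)
    close_enough = [m for m in models if m["accuracy"] >= best_acc - tolerance]
    return sorted(close_enough, key=lambda m: m["energy_kwh"])

candidates = [
    {"name": "model_a", "accuracy": 0.94, "energy_kwh": 1200},
    {"name": "model_b", "accuracy": 0.93, "energy_kwh": 310},   # ~74% less energy
    {"name": "model_c", "accuracy": 0.90, "energy_kwh": 150},   # accuracy too low
]

for m in efficient_choices(candidates):
    print(m["name"], m["energy_kwh"])
```

In this toy example, model_b is preferred: it sits within one percentage point of the best accuracy while using roughly a quarter of the energy, the kind of trade-off the study reports.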

“AI has incredible potential. But to ensure sustainable and responsible AI development, we need a more comprehensive approach that considers not only model performance, but also climate impact. We demonstrate that it is possible to find a better balance. When developing AI models for different tasks, energy efficiency should be a fixed criterion — just as it is standard in many other industries,” concludes Raghavendra Selvan.

The "recipe book" assembled in this study is an open-source dataset that other researchers can use for experimentation. Information on all 423,000 architectures is published on GitHub, where AI practitioners can access it using simple Python scripts.
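Since the dataset is described as queryable with Python scripts, a lookup over such a tabular benchmark might look like the sketch below. The column names and sample values here are assumptions made for illustration; they do not reflect the actual schema of the GitHub repository.

```python
# Hypothetical sketch of querying a tabular benchmark exported as CSV.
# Column names ("arch_id", "val_accuracy", "train_energy_kwh") are invented
# and stand in for whatever schema the real repository uses.
import csv
import io

sample = io.StringIO("""arch_id,val_accuracy,train_energy_kwh
a1,0.941,1.8
a2,0.938,0.6
a3,0.905,0.3
""")

rows = list(csv.DictReader(sample))

# Find the cheapest architecture within 0.5 percentage points of the best accuracy.
best = max(float(r["val_accuracy"]) for r in rows)
cheap = min(
    (r for r in rows if best - float(r["val_accuracy"]) <= 0.005),
    key=lambda r: float(r["train_energy_kwh"]),
)
print(cheap["arch_id"])
```

The point of a tabular benchmark is exactly this kind of lookup: the energy and accuracy numbers are precomputed, so comparing thousands of architectures costs next to nothing.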

Researchers from UCPH calculated the energy required to train 429,000 convolutional neural networks, a type of AI model, included in the dataset. These models are used for various purposes such as object detection, language translation, and medical image analysis.

The study estimated that the training process alone for the 429,000 neural networks would consume 263,000 kWh of energy. That is equivalent to what an average Danish citizen consumes over 46 years, and it would take a single computer about 100 years to perform the training. Instead of training the models themselves, the authors estimated their energy use with another AI model, saving 99% of the energy the task would otherwise have required.
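The figures above can be cross-checked with a little arithmetic. The per-citizen annual consumption below is derived purely from the article's own numbers, not from an independent statistic.

```python
# Cross-check of the energy figures quoted in the text above.
total_kwh = 263_000   # estimated energy to train all the networks
danish_years = 46     # stated equivalent in years of one Danish citizen's use

# Implied annual consumption of an average Danish citizen:
per_citizen_per_year = total_kwh / danish_years
print(f"{per_citizen_per_year:.0f} kWh/year")

# Estimating the models instead of training them saves ~99% of the energy,
# so only about 1% is actually spent:
energy_spent = total_kwh * (1 - 0.99)
print(f"{energy_spent:.0f} kWh")
```

This works out to roughly 5,700 kWh per citizen per year, a plausible figure for household electricity use, and about 2,600 kWh actually spent by the estimation approach.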

Training AI models consumes a lot of energy and thereby emits a lot of CO2e, owing to the intensive computations performed on powerful computers. This is especially true for large models, such as the language model behind ChatGPT, which typically run in data centers that require significant amounts of power, both to keep the computers running and to keep them at a suitable temperature. The environmental impact of these facilities also depends on the type of energy they use, which may come from non-renewable sources.

Journal Reference:

  1. Pedram Bakhtiarifard, Christian Igel, Raghavendra Selvan. EC-NAS: Energy Consumption Aware Tabular Benchmarks for Neural Architecture Search. ICASSP 2024. DOI: 10.1109/ICASSP48485.2024.10448303

