
Energy consumption in training and inference of artificial intelligence (AI) models, such as large language models (LLMs) like GPT-4, has become a critical challenge due to its environmental impact and the costs associated with high-performance computing (HPC). The energy required to train and interact with these models increases significantly as their size and complexity grow.
Training an AI model involves feeding its neural networks large volumes of data and running many iterations to optimise the model’s parameters and improve its results. These tasks require substantial processing power and therefore considerable electricity. According to Schwartz et al., the computational cost of training machine learning (ML) models increased 300,000-fold in just six years (2013–2019), doubling roughly every 3.4 months. More recent studies indicate that, since 2016, this cost has continued to grow by a factor of about 2.4 per year. Focusing solely on AI systems and assuming an intermediate scenario, by 2027 the data centres and servers hosting AI models are projected to consume between 85 and 134 terawatt hours (TWh) annually. This is comparable to the yearly electricity consumption of countries such as Argentina, the Netherlands, or Sweden, and roughly 0.5% of current global electricity consumption.
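As a quick sanity check on these figures, the short Python sketch below recomputes the doubling time implied by a 300,000-fold increase over six years and compares the projected 85–134 TWh with global electricity consumption. The global figure of roughly 26,000 TWh per year is an assumption used only for illustration; it does not come from the studies cited above.

```python
import math

# Back-of-the-envelope checks on the figures quoted in the text.
# The global electricity consumption value is an assumed ballpark figure.

# 1) A 300,000-fold increase in training compute over six years implies a
#    doubling time of a few months.
growth_factor = 300_000
period_months = 6 * 12
doublings = math.log2(growth_factor)        # about 18 doublings
doubling_time = period_months / doublings   # about 4 months per doubling
print(f"Implied doubling time: {doubling_time:.1f} months")

# 2) Projected AI-related demand of 85-134 TWh/year as a share of an assumed
#    global electricity consumption of ~26,000 TWh/year.
global_twh = 26_000
for ai_twh in (85, 134):
    share = 100 * ai_twh / global_twh
    print(f"{ai_twh} TWh is about {share:.2f}% of global consumption")
```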
However, to fully understand the environmental impact, it is essential to take a holistic view of the ML ecosystem that goes beyond model training and includes the operational carbon footprint of ML. Inference (using a trained model to make real-time predictions) can also consume significant energy, particularly in large-scale applications. Although a single inference consumes far less energy than training, widespread and continuous use in settings such as customer service systems, content generation, or autonomous driving can result in a greater cumulative energy impact. In fact, for services like ChatGPT, inference has been identified as the primary driver of emissions, producing 25 times the carbon emissions of training in a single year, according to Chien et al.
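To see how continuous inference can overtake a one-off training cost, consider the rough sketch below. Every figure in it (training energy, energy per request, request volume) is an assumption chosen only to illustrate the cumulative effect, not a measurement of ChatGPT or any other real service.

```python
# Illustrative only: cumulative inference energy versus a one-off training cost.
# All values are assumptions for the sake of the example.

training_energy_kwh = 1_000_000      # assumed one-off training energy
energy_per_query_kwh = 0.003         # assumed energy per inference request
queries_per_day = 10_000_000         # assumed daily request volume

daily_inference_kwh = energy_per_query_kwh * queries_per_day
breakeven_days = training_energy_kwh / daily_inference_kwh
annual_ratio = daily_inference_kwh * 365 / training_energy_kwh

print(f"Inference matches the training cost after about {breakeven_days:.0f} days")
print(f"Over a year, inference uses about {annual_ratio:.0f}x the training energy")
```

Whether this also translates into a larger carbon footprint depends on the carbon intensity of the electricity used for each phase.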
This high energy consumption has several implications:
- Economic Costs: Training large-scale models is expensive due to the substantial hardware requirements and the energy needed to power data centres.
- Environmental Impact: Prolonged training with extensive electricity use contributes to carbon emissions, especially when non-renewable energy sources are involved, thereby challenging AI’s sustainability.
- Optimisation: Developing more efficient training and inference methods is crucial, including specialised hardware such as GPUs and TPUs and strategies like federated learning and model compression (a minimal compression sketch follows this list).
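As an illustration of the compression strategy mentioned in the last point, the sketch below applies post-training dynamic quantisation to a small placeholder network using PyTorch. It is a generic example of the technique, not the optimisation method recommended or implemented by GAISSALabel.

```python
# Minimal sketch: post-training dynamic quantisation with PyTorch.
# The model is a placeholder standing in for a much larger network.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(512, 512),
    nn.ReLU(),
    nn.Linear(512, 10),
)

# Replace the Linear layers with int8 dynamically quantised versions, which
# typically shrinks the model and lowers inference energy at a small accuracy cost.
quantised = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 512)
print(quantised(x).shape)  # torch.Size([1, 10])
```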
In this context, the Software and Service Engineering Group (GESSI) at the Universitat Politècnica de Catalunya - BarcelonaTech (UPC) is developing GAISSA, a project that has led to the GAISSALabel software tool. This energy label evaluates the energy efficiency of any machine learning model during both the training and inference phases, making it particularly valuable for HPC services. GAISSALabel can assess the footprint of ML models, helping to improve efficiency, reduce operational costs, and shorten computation times.
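As an example of the kind of measurement an energy label builds on, the sketch below records the estimated emissions of a toy training run with the open-source codecarbon library. The model, training loop, and project name are placeholders; this illustrates the general approach, not GAISSALabel’s own implementation.

```python
# Sketch: measuring the footprint of a training run with codecarbon.
# Model, data, and loop are placeholders for a real workload.
import torch
import torch.nn as nn
from codecarbon import EmissionsTracker

model = nn.Linear(64, 1)
optimiser = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()
x, y = torch.randn(1024, 64), torch.randn(1024, 1)

tracker = EmissionsTracker(project_name="demo-training-run")
tracker.start()
try:
    for _ in range(100):
        optimiser.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimiser.step()
finally:
    emissions_kg = tracker.stop()  # estimated kg CO2-eq for the run

print(f"Estimated emissions for this run: {emissions_kg:.6f} kg CO2-eq")
```

The same pattern can wrap an inference loop to compare the footprint of the two phases.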
GAISSALabel will enable AI model developers to:
- Monitor energy consumption and generate detailed reports.
- Identify areas for improvement to reduce energy consumption when implementing ML model training and inference.
- Optimise models to make them more efficient in terms of time and resources.
By following the refactoring recommendations provided by GAISSALabel, ML engineers can enhance the energy efficiency of ML systems by up to 50%. This improvement not only leads to economic savings (e.g., lower electricity bills for data centres and HPC facilities), but also reduces the environmental impact of AI applications.
GAISSALabel offers solutions at three levels:
- Social: Promoting commitment to the energy efficiency of ML systems and empowering end users to select and use sustainable ML solutions.
- ML software providers: Providing them with the resources needed to develop sustainable ML systems, supported by management and reporting tools aligned with regulatory requirements.
- Individual: Assisting data scientists and ML engineers in understanding and managing energy efficiency in production by raising awareness and providing insights into the energy consumption of ML systems and tools.
The project began in December 2022 and is scheduled to conclude in September 2025, with a total budget of €277,035. It is funded by the Spanish Ministry of Science, Innovation and Universities, with contributions from the Next Generation EU funds.
