How we use AI at Tapio

Reading time: 4 minutes



Artificial Intelligence (AI) is everywhere right now. From business tools to WhatsApp, AI seems to have become a must-have rather than an optional extra.

At the same time, tools like ChatGPT have received backlash for their heavy environmental impact, because they consume large amounts of water, energy and raw materials.

However, the way the word “AI” is used often creates misunderstandings: the term Artificial Intelligence covers much more than Large Language Models (LLMs) such as ChatGPT, Gemini and Falcon, and each type of AI has a different environmental impact.

At Tapio, we’re building carbon management software to help companies measure and reduce their carbon footprint, so we want to use AI responsibly. In this article, we will explain how.


What is AI, really?


Artificial Intelligence is a broad term that includes everything from the logic that allows you to play chess against a computer to more complex generative models like LLMs. Simply put, AI refers to computer programs that can perform tasks that typically require human intelligence.

To get a bit more technical, here are some of the terms that fall under AI, with their definitions (N.B.: this is not a complete list):

  • Artificial intelligence: the process of imitating human intelligence, based on the creation and application of algorithms.
  • Machine learning: the use and development of computer systems that can learn and adapt without following explicit instructions, using algorithms and statistical models to analyse and draw inferences from patterns in data.
  • Deep learning: a type of machine learning based on artificial neural networks in which multiple processing layers extract progressively higher-level features from data. LLMs are very large deep learning models.
  • Generative AI: computer systems that can mimic intelligent human behaviour and produce new content, such as text.

So when people talk about AI, they can mean various things.


The environmental cost of AI


Not all types of AI have the same environmental impact. When people talk about the environmental impact of AI, they're usually referring to LLMs, since these models have by far the heaviest footprint.

Training, fine-tuning and deploying LLMs require large amounts of energy and water. Data centers also need large quantities of raw materials and contribute to the growth of electronic waste.

Because this impact is so significant, we're currently working on including the carbon emissions of AI in Tapio's own carbon report, and we invite every company to do the same.
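To give a sense of what that means in practice, the core of such a calculation is simply energy use multiplied by an emission factor for the electricity consumed. The sketch below is a minimal illustration with placeholder values, not our actual methodology, and it leaves out the embodied emissions of the hardware mentioned above.

```python
# A rough sketch of the core calculation behind reporting AI emissions:
# energy consumed multiplied by a grid emission factor. All numbers here
# are placeholder assumptions, not Tapio's actual figures or methodology.
GRID_EMISSION_FACTOR = 0.25  # kgCO2e per kWh, depends on the local electricity mix

def ai_emissions_kg(energy_kwh: float, factor: float = GRID_EMISSION_FACTOR) -> float:
    """Estimate the operational emissions of an AI workload from its energy use."""
    return energy_kwh * factor

# Example: a workload that consumed 120 kWh over a month
print(ai_emissions_kg(120))  # -> 30.0 kgCO2e
```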


How we use AI at Tapio


We understand that AI can improve the efficiency of a task. However, at this point in our journey, we don't want to trade a few minutes of efficiency for a spike in Tapio's greenhouse gas emissions. If we use AI, we want to make sure it brings a real advantage to the work of carbon experts.

We’re currently working on a few AI-powered features that will let carbon experts save significant time during data collection and the review of carbon reports. It’s also possible to use AI models in a more responsible way. Here’s how you can do it:

  1. Choose AI models that are trained locally instead of models trained on more energy-consuming external GPUs;
  2. Choose AI models that are already trained, but not fine-tuned; in most cases, this doesn’t affect the accuracy of your models (a short sketch of this follows the list).
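As an illustration of the second point, here is a minimal Python sketch of what “already trained, but not fine-tuned” looks like in practice. The library and model name are assumptions chosen for the example (an open-source embedding model run on a CPU), not a description of our exact stack.

```python
# A minimal sketch of point 2: load a pre-trained model and use it as-is,
# with no fine-tuning step, on local hardware. The library and model name
# (sentence-transformers, all-MiniLM-L6-v2) are illustrative assumptions.
from sentence_transformers import SentenceTransformer

# A small, already-trained embedding model kept on CPU: inference only,
# no additional training or external GPU time is consumed.
model = SentenceTransformer("all-MiniLM-L6-v2", device="cpu")

embeddings = model.encode(["electricity consumption", "diesel for company cars"])
print(embeddings.shape)  # (2, 384)
```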


Emission factor suggestions


So far, we’ve used AI for Emission factor suggestions, a feature that automatically suggests an emission factor based on the name of the activity data. We’ve also been using the Anomalies detection toolkit to detect anomalies in carbon reports automatically and save time during the review process. This feature is currently for internal use only, but will soon be available to all users.

In the case of the Emission factor suggestions, for example, we’ve opted for a model of limited size, but efficient enough to turn the suggestions into a well-performing feature, in order to keep our impact under control. This model also isn’t fine-tuned, as fine-tuning would bring minimal performance gains at a large cost in energy consumption and carbon emissions.
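To give a concrete idea of how a name-based suggestion can work, here is a simplified sketch using semantic similarity between the activity-data name and a catalogue of emission factor names. The model, the factor list and the function are hypothetical illustrations, not the actual code behind the feature.

```python
# A hedged sketch of an emission factor suggestion: embed the activity-data
# name and every emission factor name with a small pre-trained model, then
# return the closest match. Everything below is illustrative, not Tapio's code.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2", device="cpu")

# Hypothetical catalogue of emission factor names.
emission_factors = [
    "Electricity - grid mix (Belgium)",
    "Natural gas - combustion",
    "Diesel - passenger car",
    "Business travel - short-haul flight",
]
factor_embeddings = model.encode(emission_factors, convert_to_tensor=True)

def suggest_emission_factor(activity_name: str) -> str:
    """Return the emission factor whose name is semantically closest."""
    query = model.encode(activity_name, convert_to_tensor=True)
    scores = util.cos_sim(query, factor_embeddings)[0]
    return emission_factors[int(scores.argmax())]

print(suggest_emission_factor("gas heating for the office"))
# -> "Natural gas - combustion" in this toy example
```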