Executives face more pressure than ever to reduce their environmental impact. This is especially true for data centers because of their contribution to global warming. If all the data centers in the world were one country, it would rank as the fifth-largest energy consumer in the world. In 2020, data centers consumed about 1% of global electricity demand and accounted for 0.3% of all CO2 emissions.
Today, companies need to provide transparency about their carbon footprint, and there is a rush among data centers to improve their efficiency rankings. Data centers around the world are ranked by PUE (Power Usage Effectiveness), and Greenpeace has created a Cleantech industry ranking of data centers based on their carbon footprint.
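PUE is the ratio of the total energy a facility draws to the energy actually delivered to its IT equipment, so a value of 1.0 is the theoretical ideal and lower is better. A minimal sketch of the calculation (the example figures are illustrative, not measurements from any real facility):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by
    the energy delivered to IT equipment. 1.0 is the ideal."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 1,500 kWh to power 1,000 kWh of IT load:
print(round(pue(1500, 1000), 2))  # prints 1.5
```

Everything above 1.0 is overhead (cooling, power conversion, lighting), which is why cooling optimizations move the ranking.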
The requirement for greener code
The first wave of data center sustainability focused on using renewable energy for cooling or optimizing the cooling system to reduce power consumption. However, beyond the energy required to maintain environmental controls, the software itself has a significant impact on the amount of electricity consumed. How much? Quite a lot.
Based on current research, training a giant machine learning (ML) model can use the same amount of energy as a passenger vehicle traveling 242,231 miles. Researchers at the University of Massachusetts Amherst estimate that training a large deep-learning model produces 626,000 pounds of CO2, equivalent to the lifetime emissions of five cars.
As a result, interest in and dedication to creating more efficient code has increased. The Green Software Foundation (GSF), with members such as VMware, Microsoft, Accenture, and GitHub, has a mission to design, architect, and code software that consumes less energy.
Tips for sustainable machine learning
There are many educational articles on how to write green algorithms for AI/ML models, but here are some basic tips.
One way to reduce computing resources is to reduce the number of training experiments. Hundreds of pre-trained ML models, or blueprints, already exist; developers just need to bring their own data to add AI capabilities to an application, significantly reducing the time required to develop and train a model.
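The idea behind reusing a pre-trained model is that the expensive part (the backbone) stays frozen, and only a small task-specific head is trained on your own data. The sketch below illustrates the pattern with a hypothetical stand-in for a pre-trained feature extractor and a tiny perceptron head; in practice the backbone would be a downloaded model, not a two-line function.

```python
# Stand-in for a pre-trained "blueprint": a frozen feature extractor.
# (Hypothetical; in practice this is a downloaded model backbone.)
def pretrained_features(x):
    # Frozen transformation: never retrained, so no energy is spent on it.
    return [x[0] + x[1], x[0] - x[1]]

def train_head(data, epochs=20, lr=0.1):
    """Train only a small linear head on top of the frozen features."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in data:                 # y is +1 or -1
            f = pretrained_features(x)
            pred = 1 if w[0] * f[0] + w[1] * f[1] + b > 0 else -1
            if pred != y:                 # perceptron update on the head only
                w = [w[i] + lr * y * f[i] for i in range(2)]
                b += lr * y
    return w, b

# Toy labeled data: the label is the sign of x0 + x1.
data = [((1, 2), 1), ((2, 1), 1), ((-1, -2), -1), ((-2, -1), -1)]
w, b = train_head(data)
correct = sum(
    (1 if w[0] * pretrained_features(x)[0] + w[1] * pretrained_features(x)[1] + b > 0 else -1) == y
    for x, y in data
)
print(correct, "of", len(data))  # prints "4 of 4"
```

Only the two head weights and the bias are ever updated, which is why this style of transfer learning needs orders of magnitude fewer training steps than training a model from scratch.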
It is also important to have visibility into the carbon footprint of an algorithm in order to decide on the best way to optimize its efficiency. Researchers at several universities have created tools for that purpose. For example, Green Algorithms calculates the carbon footprint of your cloud computing. Another example is CodeCarbon, a software package that integrates into a Python codebase and estimates the amount of CO2 produced by the computing resources used to execute the code.
Automation can also be used to reduce training time. It is possible to reduce the number of experiments and/or the amount of data analyzed while maintaining accuracy. More efficient data sampling alone can speed up model runtime by a factor of 5.8.
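One simple way to automate this trade-off is to grow the training sample only until held-out accuracy plateaus, rather than always training on the full dataset. The sketch below applies that stopping rule to a toy one-feature classifier (all names and the 1% plateau threshold are illustrative choices, not a prescribed recipe):

```python
import random

random.seed(0)

# Toy dataset: the label is 1 when the feature is positive.
data = [(x, 1 if x > 0 else 0) for x in
        (random.uniform(-1, 1) for _ in range(10_000))]
test_set, pool = data[:1_000], data[1_000:]

def train_and_score(train, test):
    """Fit a one-feature threshold classifier and score it on a holdout."""
    pos = [x for x, y in train if y]
    neg = [x for x, y in train if not y]
    threshold = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2
    return sum((x > threshold) == bool(y) for x, y in test) / len(test)

# Double the sample only while accuracy is still improving, instead of
# always burning compute on all 9,000 available training rows.
prev_acc, size, acc = 0.0, 250, 0.0
while size <= len(pool):
    acc = train_and_score(random.sample(pool, size), test_set)
    if abs(acc - prev_acc) < 0.01:   # plateau: stop spending compute
        break
    prev_acc, size = acc, size * 2
print(f"stopped at a sample of {size} rows, accuracy {acc:.2f}")
```

The same loop structure works with any model and metric; the energy saving comes from the early exit, not from the particular classifier used here.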
The software used to perform the calculations can itself help reduce the computing resources required. There are databases designed specifically to process large amounts of data that can optimize the use of memory and storage to reduce energy consumption. The advantage of these databases is that there is no need to limit the amount of data analyzed, which reduces the risk of compromising the model's accuracy in an attempt to speed up its runtime.
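The underlying principle, streaming data through a computation instead of materializing it all in memory, can be seen even at the level of ordinary application code. A minimal illustration in plain Python (not a depiction of how any particular database works internally):

```python
import sys

# Eager approach: materialize a million values, then aggregate them.
values_list = [i * 0.5 for i in range(1_000_000)]
total_eager = sum(values_list)

# Streaming approach: a generator feeds sum() one value at a time,
# so memory use stays flat no matter how large the range is.
total_lazy = sum(i * 0.5 for i in range(1_000_000))

assert total_eager == total_lazy  # identical result, far less memory
print(sys.getsizeof(values_list), "bytes just for the list container")
```

Holding less data in memory means fewer allocations and less pressure on the memory subsystem for the same answer, which is the same lever that purpose-built analytics databases pull at much larger scale.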
In addition to increasing energy efficiency, reducing model runtime also shortens the time to insight for business-critical applications such as fraud detection, cybersecurity solutions, and quality control. More efficient code is not only good for the environment; it is also good for business.
More and more potential customers want transparency into a company's commitment to its green strategies, and adopting a "green code" standard can be an important first step. Employees want to work for an environmentally conscious company that makes responsible decisions about the environment. In the future, cloud vendors may require visibility into a workload's carbon footprint, along with penalties for processing that is deemed excessive or unnecessary.
Given the huge number of calculations needed to make sense of data and drive better business decisions, being socially responsible is not just a nice thing to do; it has become a necessity.
Ohad Shalev is a strategic analyst at SQream.