The computations required for deep-learning research have been doubling every few months, resulting in an estimated 300,000x increase from 2012 to 2018. According to one estimate, AI could account for as much as one-tenth of the world's electricity use by 2025 [1].
AI papers tend to target accuracy rather than efficiency. The following figure shows the proportion of papers that target accuracy, efficiency, both, or other criteria, from a sample of 60 papers from top AI conferences [2].

The Allen Institute for Artificial Intelligence (AI2) has proposed a new way to incentivize energy-efficient machine learning and mitigate this trend: they recommend that AI researchers always publish the financial and computational costs of training their models along with their performance results [2]. In their work, they propose the following Red AI equation:
Cost(R) ∝ E·D·H
The cost of an AI (R)esult grows linearly with the cost of processing a single (E)xample, the size of the training (D)ataset, and the number of (H)yperparameter experiments.
Even though this equation ignores other factors, it highlights three quantities that each contribute to the total cost of generating a result.
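As a rough illustration of the equation's linear scaling, the sketch below computes a relative cost from the three factors. The function name and the baseline values are assumptions for this example only, not from the paper:

```python
def red_ai_cost(cost_per_example, dataset_size, num_experiments):
    """Relative cost of an AI result: Cost(R) is proportional to E * D * H."""
    return cost_per_example * dataset_size * num_experiments

# Assumed baseline: E = 1 cost unit, D = 1e6 examples, H = 10 hyperparameter runs
baseline = red_ai_cost(1, 1_000_000, 10)

# Doubling any single factor doubles the total cost, since the relation is linear
doubled_data = red_ai_cost(1, 2_000_000, 10)
print(doubled_data / baseline)  # 2.0
```

Because each factor enters multiplicatively, savings compound: halving both the per-example cost and the number of hyperparameter runs cuts the total cost by a factor of four.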
The authors hope that increasing transparency into what it takes to achieve performance gains will motivate more investment in the development of efficient machine-learning algorithms [3].
The vision of Green AI raises many exciting research directions that help to overcome the inclusiveness challenges of Red AI. Progress will reduce the computational expense with a minimal reduction in performance, or even improve performance as more efficient methods are discovered.
Please find more details in the references.
References: