Tuesday, March 19, 2024

In 1965, Gordon Moore observed that the number of transistors in a dense integrated circuit doubled roughly every two years. This observation, known as Moore's Law, held for decades, but it has recently begun to slow, even though exponential improvements are still being made. At the same time, the revolution in big data, hyperconnectivity and artificial intelligence is placing ever greater demands on integrated circuits, which are no longer advancing as rapidly. The consequence is that whereas the environmental impact of computing used to be offset by its rising efficiency, that efficiency has started to wane and the environmental impact has become more pronounced. As this happens, however, a new source of efficiency has emerged: better algorithms.

The rising demand for computing can be seen in the explosion in the number of data centers around the world. Data centers, which can occupy millions of square feet, consume enormous amounts of electricity. According to the International Energy Agency, they account for roughly 1% of global electricity consumption and about 0.3% of global carbon dioxide emissions. Without greater efficiencies, computing will create even greater environmental problems.

An important question, then, is whether the efficiencies obtained from better algorithms measure up to those we have enjoyed for decades from Moore's Law. Moore's Law lets computers perform ever more operations per second, while better algorithms let computers solve the same problems with fewer operations. According to a study by Yash Sherry and Neil C. Thompson, algorithmic improvements do measure up, often matching and sometimes exceeding the efficiency gains from Moore's Law.
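To make that distinction concrete, here is a minimal sketch in Python (my own illustration, not taken from the study): the same task, finding a value in a sorted list, done once with a simple scan and once with binary search. Faster chips speed up both versions by the same factor; the better algorithm reduces how many steps are needed in the first place.

    # Illustrative sketch (not from the study): hardware speeds up both
    # functions equally; the algorithm changes how many steps are needed.
    import bisect

    def linear_search(sorted_items, target):
        # Simple scan: up to len(sorted_items) comparisons.
        for i, item in enumerate(sorted_items):
            if item == target:
                return i
        return -1

    def binary_search(sorted_items, target):
        # Binary search: about log2(n) comparisons, roughly 20 for a million items.
        i = bisect.bisect_left(sorted_items, target)
        if i < len(sorted_items) and sorted_items[i] == target:
            return i
        return -1

    items = list(range(1_000_000))
    assert linear_search(items, 765_432) == binary_search(items, 765_432) == 765_432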

Google Maps offers an example. Finding the best possible route between 1,000 frequently visited places with an older algorithm can require roughly a million times more computing power than with a newer one. Text matching provides another example: when a search engine looks for keywords in a web page, or when a lawyer searches legal documents for specific references, they are performing text matching, and better algorithms can make the process about 100 times faster than it used to be. In other words, even in a world of declining gains from integrated circuit development, algorithms can make computing far more efficient, reducing its consumption of energy, floor space and other resources.
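As a rough illustration of how an algorithmic change speeds up text matching, the sketch below (my own hand-rolled example; the speedups quoted above come from far more sophisticated methods on much larger inputs) compares a naive substring search, which re-checks the pattern at every position, with the classic Knuth-Morris-Pratt algorithm, which scans the text only once.

    # Hand-rolled illustration, not the actual methods behind the figures above.
    def naive_find(text, pattern):
        # Compare the pattern against every starting position: up to
        # len(text) * len(pattern) character comparisons.
        n, m = len(text), len(pattern)
        for i in range(n - m + 1):
            if text[i:i + m] == pattern:
                return i
        return -1

    def kmp_find(text, pattern):
        # Knuth-Morris-Pratt: precompute, for each prefix of the pattern, the
        # length of its longest proper prefix that is also a suffix, then scan
        # the text once without ever backing up. Roughly
        # len(text) + len(pattern) steps in total.
        if not pattern:
            return 0
        fail = [0] * len(pattern)
        k = 0
        for i in range(1, len(pattern)):
            while k and pattern[i] != pattern[k]:
                k = fail[k - 1]
            if pattern[i] == pattern[k]:
                k += 1
            fail[i] = k
        k = 0
        for i, ch in enumerate(text):
            while k and ch != pattern[k]:
                k = fail[k - 1]
            if ch == pattern[k]:
                k += 1
            if k == len(pattern):
                return i - len(pattern) + 1
        return -1

    document = "a" * 100_000 + "needle"
    assert naive_find(document, "needle") == kmp_find(document, "needle") == 100_000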

In their study, Sherry and Thompson pored over 57 textbooks and more than a thousand academic papers to identify the algorithms researchers considered most important. They mapped 113 “algorithm families”, each of which solves a specific problem in different ways. They then built a time series reaching back to the 1940s, recording each new algorithm that emerged to solve a given problem and placing it in the appropriate family.

They found that for big data problems, 43% of the algorithm families improved each year at a rate equal to or greater than the gains from Moore's Law, and 14% of the families delivered gains far in excess of it. This result was welcomed by many computer scientists, because it offers hope that the environmental impact of computing can be tackled with better algorithms, even as the gains from Moore's Law diminish.
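For a sense of what "equal to or greater than Moore's Law" means in practice, a quick back-of-the-envelope calculation (my own arithmetic, not the paper's): doubling every two years works out to a compound improvement of roughly 41% per year, or about a 32-fold gain over a decade.

    # Back-of-the-envelope arithmetic (mine, not the paper's).
    moore_annual_rate = 2 ** (1 / 2) - 1          # doubling every two years
    print(f"Annual improvement: {moore_annual_rate:.0%}")               # ~41%
    print(f"Gain over a decade: {(1 + moore_annual_rate) ** 10:.0f}x")  # ~32x

By that benchmark, nearly half of the algorithm families in the study kept pace with the hardware, and a notable minority pulled well ahead of it.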
