How fast are algorithms improving?

Algorithms are a lot like a parent to a computer. They tell the computer how to make sense of information so that it, in turn, can do something useful with it.

The more efficient the algorithm, the less work the computer has to do. For all the technological advances in computer hardware, and the much-debated lifespan of Moore’s Law, computer performance is only one side of the picture.

Behind the scenes, a second trend is at work: algorithms are being improved, so less computing power is needed. While algorithmic efficiency may get less of the spotlight, you would certainly notice if your trusted search engine suddenly became one-tenth as fast, or if moving through large datasets felt like wading through mud.

This led scientists at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) to ask: How fast are algorithms improving?

The existing data on the question was largely anecdotal, consisting of case studies of particular algorithms that were assumed to be representative of the broader picture. Faced with this dearth of evidence, the team analyzed data from 57 textbooks and more than 1,110 research papers to trace the history of algorithmic improvement. Some of the papers directly reported how good the new algorithms were; for others, performance had to be reconstructed by the authors using “pseudocode,” shorthand versions of an algorithm that describe its basic details.

In total, the team looked at 113 “algorithm families,” collections of algorithms that solve the same problem and that computer science textbooks had highlighted as the most important. For each of the 113, the team reconstructed its history, tracking each time a new algorithm was proposed for the problem and noting in particular which ones were more efficient. Ranging in performance and separated by decades, from the 1940s to now, the team found an average of eight algorithms per family, of which a couple improved the family’s efficiency. To share this assembled database of knowledge, the team also created Algorithm-Wiki.org.

The scientists charted how quickly these families improved, focusing on the most analyzed feature of algorithms: how quickly they could guarantee to solve the problem (in computer parlance, “worst-case time complexity”). What emerged was enormous variability, but also important insight into how transformative algorithmic improvement has been for computing.
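
To make “worst-case time complexity” concrete, here is a minimal sketch (an illustration of the concept, not an example from the study): two algorithms that solve the same membership problem on a sorted list, one with a linear worst case and one with a logarithmic worst case.

```python
# Illustrative only: two algorithms for the same problem (membership in a
# sorted list) with different worst-case time complexities.

def linear_search(sorted_items, target):
    """Worst case O(n): may have to inspect every element."""
    for item in sorted_items:
        if item == target:
            return True
    return False

def binary_search(sorted_items, target):
    """Worst case O(log n): halves the search range at each step."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return True
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return False

if __name__ == "__main__":
    data = list(range(1_000_000))
    # Both calls return False, but the worst-case guarantees differ:
    # linear_search may need ~1,000,000 comparisons, binary_search ~20.
    print(linear_search(data, -1), binary_search(data, -1))
```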

For large computational problems, 43% of algorithm families had year-over-year improvements that were equal to or larger than the much-vaunted gains from Moore’s Law. In 14% of the problems, the performance improvements from algorithms outpaced those that came from improved hardware. The gains from better algorithms were particularly large for big-data problems, so the importance of these advances has grown in recent decades.
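
A bit of illustrative arithmetic (hypothetical numbers, not figures from the study) shows why algorithmic gains matter more as datasets grow: a hardware upgrade delivers a fixed speedup, while replacing an O(n²) algorithm with an O(n log n) one delivers a speedup that keeps growing with the input size.

```python
# Illustrative arithmetic only: compare a fixed hardware speedup with the
# speedup from replacing an O(n^2) algorithm by an O(n log n) one.
import math

HARDWARE_SPEEDUP = 2.0  # hypothetical one-time doubling of hardware speed

for n in (1_000, 1_000_000, 1_000_000_000):
    quadratic_ops = n ** 2
    n_log_n_ops = n * math.log2(n)
    algorithmic_speedup = quadratic_ops / n_log_n_ops
    print(f"n = {n:>13,}: algorithmic speedup ~{algorithmic_speedup:,.0f}x "
          f"vs hardware speedup {HARDWARE_SPEEDUP:.0f}x")
```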

The biggest change the authors observed came when an algorithm family went from exponential to polynomial complexity. The amount of effort it takes to solve an exponential problem is like a person trying to guess a combination on a lock. If you have only a single dial with 10 digits, the task is easy. With four dials, like a bicycle lock, it is hard enough that no one steals your bike, but still conceivable that you could try every combination. With 50 dials, it is almost impossible: it would take too many steps. Problems with exponential complexity are like that for computers: as they get bigger, they quickly outgrow the computer’s ability to handle them. Finding a polynomial algorithm often fixes that, making it possible to tackle problems in a way that no amount of hardware improvement can.
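
The lock analogy can be sketched in a few lines (an illustration only; the cubic “polynomial algorithm” here is hypothetical): counting the combinations a brute-force search might try versus the steps a polynomial-time method would take as the input grows.

```python
# Rough sketch of the lock analogy: the number of combinations grows
# exponentially with the number of dials, while a (hypothetical) cubic
# polynomial-time algorithm stays manageable as the input grows.

def brute_force_steps(dials, digits=10):
    """Exponential in the number of dials: every combination may be tried."""
    return digits ** dials

def polynomial_steps(dials):
    """A hypothetical polynomial-time algorithm, here cubic in the input size."""
    return dials ** 3

for dials in (1, 4, 50):
    print(f"{dials:>2} dials: brute force {brute_force_steps(dials):.3e} steps, "
          f"polynomial {polynomial_steps(dials):,} steps")
```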

As rumblings of Moore’s Law coming to an end increasingly permeate global conversations, the researchers say computing users will need to turn to areas like algorithms for performance improvements. The team says the findings confirm that, historically, the gains from algorithms have been enormous, so the potential is there. But if the gains come from algorithms instead of hardware, they will look different: hardware improvement under Moore’s Law happens smoothly over time, while for algorithms the gains come in steps that are usually large but infrequent.

“This is the first paper to show how quickly algorithms improve across a broad range of examples,” says Neil Thompson, an MIT researcher at CSAIL and the Sloan School of Management and lead author on the new paper. “Through our analysis, we were able to say how many more tasks could be done using the same amount of computing power after an algorithm improved. As problems increase to billions or trillions of data points, algorithmic improvement becomes substantially more important than hardware improvement. In an era of growing concern about computing’s environmental footprint, this is a way to improve businesses and other organizations without the downside.”

Thompson wrote the paper alongside MIT visiting student Yash Sherry. The paper is published in the Proceedings of the IEEE. The work was funded by the Tides Foundation and the MIT Initiative on the Digital Economy.

