Although human civilisation had been pondering the many nuances of mathematics since before the time of the ancient Greeks, it was the 9th-century Persian polymath Muhammad ibn Musa al-Khwarizmi who developed the concept of algebra, allowing us to solve complicated equations with unknown variables. It was his work that formed the foundation of what we now call an algorithm, even lending the Latinised version of his name, *Algorithmi*, to the term. But over centuries of use, accelerated by the 21st century’s reliance on technology allowing algorithms to run much of our lives, the term’s definition has changed.

Ask a computer scientist what an algorithm is and they will tell you it is a set of instructions that takes an input, performs some repeatable computation on it and provides an output. Think of it like a super-precise recipe, usually written in the cold logic of a programming language.

A simple example is the bubble sort, which arranges a list of numbers in ascending order. It begins by comparing the first two numbers. If the first is greater than the second, it swaps them. Otherwise, it moves on to the next pair. It cycles through the list again and again until it completes a pass without making any swaps, at which point it outputs an ordered list. If you are shopping online and filter products by price, a sorting algorithm like this is kicking into gear behind the scenes.
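The steps above can be sketched in a few lines of Python. This is a minimal illustration of the technique, not code taken from any real shopping site; the function name `bubble_sort` and the sample prices are invented for the example:

```python
def bubble_sort(numbers):
    """Sort a list of numbers in ascending order using bubble sort."""
    items = list(numbers)  # work on a copy; leave the input untouched
    n = len(items)
    while True:
        swapped = False
        for i in range(n - 1):
            # Compare each adjacent pair and swap if they are out of order.
            if items[i] > items[i + 1]:
                items[i], items[i + 1] = items[i + 1], items[i]
                swapped = True
        # A full pass with no swaps means the list is sorted.
        if not swapped:
            return items

prices = [19.99, 4.50, 12.00, 7.25]
print(bubble_sort(prices))  # [4.5, 7.25, 12.0, 19.99]
```

The early-exit check is what lets the algorithm stop as soon as a pass produces no swaps, exactly as described above.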

These days, popular use of the word algorithm is morphing: it is increasingly used to describe almost anything that a computer accomplishes. That includes the realms of artificial intelligence (AI) and machine learning, where the steps in the recipe aren’t always quite so clearly laid out.

Take neural networks, a type of AI system loosely modelled on the human brain, which can be trained to perform a task by studying examples of correct and incorrect results. Such “algorithms” can be incredibly powerful, but it is usually hard to look inside and determine how they really work.

There are those who find the loosening of the term algorithm to include AI unhelpful. “Now people use ‘algorithm’ to mean almost anything,” says Martin Dyer at the University of Leeds, UK. “I’ve become so annoyed at people misusing it.”

Dyer warns that, in future, we may increasingly lean on machine learning as an “easy way out” – a route to solve problems without fully understanding them ourselves.

He says we ought to apply the right kind of algorithm in the right context. There are times when a rigid set of predictable steps is desirable and times when highly capable but ambiguous AI can be beneficial. “It’s fine if it gets wrong whether you like this book or not, but it’s not fine if it crashes your car,” says Dyer.