# Algorithms

Informally, an algorithm is an unambiguous list of information processing instructions. Indeed, in principle, an algorithm should be so explicit that even a machine could execute it. More formally, Turing-1936 defined algorithms as data to be given to a Turing machine UpAndAtom20.
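To illustrate what "unambiguous" means here, consider Euclid's algorithm for the greatest common divisor. It can be written as a fully explicit list of instructions that a machine executes mechanically; below is a minimal sketch (the choice of Python is ours, purely for illustration):

```python
def gcd(a, b):
    """Euclid's algorithm: an unambiguous list of instructions.

    Repeat until b is zero: replace the pair (a, b) by (b, a mod b).
    Every step is fully determined; nothing is left to interpretation.
    """
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # -> 6
```

Each line leaves the machine no freedom of interpretation, which is precisely the property that distinguishes an algorithm from an informal recipe.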

Interestingly, Turing proved that algorithms could equivalently be defined as Turing machines themselves, or as executable instructions written in a formal language, such as Church's lambda calculus Church-1936.

## Universality

The fundamental theorem of computer science is arguably Turing-1936's proof of the existence of *universal Turing machines*.

Essentially, the key property of such universal machines **U** is that, given any other Turing machine **M**, there exists a code, which we can write **code(M)**, such that the execution of **M** on input **I** is equivalent to the execution of **U** on input **(code(M), I)**. In other words, the single universal machine **U** can simulate any other machine **M**, provided it is given a description **code(M)** of the machine **M**.
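This property can be sketched in code. Below is a toy "universal machine" **U**: a single fixed program that simulates any one-tape Turing machine **M**, provided the machine's transition table **code(M)** as data. The encoding conventions (state names, the blank symbol `_`) are our own illustrative choices:

```python
def universal(code_M, tape):
    """A toy universal machine U: one fixed program that simulates any
    one-tape Turing machine M, given its description code_M.

    code_M maps (state, read_symbol) -> (new_state, written_symbol, move),
    where move is 'L' or 'R'. The machine starts in state 'start' and
    stops in state 'halt'. These conventions are illustrative assumptions.
    """
    cells = dict(enumerate(tape))      # sparse tape; '_' is the blank symbol
    state, head = 'start', 0
    while state != 'halt':
        symbol = cells.get(head, '_')
        state, cells[head], move = code_M[(state, symbol)]
        head += 1 if move == 'R' else -1
    return ''.join(cells[i] for i in sorted(cells)).strip('_')

# code(M) for a toy machine M that flips every bit, then halts at the blank.
flip = {
    ('start', '0'): ('start', '1', 'R'),
    ('start', '1'): ('start', '0', 'R'),
    ('start', '_'): ('halt', '_', 'R'),
}
print(universal(flip, '0110'))  # -> 1001
```

The point is that `universal` never changes: to simulate a different machine **M**, we only change the data `code_M` we feed it, exactly as in Turing's construction.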

## The halting problem

The other key result of Church-1936 and Turing-1936 is the proof that some information processing tasks cannot be solved by algorithms. One such example is the infamous *halting problem*. This is the problem of determining whether a given machine **M** terminates on a given input **I**. More precisely, the undecidability theorem shows that no Turing machine **N** can take **code(M)** and **I** as input, always terminate, and output a bit equal to 1 if and only if the computation of **M(I)** terminates UpAndAtom-18.
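The core of the proof is a diagonal argument, which can be sketched in code. Suppose, for contradiction, that a decider `halts` existed; then the following self-referential program is contradictory. Note that `halts` and `paradox` are hypothetical constructions for the sake of the argument, not implementable functions:

```python
def halts(code_M, I):
    """Hypothetical halting decider: would return True iff running the
    machine described by code_M on input I terminates.

    The undecidability theorem shows no such function can exist, so this
    body is deliberately left unimplemented.
    """
    raise NotImplementedError("no algorithm can decide halting")

def paradox(code_M):
    """Diagonal program: halts exactly when code_M does NOT halt on itself."""
    if halts(code_M, code_M):
        while True:   # loop forever
            pass
    # otherwise: halt immediately

# Feeding paradox a description of itself yields the contradiction:
# if paradox(code(paradox)) halts, then halts(...) returned True, so it loops;
# if it loops forever, then halts(...) returned False, so it halts.
```

Since both outcomes contradict the assumed correctness of `halts`, no such decider can exist.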

## Church-Turing thesis

In 1936, Alonzo Church and Alan Turing both postulated what is now known as the *Church-Turing thesis*. This thesis claims that *all algorithms* are Turing machines. In other words, Turing's definition of algorithms is *complete*.

This postulate can be interpreted in at least two ways. First, we may regard it as a prediction about the future of mathematics and computer science. In this sense, Church and Turing are postulating that no other researcher will ever find a fundamentally different and consensually satisfactory definition of algorithms. To this day, no one has.

Another interpretation is called the *physical Church-Turing thesis*. It argues that our universe does not allow for information processing beyond the capabilities of Turing machines. This thesis is refutable. It would be refuted by any credible design of a machine capable of super-Turing information processing, such as solving the *halting problem*. So far, the thesis has not been convincingly rejected, despite nearly a century of massive investments in better computing machines Aaronson@FQXi-14.

Interestingly, the physical Church-Turing thesis can be justified by a combination of widely accepted principles. In particular, Gandy-80's theorem derives it from (1) the homogeneity of space and time, (2) a maximum speed in space-time and (3) a bound on the amount of information that can be stored in a region of space. The first hypothesis has been standard since Galileo's relativity principle. The second hypothesis follows from Einstein's special relativity. The third hypothesis is now known as Bekenstein's limit, and is justified by the physics of black holes. Generalizations to quantum physics have been proposed Dowek-12.

The Church-Turing thesis has important implications on what can be achieved within our universe. For instance, Lloyd02 estimates the computational capacity of the universe. Hoang-20 argues that the thesis makes the search for "fundamental laws" meaningless, as, by virtue of universality, any Turing-complete set of laws could serve as "fundamental laws".