# Lecture 23

## Non-computable Problems

Gödelisation

This is the technique discovered by Kurt Gödel for assigning numbers to the objects in a collection. The numbers are called Gödel numbers and can be used to make deductions about the objects. Gödel used his numbers for formulas and proofs, but they can also be used to make deductions about programs.

Every Turing machine has a unique Gödel number associated with it, a sort of serial number. We can obtain the Gödel number by encoding the state table in binary. Since the state table is all that distinguishes one TM from another, this number is unique.

## The Halting Problem

Can we build a Turing machine to decide whether another Turing machine will halt or not? This would be a very useful algorithm, because we could use it to help decide whether our programs are correct.

The answer is NO.

To prove this, assume that we have such an algorithm; call it halttester. It has two inputs: P, a program, and D, the program's data. halttester(P, D) returns 1 if P would halt when executed with input data D, and 0 otherwise.

```c
int halttester(char *P, char *D)
{
    /* does P halt? */
    /* return 1 if it does, 0 otherwise */
    ...
}
```
Now let's write another function that calls halttester.
```c
int newhalttester(char *P)
{
    /* check to see if program P halts */
    /* if executed with data P */
    return halttester(P, P);
}
```
If we assume that halttester exists, then we can write the following program.
```c
void funny(char *P)
{
    /* loop forever if P halts when run on itself; halt otherwise */
    while (newhalttester(P))
        ;
}
```
If we call funny with a program, then it halts if and only if that program does not halt when run on its own text.

If we call funny with its own program text, then it halts if it does not halt, and runs forever if it does halt. This is a contradiction, so the original assumption that we can write a halttester function must be false.

In practice, it is easy to see that some programs terminate: if a program uses only sequence and selection, it must terminate. However, there are many programs whose termination is far from obvious.

Consider the following program, which tries to disprove Fermat's Last Theorem. Assume that a function pow(x, y) exists that returns x raised to the power y.

```c
void flt(int n)
{
    int a, b, c;

    for (a = 1; ; a++)
        for (b = 1; b <= a; b++)
            for (c = 2; c <= a + b; c++)
                if (pow(a, n) + pow(b, n) == pow(c, n))
                    return;   /* counterexample found */
}
```
The mathematician Andrew Wiles has proved that this program never terminates for any n > 2, but the proof is so complex that very few people understand it.

## Other non-computable problems

Can we write a program that takes two programs and compares them to see whether they perform the same job? NO.

Given two BNF descriptions, can we tell whether they describe the same language? NO.

Luckily, some problems are computable.

Given a BNF description and a sequence of characters, can we check whether the sequence is grammatically correct? YES.

## Partial Computability

Some problems are more computable than others. For some inputs to the halting problem we can produce an answer: a program that is a plain sequence of statements clearly halts, and a program containing a `while(1);` statement clearly does not. More generally, the halting problem is partially computable: an algorithm that simulates its input program can always answer "yes" when the program halts, but may wait forever when it does not.

Gödel showed that there is no algorithm to decide whether an arbitrary statement about the integers is true or not: "arithmetic is not computable". In fact, arithmetic is not even partially computable. In any proof system for arithmetic, there are statements about the integers that are true but cannot be proved!

# Complexity

The study of computability looks at what can be computed; complexity looks at how efficiently things can be computed.

Complexity does not relate to the difficulty of writing a program or of understanding a program, but the resources required by a program.

Of all possible problems, only a small subset are computable, and only a small subset of those are feasibly computable. Any problem that would take 1000 years to solve is not feasibly computable. Luckily, the set of feasibly computable problems is large enough to keep us in work for the time being.

The computer resources used by a problem are time, memory and hardware. Time is how long the problem takes to solve; since computers have different internal clock speeds, we measure it in abstract units called steps. Memory is the amount of storage an algorithm uses. Hardware is the physical resources used, e.g. the number of processors.

Consider an algorithm to multiply two numbers by hand using long multiplication.

```
        1984
     *  6713
    --------
        5952
       1984
     13888
    11904
    --------
    13318592
```
If the algorithm is presented with two n-digit numbers, it will need to add together n rows, each containing n (or n+1) digits. Each row can be computed in n steps and there are n rows, so the total time taken is n * n = n^2 steps. The execution time of the algorithm is proportional to n squared.

The same problem can be solved by another algorithm.

```
       a  b    c  d
      19 84 * 67 13

    ac               = 19 * 67                  = 1273
    (a+b)(c+d)-ac-bd = (103 * 80) - 1273 - 1092 =   5875
    bd               = 84 * 13                  =     1092
                                                  --------
                                                  13318592
```
This algorithm takes a time proportional to n^1.59. The fastest known algorithm for multiplying two numbers on a sequential machine has an execution time proportional to n log n log log n. On a parallel computer, numbers can be multiplied in a time proportional to log n.

Thus different algorithms to solve the same problem may have different complexities. Often, different algorithms use resources in different ways; in that case, the programmer must choose a trade-off and try to balance the resources used.