Computability

Modern computers are based on a model of computation called the Turing machine, the first general-purpose model of a computer. What can a Turing-machine-based computer compute? What can't it compute? What kinds of problems are impractical to compute for anything other than really small data sets? We'll briefly explore these questions below.

Straight up Non-Computable

Let's first talk about problems that computers simply cannot compute at all, no matter how much time or memory you give them.

The Halting Problem

If you are a software developer, you have undoubtedly encountered infinite loops. Unlike a language such as PHP, where a web request will typically time out, a program written in a lower-level language such as C++ will simply keep running forever if it contains an infinite loop. With this in mind, it may seem like a good idea for someone to create a debugging program that detects these kinds of run-time errors in apps. However, you may be surprised to learn that no such program exists because it is impossible for a computer to determine, for an arbitrary program and input, whether that program will ever finish running.

To informally prove this, let's assume we have a function called "does_it_halt" which we can pass an arbitrary program and input to and have it tell us if the program will halt or loop indefinitely.


Assumption:
There exists a way for a computer to detect if any arbitrary program and input will halt or not.


With that assumption in place, let's say someone wrote the following function as a simple way to utilize this knowledge:

function does_it_halt(program, input) {
    // (the magic we are assuming exists) decides whether running
    // program(input) would ever finish, without getting stuck itself
    if (program eventually halts when run on input)
        return true;
    else
        return false;
}

For simplicity's sake, let's also say someone wrote a wrapper for this function that passes the program in as its own input. Remember that our does_it_halt function is supposed to work on any arbitrary input, so passing in the memory location or text of a program works just fine.

function debug_app(program) {
    return does_it_halt(program, program);
}

Alright, hopefully you'll agree this is pretty straightforward. Now let's say someone really smart informed us that our does_it_halt function couldn't possibly work, but we didn't believe him. As a result, he decided to write a cheeky little function called paradox that does the following:

function paradox(program) {
    if(debug_app(program))   // told "this program halts"? then loop forever
        while(1) {};
    else                     // told "this program never halts"? then halt right away
        return true;
}

echo debug_app(paradox);

Ha ha ha... so this dude wrote a function that puts itself into an intentional infinite loop if does_it_halt says the program halts, and returns "true" if it says the program does not halt. Then he follows this up by calling our debug function on his own code. Surely this is the most useless bit of code ever written, right?

Well, before we can answer that, we have to examine what happens when this dude calls debug_app(paradox).

Let's say our debug_app() method returns true when passed Paradox. Let's logically think through what that means in terms of execution:

  1. First up, since "echo debug_app(Paradox)" returned true, our checker is claiming that Paradox halts when given itself as input.
  2. Remember that debug_app(Paradox) just calls does_it_halt(Paradox, Paradox), so the question it is answering is "does the call paradox(paradox) ever finish?"
  3. However, inside of Paradox is also a call to debug_app()! This call to "debug_app(program)" within Paradox resolves to "debug_app(Paradox)" at runtime: the exact same question with the exact same input.
  4. For paradox(paradox) to actually halt, that inner call to "debug_app(Paradox)" must return false, because if it returned true, Paradox would send itself into an infinite loop.

Uh oh! Turns out that little piece of code wasn't as useless as we originally thought, because it just proved that our does_it_halt function can't possibly exist as we've defined it. We know from point #1 above that "echo debug_app(Paradox)" returns true, but we also know from point #4 that for that to happen, the call to "debug_app(Paradox)" inside Paradox must return false. So two identical calls to "debug_app(Paradox)" are returning different results, which is impossible for a deterministic function. (If you instead assume the outer call returns false, you hit the mirror-image contradiction: Paradox takes the else branch and halts immediately, making the "it never halts" answer wrong.) Either way our premise leads to an impossibility, so it is proved false by what's called a proof by contradiction.

This problem ("will a given program halt on a given input?"), which is unsolvable by computers, is commonly known as the Halting Problem. Alan Turing proved back in 1936 that a general algorithm to solve it cannot exist.

Post Correspondence Problem (PCP)

Another quick example, just to pique your interest. Let's say you have a list of some arbitrary number of dominoes, each with a string of letters on its top half and another on its bottom half, which look something like this (written here as [top / bottom]):

[c / cb]   [ac / a]   [b / bc]   [a / ba]   [a / c]   [c / ac]   [cb / c]   [aa / b]

You want a computer to tell you whether there's a way to line up the dominoes (reusing any domino as many times as you like) so that the letters along the top row spell out the same string as the letters along the bottom row. Here's an example of a match where both the top and the bottom read "abbcbba":

[a / abb]   [bb / c]   [cb / bb]   [ba / a]

Can a computer program be written that determines, for an arbitrary set of dominoes, whether some ordering results in a match? The answer is "no". This is called the Post Correspondence Problem, introduced by Emil Post back in 1946.
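
To make that concrete, here's a small Python sketch (the helper name find_match and the length cap are mine, purely for illustration). Checking any particular ordering is easy, and a capped search will find a match whenever one exists within the cap, but no cap is ever guaranteed to be enough, and that is exactly the part no program can solve in general:

from itertools import product

def find_match(dominoes, max_length=6):
    # dominoes are (top, bottom) pairs; try every ordering up to max_length
    for length in range(1, max_length + 1):
        for ordering in product(range(len(dominoes)), repeat=length):
            top = "".join(dominoes[i][0] for i in ordering)
            bottom = "".join(dominoes[i][1] for i in ordering)
            if top == bottom:
                return [dominoes[i] for i in ordering]
    return None  # nothing found within the cap (which proves nothing in general)

# The dominoes from the "abbcbba" example above:
print(find_match([("a", "abb"), ("bb", "c"), ("cb", "bb"), ("ba", "a")]))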

Effectively Uncomputable (with our current algorithms)

P vs NP

In order to discuss problems that fall into this "effectively uncomputable" category, we quickly need to discuss the concept of P vs. NP. "P" is the class of problems for which there exists an algorithm to both find and verify an answer in polynomial time. "NP" (non-deterministic polynomial time) is the class of problems for which there exist algorithms to verify an answer in polynomial time, but which may or may not have algorithms to find an answer in polynomial time. Note the "may or may not" statement... P vs. NP is actually one of the biggest open questions in mathematics and computer science, because no one has been able to prove definitively that P = NP or that P != NP.

If P = NP, then there DO exist polynomial-time algorithms to solve the problems we lump into this category; we just haven't found them yet. If P != NP, then that would confirm that no such polynomial-time algorithms exist, and we could stop looking. There has been a 1-million-dollar reward on the table since the year 2000 (the Clay Mathematics Institute's Millennium Prize) for anyone who can prove it one way or the other.

So if you're wondering at this point what the heck polynomial time is, and wishing you could remember all your math lessons from high school, no fear - let's look at an example:

x^3 + 2x^2 + 3x = A Polynomial

The importance of polynomials here is that algorithms whose running time grows polynomially (or slower) with the size of the input are generally fast enough for computers to handle. Running times can be roughly grouped into a few different classes:

y = 1       // Constant time
y = x       // Linear time
y = x^2     // Polynomial time
y = 2^x     // Exponential time
y = x!      // Factorial time

For a more complete list of time complexities see: Time Complexity.
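
To get a feel for how different these groups really are, here's a quick Python snippet (my own throwaway example, not from the Time Complexity reference) that evaluates each growth rate at a few input sizes:

from math import factorial

# Evaluate each growth class above at a few input sizes.
for x in (5, 10, 20):
    print(f"x={x:>2}: constant=1  linear={x}  polynomial={x**2}  "
          f"exponential={2**x}  factorial={factorial(x)}")

At x = 20 the polynomial value is still only 400, while the exponential has already passed a million and the factorial is a 19-digit number.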

The problems we will be discussing which fall into this "practically uncomputable" category are NP problems for which we have no good algorithms to solve them. These are technically referred to as the "NP-complete" class of problems, a term coined in the early 1970s when researchers realized that these problems are all essentially variations of the same thing: each can be translated into any of the others, so if a polynomial-time solution could be found for one of them, all of them could in turn be solved quickly.
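
That "easy to verify, hard to find" split is worth seeing in code. Below is a small Python sketch (the function verify_route and the cost table are mine; the costs are the ones used in the 4-city Travelling Salesman example in the next section) showing that checking a proposed route against a cost budget takes a single pass over the route, even though finding the best route appears to require trying factorially many of them:

def verify_route(cost, route, budget):
    # A valid tour starts and ends at the same city and visits every city once.
    if route[0] != route[-1] or len(route) != len(cost) + 1:
        return False
    if set(route[:-1]) != set(cost):
        return False
    total = sum(cost[a][b] for a, b in zip(route, route[1:]))
    return total <= budget

cost = {"A": {"B": 3, "C": 8, "D": 2},
        "B": {"A": 3, "C": 5, "D": 3},
        "C": {"A": 8, "B": 5, "D": 6},
        "D": {"A": 2, "B": 3, "C": 6}}

print(verify_route(cost, ["A", "B", "C", "D", "A"], 16))  # True:  3+5+6+2 = 16
print(verify_route(cost, ["A", "C", "D", "B", "A"], 16))  # False: 8+6+3+3 = 20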


The Travelling Salesman Problem

The travelling salesman problem is a classic problem with many general-purpose applications. The scenario goes like this: you are a salesman who needs to travel to a bunch of cities and then end your trip back in your home city, where you started. You obviously want to minimize your travel time by taking the most efficient route through all of the cities.

For an example of this problem, let's use a scenario where we want to visit just 4 cities. Starting at city A, we need to calculate the optimal route. Each pair of cities has a travel-time "cost" between them that we'll use to calculate the best route.

[Figure: TSP_Graph - the 4 cities A, B, C, and D with the travel cost of each connection]

We are starting in city A, so we just need to evaluate all our options that take us through all the cities and add up the travel costs like so:

Route            Route Costs    Total Cost
A->B->C->D->A    3+5+6+2        16
A->B->D->C->A    3+3+6+8        20
A->D->C->B->A    2+6+5+3        16
A->D->B->C->A    2+3+5+8        18
A->C->B->D->A    8+5+3+2        18
A->C->D->B->A    8+6+3+3        20

One thing you might notice is that there are two routes that cost 16, two that cost 18, and two that cost 20. This is because within each pair, one route is just the other in reverse: travelling A->B->C->D->A is the same trip as A->D->C->B->A, just done backwards! For this reason we can eliminate these duplicate routes, cutting the number of routes to consider in half and narrowing our choices down to 3 real options. Obviously we would choose the one that costs 16.

Route            Route Costs    Total Cost
A->B->C->D->A    3+5+6+2        16
A->B->D->C->A    3+3+6+8        20
A->C->B->D->A    8+5+3+2        18
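
If you want to play with this yourself, here's a minimal brute-force sketch in Python (the cost values are simply read off the route tables above, since the figure isn't reproduced here) that fixes A as the start and end, tries every ordering of the remaining cities, and reproduces the totals in the first table:

from itertools import permutations

cost = {"A": {"B": 3, "C": 8, "D": 2},
        "B": {"A": 3, "C": 5, "D": 3},
        "C": {"A": 8, "B": 5, "D": 6},
        "D": {"A": 2, "B": 3, "C": 6}}

best_route, best_cost = None, float("inf")
for middle in permutations("BCD"):                 # every ordering of the other cities
    route = ["A", *middle, "A"]
    total = sum(cost[a][b] for a, b in zip(route, route[1:]))
    print("->".join(route), "costs", total)
    if total < best_cost:
        best_route, best_cost = route, total

print("Best route:", "->".join(best_route), "with cost", best_cost)  # A->B->C->D->A, 16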

Now what if we needed to travel between 5 cities instead of 4? How much extra work would this be to figure out? Let's call this 5th city 'E' and examine just a piece of this expanded problem by listing out the possible routes whose first stop is city B:

[Figure: TSP_Graph - the problem expanded to 5 cities A through E with the travel cost of each connection]

Route               Route Costs    Total Cost
A->B->C->D->E->A    3+5+6+4+7      25
A->B->C->E->D->A    3+5+2+4+2      16
A->B->D->C->E->A    3+3+6+2+7      21
A->B->D->E->C->A    3+3+4+2+8      20
A->B->E->C->D->A    3+6+2+6+2      19
A->B->E->D->C->A    3+6+4+6+8      27
A->C->...           (continued)    ...

Wow! Not only does that graph look WAYYYYY more complicated, but expanding the problem to 5 cities also gives us a lot more routes to consider.

As you can see, going from just 4 cities to 5 increased the number of routes whose first stop is city B from just two up to six! Similarly, our total number of routes increased from 6 up to 24 (if you finish the above example by mapping out all the possibilities, you'll see there are 24). Of course we can still eliminate the reverse duplicates and cut the number of possible routes in half, down to 12... but that's still a big jump from the 3 possibilities we had previously.

What's important to realize is that the travelling salesman problem has a factorial growth rate! To be exact, the formula for the number of distinct routes that need to be considered when the number of cities is 'X' is this:

(X - 1)! / 2

If you don't quite remember how factorials are calculated, here's an example:

4! = 1 * 2 * 3 * 4 = 24
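
Plugging the formula into a couple of lines of Python (again, just a quick sanity check of my own) confirms the counts we worked out by hand (3 distinct routes for 4 cities, 12 for 5) and shows how quickly it blows up from there:

from math import factorial

# Distinct routes for X cities, per the (X - 1)! / 2 formula above.
for x in (4, 5, 10, 20):
    print(x, "cities:", factorial(x - 1) // 2, "routes")   # 4 -> 3, 5 -> 12, ...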

Factorial growth is even worse than exponential growth!!!

n n! (# of permutations) ~Time to Compute
4 24 -
5 120 -
6 720 -
7 5,040 -
8 40,320 -
9 362,880 -
10 3,628,800 3 seconds
11 39,916,800 56 seconds
12 479,001,600 >11 minutes
13 6,227,020,800 ~2.5 hours
14 87,178,291,200 ~34 hours
15 1,307,674,368,000 ~21 days
16 20,922,789,888,000 ~340 days
17 355,687,428,096,000 ~15.8 years
18 6,402,373,705,728,000 ~284 years
19 121,645,100,408,832,000 ~5,411 years
20 2,432,902,008,176,640,000 ~108,230 years

Explanation of Factorial Growth

Looking at the table above, you can see that factorial growth is extremely fast! We go from small, manageable numbers like the 4! we calculated above to astronomically large numbers very quickly. This matters because, in order to brute-force a solution to the Travelling Salesman problem, we have to calculate the cost of every possible route to determine the best one. Each route is a permutation of the cities (a permutation is an arrangement of items where the order matters), and since the number of permutations grows factorially, the number of routes to check quickly becomes enormous, as can be seen in the table.

In order to give you an idea of how difficult solving the Travelling Salesman problem is for a modern computer, I decided to do some tests on my MacBook Pro (2.4 GHz Intel i7, 16 GB of RAM). I didn't have an implementation of the Travelling Salesman on hand for testing; however, I did have an implementation of the Johnson-Trotter algorithm that I wrote. The Johnson-Trotter algorithm generates all the possible permutations of a sequence of length 'n'. For instance, if you run it on a sequence of length 3, you get the following: 123, 132, 312, 321, 231, 213. That's a total of 6 permutations, which equals 3!. In other words, it exhibits the same factorial-growth, consider-every-permutation behaviour that we're dealing with in the Travelling Salesman. I ran this algorithm for sequences of length 10 and 11 (thus generating all the permutations of each), and used the speeds measured in those tests to estimate the times for the larger values. I think you can see how finding the salesman's best route by considering every possible route is simply not feasible for any reasonably large number of cities.
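
If you want to reproduce the flavour of those measurements on your own machine, here's a rough sketch (not the Johnson-Trotter implementation mentioned above; it just leans on Python's built-in itertools.permutations, so the absolute times will differ from the table, but the growth pattern is the same):

from itertools import permutations
from time import perf_counter

# Time how long it takes simply to enumerate all n! permutations.
for n in range(8, 12):
    start = perf_counter()
    count = sum(1 for _ in permutations(range(n)))
    print(f"n={n}: {count:,} permutations in {perf_counter() - start:.2f}s")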

It's not all doom and gloom though! In the same way that humans can look at a map and know, without even thinking about it, that travelling from LA to Boston, then to Seattle, then back to Florida is going to be a horribly inefficient route, there are ways to tackle the TSP that are faster than considering every possible permutation. The best exact methods today are based on branch-and-bound (and related branch-and-cut) techniques, which you can research if you want to learn more.

The Knapsack Problem

Another problem that falls into the same NP-complete category as the Travelling Salesman, and that you might find interesting, is the Knapsack Problem. It goes like this: say you are a thief with a backpack that can hold a maximum of 50 pounds. With a collection of nearby objects of differing weights and values, you want to work out the most valuable set of items you can steal that still fits in your pack. Brute-forcing this problem means checking every possible subset of the items, so the work explodes (exponentially, with 2^n subsets for n items) just as the Travelling Salesman's routes do.
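
As a rough sketch of what that brute force looks like (the items, weights, and values below are made up purely for illustration), you simply try every subset of the items and keep the most valuable one that fits:

from itertools import combinations

capacity = 50  # pounds the backpack can hold
items = [("watch", 3, 250), ("laptop", 6, 900), ("camera", 2, 400),
         ("tv", 40, 600), ("silverware", 12, 300), ("painting", 25, 1200)]

best_value, best_set = 0, ()
for r in range(len(items) + 1):
    for subset in combinations(items, r):          # 2^n subsets in total
        weight = sum(w for _, w, _ in subset)
        value = sum(v for _, _, v in subset)
        if weight <= capacity and value > best_value:
            best_value, best_set = value, subset

print(best_value, [name for name, _, _ in best_set])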

Hope these examples piqued your interest in computability!