In theoretical computer science, a Turing machine is a model of computation in which an abstract machine operates on an infinite strip of tape divided into discrete cells. Turing machines can do everything that a real computer can do; problems that a Turing machine cannot solve are “beyond the theoretical limits of computation”.¹

Deterministic Turing machines (DTMs) have a few key components: a finite state control, a read-write head, and a tape made up of an infinite sequence of tape squares. Turing machines are like finite state automata (FAs), with a few key differences, made concrete in the sketch after this list:

  • TMs can both write to and read from the tape.
  • The read-write head can move both left and right along the tape.
  • The tape is infinite.
  • Accepting and rejecting states take effect immediately (in FAs, by contrast, we can transition out of an accepting state, and acceptance is checked only once the input is exhausted).
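
These differences can be made concrete with a small Python sketch (my own illustration; the names, such as BLANK and the dictionary-based tape, are assumptions for the example, not part of any standard API). Representing the tape as a default dictionary makes it unbounded in both directions, and the head can read, write, and move either way:

```python
from collections import defaultdict

BLANK = "_"  # blank tape symbol

# An unbounded tape: any cell never written to reads as blank,
# so the tape is effectively infinite in both directions.
tape = defaultdict(lambda: BLANK)

# Load the input string onto cells 0, 1, 2, ...
for i, ch in enumerate("0110"):
    tape[i] = ch

head = 0
symbol = tape[head]  # the head can read...
tape[head] = "1"     # ...and write (an FA can only read), ...
head -= 1            # ...and move left as well as right.
```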

Theory

Formally, a Turing machine can be defined as a 7-tuple $(Q, \Sigma, \Gamma, \delta, q_0, q_{\text{accept}}, q_{\text{reject}})$, where:

  • $Q$ is a finite set of states.
  • $\Sigma$ is a finite input alphabet not containing the blank symbol $\sqcup$.
  • $\Gamma$ is a finite tape alphabet, where $\sqcup \in \Gamma$ and $\Sigma \subseteq \Gamma$.
  • $\delta : Q \times \Gamma \to Q \times \Gamma \times \{L, R\}$ is the transition function, i.e., given a current state $q$ and the head over a tape square with symbol $a$, the machine moves to an output state $q'$, writes a symbol $b$ to replace $a$, and moves the head left or right.
  • $q_0 \in Q$ is the start state.
  • $q_{\text{accept}} \in Q$ is the accept state.
  • $q_{\text{reject}} \in Q$ is the reject state, where $q_{\text{reject}} \neq q_{\text{accept}}$.
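
To see how the 7-tuple drives a computation, here is a minimal Python sketch (my own illustration; the machine and names like run_tm are invented for the example, not taken from the source). The transition function $\delta$ becomes a lookup table, and the simulator applies it until the machine enters $q_{\text{accept}}$ or $q_{\text{reject}}$; the example machine decides the language of binary strings containing at least one 1.

```python
from collections import defaultdict

BLANK = "_"

def run_tm(delta, input_string, start="q0", accept="qa", reject="qr"):
    """Simulate a deterministic TM on input_string; True iff it accepts.
    Note: this loops forever if the machine never halts on this input."""
    tape = defaultdict(lambda: BLANK)      # unbounded tape of blanks
    for i, ch in enumerate(input_string):  # write the input at cells 0..n-1
        tape[i] = ch
    state, head = start, 0
    while state not in (accept, reject):   # accept/reject halt immediately
        state, write, move = delta[(state, tape[head])]
        tape[head] = write                 # replace the scanned symbol
        head += 1 if move == "R" else -1   # move the head one square
    return state == accept

# delta for a machine deciding "the input contains at least one 1":
delta = {
    ("q0", "0"):   ("q0", "0", "R"),    # skip 0s, keep scanning right
    ("q0", "1"):   ("qa", "1", "R"),    # found a 1: halt and accept
    ("q0", BLANK): ("qr", BLANK, "R"),  # ran off the input: halt and reject
}

assert run_tm(delta, "0010") is True
assert run_tm(delta, "0000") is False
```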

We call a formal language Turing-recognisable if there exists a Turing machine that recognises it. We call a Turing machine that always halts with a decision to accept or reject a decider. If a decider recognises a language, then it decides that language, i.e., the language is Turing-decidable. If the machine reaches either an accepting or a rejecting state, we say that it halts.
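
The same simulator makes the recogniser/decider distinction concrete. The machine defined after the 7-tuple halts on every input, so it is a decider. If we instead loop on the blank symbol (again my own toy variation, reusing run_tm and BLANK from the sketch above), the machine still recognises the same language but runs forever on strings with no 1, so it recognises the language without deciding it:

```python
# A recogniser for the same language that is NOT a decider:
looping_delta = {
    ("q0", "0"):   ("q0", "0", "R"),    # skip 0s, keep scanning right
    ("q0", "1"):   ("qa", "1", "R"),    # found a 1: halt and accept
    ("q0", BLANK): ("q0", BLANK, "R"),  # walk right forever instead of rejecting
}

run_tm(looping_delta, "0010")    # halts and accepts
# run_tm(looping_delta, "0000")  # never halts: recognised, but not decided
```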

As mentioned, Turing machines can do everything a real computer can do. In practice, we can think of Turing machines in intuitively the same way as we understand algorithms or computer programs. If no Turing machine can solve a problem, then no algorithm can solve it either. This is the Church-Turing thesis.

Footnotes

  1. From Introduction to the Theory of Computation, by Michael Sipser.