COMP30026 Models of Computation – Turing Machines and Termination
Bach Le / Anna Kalenkova
Lecture Week 10
Semester 2, 2021
Chomsky Hierarchy
Chomsky Hierarchy Again
[Nested-regions diagram of the hierarchy, shown here as a table of language classes and the machines that recognise them:]

Language class          Recognised by
Regular                 Finite automata
Context-free            Pushdown automata
Context-sensitive       Linear-bounded automata
Decidable               Halting Turing machines
Turing recognisable     Turing machines

In the diagram each class properly contains the ones listed above it.
Context-Sensitive Grammars (Not Examinable)
A context-sensitive grammar G is a 4-tuple (V, Σ, R, S), where
1 V is a finite set of variables,
2 Σ is a finite set of terminals,
3 R is a finite set of rules,
4 S is the start variable,
5 each rule is of the form (1) x → y, where x, y ∈ (V ∪ Σ)+ and |x| ≤ |y|, or (2) S → ε.
Let u, w ∈ (V ∪ Σ)∗. Then uxw ⇒ uyw iff x → y is a rule in R.
L(G) = {s ∈ Σ∗ | S ⇒∗ s}
A language which can be generated by some context-sensitive
grammar is a context-sensitive language.
Context-Sensitive Grammars (Not Examinable)
A context-sensitive grammar G is a 4-tuple (V, Σ, R, S), where
1 V is a finite set of variables,
2 Σ is a finite set of terminals,
3 R is a finite set of rules,
4 S is the start variable,
5 each rule is of the form (1) x → y, where x, y ∈ (V ∪ Σ)+ and |x| ≤ |y|, or (2) S → ε.
Theorem: Every context-free language can be generated by a
context-sensitive grammar. (Any ε-rules other than S → ε, which would
violate |x| ≤ |y|, can first be eliminated from the context-free grammar.)
Context-Sensitive Grammars (Not Examinable)
This context-sensitive grammar generates L = {a^n b^n c^n | n ≥ 1}:
S → aSBC | aBC
CB → BC
aB → ab
bB → bb
bC → bc
cC → cc
S ⇒ aSBC ⇒ aaSBCBC ⇒ aaaBCBCBC ⇒ aaaBBCCBC ⇒
aaaBBCBCC ⇒ aaaBBBCCC ⇒ aaabBBCCC ⇒ aaabbBCCC ⇒
aaabbbCCC ⇒ aaabbbcCC ⇒ aaabbbccC ⇒ aaabbbccc.
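To make the rewriting mechanics concrete, here is a small Python sketch (not part of the slides) that replays exactly this derivation, treating each rule application as a leftmost substring replacement. A hard-coded list of rule applications suffices here; finding a derivation in general would require search.

```python
# Rules of the grammar above, as lhs -> possible rhs strings.
RULES = {
    "S": ["aSBC", "aBC"],
    "CB": ["BC"],
    "aB": ["ab"],
    "bB": ["bb"],
    "bC": ["bc"],
    "cC": ["cc"],
}

def rewrite(form, lhs, rhs):
    """Apply lhs -> rhs to the leftmost occurrence of lhs in the sentential form."""
    assert rhs in RULES[lhs], "not a rule of this grammar"
    i = form.index(lhs)
    return form[:i] + rhs + form[i + len(lhs):]

# The twelve rule applications used in the derivation shown on the slide.
steps = [("S", "aSBC"), ("S", "aSBC"), ("S", "aBC"),
         ("CB", "BC"), ("CB", "BC"), ("CB", "BC"),
         ("aB", "ab"), ("bB", "bb"), ("bB", "bb"),
         ("bC", "bc"), ("cC", "cc"), ("cC", "cc")]

form = "S"
for lhs, rhs in steps:
    form = rewrite(form, lhs, rhs)
    print(form)            # reproduces S => aSBC => ... => aaabbbccc

assert form == "aaabbbccc"
```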
Context-Sensitive Grammars (Not Examinable)
A context-sensitive grammar G is a 4-tuple (V, Σ, R, S), where
1 V is a finite set of variables,
2 Σ is a finite set of terminals,
3 R is a finite set of rules,
4 S is the start variable,
5 each rule is of the form (1) x → y, where x, y ∈ (V ∪ Σ)+ and |x| ≤ |y|, or (2) S → ε.
Let u, w ∈ (V ∪ Σ)∗. Then uxw ⇒ uyw iff x → y is a rule in R.
L(G) = {s ∈ Σ∗ | S ⇒∗ s}
Context-sensitive languages are recognised by linear-bounded
automata (Turing machines whose tape is restricted to a portion whose
length is linear in the length of the input).
Unrestricted Grammars (Not Examinable)
An unrestricted grammar G is a 4-tuple (V, Σ, R, S), where
1 V is a finite set of variables,
2 Σ is a finite set of terminals,
3 R is a finite set of rules,
4 S is the start variable.
Let u, w ∈ (V ∪ Σ)∗. Then uxw ⇒ uyw iff x → y is a rule in R.
L(G) = {s ∈ Σ∗ | S ⇒∗ s}
The languages generated by unrestricted grammars are exactly the
languages recognised by Turing machines (the Turing-recognisable languages).
Turing Machines
Turing Machines
A Turing machine has an unbounded tape through which it takes its
input and performs its computations.
Unlike our previous automata it can both read from and write to the
tape, and move either left or right over the tape.
[Diagram: a finite state control with a tape head positioned over an
unbounded tape containing b a a ⊔ ⊔ · · ·]
The machine has distinct accept and reject states, in which it
accepts/rejects irrespective of where its tape head is.
Turing Machines Formally
A Turing machine is a 7-tuple M = (Q, Σ, Γ, δ, q0, qa, qr) where
Q is a finite set of states,
Γ is a finite tape alphabet, which includes the blank character ⊔,
Σ ⊆ Γ \ {⊔} is the input alphabet,
δ : Q × Γ → Q × Γ × {L, R} is the transition function,
q0 is the initial state,
qa is the accept state, and
qr (≠ qa) is the reject state.
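As a minimal sketch (not from the slides), the 7-tuple can be written down directly as data in Python; the names TuringMachine and BLANK are ours, and δ becomes a dictionary from (state, symbol) pairs to (state, symbol, direction) triples.

```python
from dataclasses import dataclass

BLANK = "_"   # stands in for the blank tape symbol ⊔

@dataclass
class TuringMachine:
    states: set           # Q
    input_alphabet: set   # Σ (must not contain the blank)
    tape_alphabet: set    # Γ (contains Σ and the blank)
    delta: dict           # (state, symbol) -> (state, symbol, "L" or "R")
    start: str            # q0
    accept: str           # qa
    reject: str           # qr
```

A concrete machine then amounts to filling in these seven components; the examples on the following slides can be written in exactly this form.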
The Transition Function
A transition δ(qi, x) = (qj, y, d) depends on two things:
1 the current state qi, and
2 the current symbol x under the tape head.
It consists of three actions:
1 change state to qj,
2 overwrite tape symbol x with y, and
3 move the tape head in direction d.
Drawing Turing Machines
We use a graphical notation for Turing machines similar to the one
for finite automata.
On an arrow from qi to qj we write
x → d when δ(qi, x) = (qj, x, d), and
x → y, d when δ(qi, x) = (qj, y, d), where y ≠ x.
Turing Machine Example 1
[State diagram: states q0 (start), q1, qa and qr. From q0, a → R leads
to q1 while b → R loops on q0; from q1, a → R leads to qa while b → R
returns to q0; in q0 and q1, ⊔ → R leads to qr.]
This machine recognises the regular language (a ∪ b)∗aa(a ∪ b)∗.
We can leave the reject state qr out, with the understanding that
transitions that are not specified go to qr.
Turing Machine Configurations
We write uqv for this configuration:
u
︷ ︸︸ ︷
v
︷ ︸︸ ︷
· · · all blank · · ·
q
On input aba, the example machine goes through 5 configurations:
εq0aba (or just q0aba)
⇒ aq1ba
⇒ abq0a
⇒ abaq1⊔
⇒ aba⊔qr
We read the relation ‘⇒’ as “yields”.
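This trace can be reproduced with a short simulation sketch in Python (not part of the slides). The transition table below is our reconstruction of the Example 1 machine from its diagram and this trace, so treat it as an assumption; each configuration is printed in the uqv notation just introduced.

```python
BLANK = "_"   # stands in for the blank symbol ⊔

# Reconstructed transition function of the Example 1 machine:
# (state, symbol) -> (next state, symbol written, head movement)
delta = {
    ("q0", "a"):   ("q1", "a",   "R"),
    ("q0", "b"):   ("q0", "b",   "R"),
    ("q0", BLANK): ("qr", BLANK, "R"),
    ("q1", "a"):   ("qa", "a",   "R"),
    ("q1", "b"):   ("q0", "b",   "R"),
    ("q1", BLANK): ("qr", BLANK, "R"),
}

def run(w, start="q0", accept="qa", reject="qr", max_steps=100):
    """Simulate the machine on input w, printing each configuration uqv."""
    tape, head, state = (list(w) or [BLANK]), 0, start
    while True:
        print("".join(tape[:head]) + state + "".join(tape[head:]))
        if state in (accept, reject) or max_steps == 0:
            return state == accept   # the step bound guards against non-halting runs
        state, tape[head], move = delta[(state, tape[head])]
        head = head + 1 if move == "R" else max(head - 1, 0)   # the head cannot fall off the left end
        if head == len(tape):
            tape.append(BLANK)       # the tape is unbounded: grow it with blanks on demand
        max_steps -= 1

run("aba")   # prints q0aba, aq1ba, abq0a, abaq1_, aba_qr_
```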
Computations Formally
For all qi, qj ∈ Q, a, b, c ∈ Γ, and u, v ∈ Γ∗, we have
uqibv ⇒ ucqjv if δ(qi, b) = (qj, c, R)
qibv ⇒ qjcv if δ(qi, b) = (qj, c, L)
uaqibv ⇒ uqjacv if δ(qi, b) = (qj, c, L)
The start configuration of M on input w is q0w.
M accepts w iff there is a sequence of configurations C1, C2, . . . , Ck
such that
1 C1 is the start configuration q0w,
2 Ci ⇒ Ci+1 for all i ∈ {1, . . . , k − 1}, and
3 the state of Ck is qa.
Turing Machines and Languages
The set of strings accepted by M is the language of M, L(M).
A language A is Turing recognisable (or recursively enumerable, or
just r.e.) iff A = L(M) for some Turing machine M.
Note carefully that three behaviours are possible for M on input w :
M may accept w , reject w , or fail to halt.
If A is recognised by a Turing machine M that halts on all input, we
say that M decides A.
A language is Turing decidable (or recursive, or just decidable) iff
some Turing machine decides it.
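To make the three behaviours concrete, here is a small illustrative toy in Python (ours, not a real Turing machine): configurations are just numbers, even inputs eventually accept, the input 1 rejects, and every other odd input loops forever, so a bounded simulation can only ever report "no answer yet" for it.

```python
def step(n):
    # even numbers count down towards 0; odd numbers other than 1 never change
    return n - 2 if n % 2 == 0 and n >= 2 else n

def run(n, budget=1000):
    for _ in range(budget):
        if n == 0:
            return "accept"
        if n == 1:
            return "reject"
        n = step(n)
    return "no answer yet"   # the run may simply never halt

print(run(6))   # accept
print(run(1))   # reject
print(run(7))   # no answer yet: this computation loops forever
```

A decider corresponds to the case where the third outcome can never occur, whatever the input.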
Turing Machine Example 2
This machine decides the language {0^(2^n) | n ≥ 0}.
It has input alphabet {0} and tape alphabet {⊔, 0, x, #}.
[State diagram: states q0, q1, q2, q3, q4 and qa; the transition labels
include 0 → #, R; 0 → x, R; x → R; # → R; 0 → L; x → L; 0 → R;
⊔ → R and ⊔ → L.]
Running the machine on input 000:
q0000 ⇒ #q100 ⇒ #xq20 ⇒ #x0q3⊔ ⇒ #x0⊔qr, so 000 is rejected
(3 is not a power of 2).
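The usual machine for this language works by repeated halving: each left-to-right sweep crosses off every other 0, rejecting as soon as a sweep sees an odd number (greater than one) of uncrossed 0s, and accepting once exactly one 0 remains. Assuming the diagram above follows this standard design, here is the same idea as a plain Python loop (a sketch, not a transcription of the diagram).

```python
def is_power_of_two_zeros(w):
    """Decide membership in {0^(2^n) | n >= 0} by the repeated-halving idea:
    each sweep of the Turing machine crosses off every other 0."""
    if w == "" or set(w) != {"0"}:
        return False
    count = len(w)               # number of uncrossed 0s
    while count > 1:
        if count % 2 == 1:       # an odd number (> 1) of 0s can never halve down to 1
            return False
        count //= 2              # one sweep crosses off half of the 0s
    return True                  # exactly one 0 left: the input length was a power of 2

assert is_power_of_two_zeros("0000")      # 4 = 2^2
assert not is_power_of_two_zeros("000")   # 3 is not a power of two
assert is_power_of_two_zeros("0")         # 1 = 2^0
```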
The Versatility of Turing Machines
We can let a Turing machine produce output (not just accept/reject)
through its tape. In this way a Turing machine can serve as a general
computing device, not just a language recogniser.
We can capture data other than strings via suitable representations.
For example, natural numbers can be represented as strings of (unary,
binary, decimal, . . . ) digits, so a Turing machine can compute
number theoretic functions N → N.
Or, by suitable encoding, it can take multiple arguments, and/or
return multiple results.
A Turing machine can also solve graph problems, once we decide on
a suitable representation for graphs.
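For instance, here is a small sketch (illustrative names, not from the slides) of one possible tape encoding for pairs of natural numbers, and of addition viewed as the kind of string-to-string function a Turing machine could compute.

```python
def encode_pair(m, n):
    """Encode two natural numbers in unary, separated by a marker symbol."""
    return "1" * m + "#" + "1" * n

def decode_pair(w):
    left, right = w.split("#")
    return len(left), len(right)

# Addition as a string transformation: a Turing machine computing m + n
# would turn the input encoding on its tape into the output encoding.
def add_as_string_function(w):
    m, n = decode_pair(w)
    return "1" * (m + n)

assert add_as_string_function(encode_pair(3, 4)) == "1111111"
```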
Robustness of Turing Machines
Different books use different definitions of Turing machines.
Most differences are minute and technical and aim at making the
machines easier to program (for example, we may insist that
machines start with a tape that has the first cell blank, and they try
to leave that cell blank—to make it easier to compose machines).
Similarly, in addition to the two kinds of tape movement, we can
allow a ‘no move’ option.
Turing machines are robust in the sense that such changes to the
machinery do not affect what the machines are capable of computing.
Variants of Turing Machines
In an attempt to make the Turing machine more powerful we could
add more radically to its features:
Let its tape extend indefinitely in both directions.
Let its tape have multiple tracks.
Let there be several tapes, each with its independent tape head.
Add nondeterminism.
However, none of this increases a Turing machine’s capabilities as a
language recogniser.
Turing Machine Example 3
This machine decides the language {a^i b^j c^k | k = i·j, i, j > 0}.
Its tape alphabet is {⊔, a, b, c, A, B, C}.
[State diagram: states 1–9 and qa; the transition labels include
a → A, R; b → B, R; c → C, L; B → b, L; A → R; B → R; C → R; C → L;
⊔ → L; a → L; b → L; a → R and b → R.]
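When designing or checking such a machine it helps to have an independent membership test to compare against; the following Python sketch (ours, based only on the language's definition) parses the block lengths and checks the arithmetic condition.

```python
import re

def in_language(w):
    """Membership test for {a^i b^j c^k | k = i*j, i, j > 0}."""
    m = re.fullmatch(r"(a+)(b+)(c*)", w)
    if m is None:
        return False
    i, j, k = (len(g) for g in m.groups())
    return k == i * j

assert in_language("abc")            # 1*1 = 1
assert in_language("aabbbcccccc")    # 2*3 = 6
assert not in_language("aabbccc")    # 2*2 != 3
```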
Exercise
Design a Turing machine with input alphabet {a, b, c} which decides
the language A = {a^i b^j c^k | i, j, k > 0 ∧ i + j ≥ k}.
[State diagram of one solution: states 1–8 and qa; the transition labels
include a → A, R; b → a, R; c → R; c → C, L; A → R; C → L; a → L;
A → L; a → R; c → L and ⊔ → L.]
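One possible high-level strategy for the exercise (a sketch of a common approach, not necessarily what the diagram above does): check that the input consists of a non-empty block of a's, then b's, then c's, and cancel each c against a distinct still-unmarked a or b; the input is in A exactly when every c can be cancelled. A Python version of this strategy, useful as a reference check:

```python
import re

def in_A(w):
    """Membership test for A = {a^i b^j c^k | i, j, k > 0 and i + j >= k},
    done the way a marking Turing machine might: cancel each c against one
    still-unmarked a or b."""
    if re.fullmatch(r"a+b+c+", w) is None:
        return False                      # wrong shape: not in A
    tape = list(w)
    for sym in tape:
        if sym != "c":
            continue
        for q, t in enumerate(tape):      # look for an unmarked a or b
            if t in "ab":
                tape[q] = "X"             # mark it as cancelled
                break
        else:
            return False                  # no a or b left for this c
    return True

assert in_A("abc")        # 1 + 1 >= 1
assert in_A("aabccc")     # 2 + 1 >= 3
assert not in_A("abccc")  # 1 + 1 < 3
```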