
COMP2610/6261 – Information Theory Lecture 17: Noisy Channels
Research School of Computer Science
2 October, 2018

1 Communication over Noisy Channels: Big Picture
2 Noisy Channels: Formally
3 Examples of Channels
4 Probability of Error

The Big Picture
9.1 The big picture

[Figure 9.1: source coding (Compressor / Decompressor) wrapped around channel coding (Encoder / Decoder), with the Noisy channel in the middle]

"In Chapters 4-6, we discussed source coding with block codes, symbol codes and stream codes. We implicitly assumed that the channel from the compressor to the decompressor was noise-free. Real channels are noisy. We will now spend two chapters on the subject of noisy-channel coding: the fundamental possibilities and limitations of error-free communication through a noisy channel."

The Big Picture
9.1 The big picture

Source coding (Compressor / Decompressor):
Concept: Expected code length
Theorem: Source coding theorem
Algorithms: { Huffman, Arithmetic } codes

The Big Picture
9.1 The big picture

Channel coding (Encoder / Decoder, around the Noisy channel):
Concept: Channel capacity
Theorem: Channel coding theorem
Algorithms: Repetition codes, Hamming codes

Communication over Noisy Channels
A channel is some medium for transmitting messages
A noisy channel is a channel that potentially introduces errors in the
sender’s message
The Problem of Communication
“The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point.” (Shannon, 1948)

Example: Telephone Network
[Diagram: “Hi” spoken at the source, “Hi” heard at the destination]
Source :
Encoder : Telephone handset
Channel : Analogue telephone line
Decoder : Telephone handset
Destination : Mark

Key Questions
How do we model noisy communication abstractly?
What are the theoretical limits of noise correction?
What are the practical approaches to noise correction?

1 Communication over Noisy Channels: Big Picture
2 Noisy Channels: Formally
3 Examples of Channels
4 Probability of Error

Encoder/Decoder Pairs
Suppose we have some set S = {1, 2, . . . , S} of possible messages e.g. codewords from Huffman coding on some ensemble
Sender and receiver agree on what these are

Encoder/Decoder Pairs
Suppose we have some set S = {1, 2, . . . , S} of possible messages e.g. codewords from Huffman coding on some ensemble
Sender and receiver agree on what these are
When communicating over a channel, the sender must encode messages into some input alphabet X

Encoder/Decoder Pairs
Suppose we have some set S = {1, 2, . . . , S} of possible messages e.g. codewords from Huffman coding on some ensemble
Sender and receiver agree on what these are
When communicating over a channel, the sender must encode messages
into some input alphabet X
The receiver then receives some (possibly corrupted) element from an
output alphabet Y
Simple case: X = Y = {0, 1}
The bit the sender transmits may not be what the receiver sees

Encoder/Decoder Pairs
Formally, the sender encodes messages via
enc: S → X^N

Encoder/Decoder Pairs
Formally, the sender encodes messages via
enc: S → X^N
The receiver then decodes messages via
dec: Y^N → S

Encoder/Decoder Pairs
Formally, the sender encodes messages via
enc: S → X^N
The receiver then decodes messages via
dec: Y^N → S
Isn’t the compressor already “encoding” a message?
Yes, but we might want to add something for noise tolerance
Might have X ̸= Y
e.g. if we allow a special “erased” symbol
N > 1 can be thought of as multiple uses of a channel, e.g. use a bitstring of length 4 to represent messages
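As a purely illustrative sketch (the codeword table and function names below are assumptions, not the course's scheme), an enc/dec pair with N = 2 channel uses might look like:

```python
# Minimal sketch of an encoder/decoder pair (illustrative only).
# Messages S = {1, 2, 3, 4}, input/output alphabets X = Y = {0, 1}, N = 2.

CODEBOOK = {1: "00", 2: "01", 3: "10", 4: "11"}   # enc: S -> X^N
REVERSE = {v: k for k, v in CODEBOOK.items()}      # dec: Y^N -> S

def encode(message: int) -> str:
    """enc: map a message to a length-N string over the input alphabet."""
    return CODEBOOK[message]

def decode(received: str) -> int:
    """dec: map a received length-N string back to a message.
    With no added redundancy, any corrupted string decodes to the wrong message."""
    return REVERSE[received]

print(encode(3))        # "10"
print(decode("10"))     # 3
```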

Channels: Informally
A discrete channel Q will:
accept an input from X
produce an output from Y

Channels: Informally
A discrete channel Q will:
accept an input from X
produce an output from Y
There is a probability of observing various outputs, given an input
This represents some inherent noise
Noise could depend on the input

Channels: Formally
A discrete channel Q consists of:
an input alphabet X = {a1,…,aI}
an output alphabet Y = {b1,…,bJ}
transition probabilities P(y|x).

Channels: Formally
A discrete channel Q consists of:
an input alphabet X = {a1,…,aI}
an output alphabet Y = {b1,…,bJ}
transition probabilities P(y|x).
The channel Q can be expressed as a matrix
Qj,i = P(y = bj | x = ai)
This represents the probability of observing bj given that we transmit ai

Channels: Example
Example: A channel Q with inputs X = {a1, a2, a3}, outputs
Y = {b1, b2}, and transition probabilities expressed by the matrix
Q = [ 0.8  0.5  0.2 ]
    [ 0.2  0.5  0.8 ]
So P(b1|a1) = 0.8 = P(b2|a3) and P(b1|a2) = P(b2|a2) = 0.5.

Channels: Example
Example: A channel Q with inputs X = {a1, a2, a3}, outputs
Y = {b1, b2}, and transition probabilities expressed by the matrix
Q = [ 0.8  0.5  0.2 ]
    [ 0.2  0.5  0.8 ]
So P(b1|a1) = 0.8 = P(b2|a3) and P(b1|a2) = P(b2|a2) = 0.5.
We arrange the inputs along the columns, and the outputs along the rows.
The actual details of the alphabet are abstracted away.
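A quick numerical check of this example (a sketch; the numpy representation and variable names are my own): every column of Q is a conditional distribution and so sums to 1, and applying Q to an input distribution gives the output distribution.

```python
import numpy as np

# Columns are inputs a1, a2, a3; rows are outputs b1, b2.
Q = np.array([[0.8, 0.5, 0.2],
              [0.2, 0.5, 0.8]])

# Each column is a conditional distribution P(y | x = a_i), so it sums to 1.
assert np.allclose(Q.sum(axis=0), 1.0)

# Given an input distribution p(x), the output distribution is p(y) = Q @ p(x).
p_x = np.array([1/3, 1/3, 1/3])
p_y = Q @ p_x
print(p_y)   # [0.5 0.5]
```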

1 Communication over Noisy Channels: Big Picture
2 Noisy Channels: Formally
3 Examples of Channels
4 Probability of Error

The Binary Noiseless Channel
One of the simplest channels is the Binary Noiseless Channel. The received symbol is always equal to the transmitted symbol – there is no probability of error, hence noiseless.
Inputs X = {0, 1}; Outputs Y = {0, 1}; Transition probabilities
Q = [ 1  0 ]
    [ 0  1 ]

The Binary Noiseless Channel
One of the simplest channels is the Binary Noiseless Channel. The received symbol is always equal to the transmitted symbol – there is no probability of error, hence noiseless.
Inputs X = {0, 1}; Outputs Y = {0, 1}; Transition probabilities
Q = [ 1  0 ]
    [ 0  1 ]
What was transmitted over the channel if 0000 1111 was received?
−→ 0000 1111

The Binary Noiseless Channel
One of the simplest channels is the Binary Noiseless Channel. The received symbol is always equal to the transmitted symbol – there is no probability of error, hence noiseless.
Inputs X = {0, 1}; Outputs Y = {0, 1}; Transition probabilities
Q = [ 1  0 ]
    [ 0  1 ]
What was transmitted over the channel if 0000 1111 was received?
0000 1111 −→ 0000 1111

The Noisy Non-overlapping Channel
Even if there is some uncertainty about the output given the input, it may still be possible to perfectly infer what was transmitted.
Inputs X = {0, 1}; Outputs
Y = {a, b, c, d}; Transition probabilities
Q = [ 0.5   0  ]
    [ 0.5   0  ]
    [  0   0.5 ]
    [  0   0.5 ]

The Noisy Non-overlapping Channel
Even if there is some uncertainty about the output given the input, it may still be possible to perfectly infer what was transmitted.
Inputs X = {0, 1}; Outputs
Y = {a, b, c, d}; Transition probabilities
What was transmitted over the channel if abba ddcd was received?
Q = [ 0.5   0  ]
    [ 0.5   0  ]
    [  0   0.5 ]
    [  0   0.5 ]
−→ abba ddcd

The Noisy Non-overlapping Channel
Even if there is some uncertainty about the output given the input, it may still be possible to perfectly infer what was transmitted.
Inputs X = {0, 1}; Outputs
Y = {a, b, c, d}; Transition probabilities
What was transmitted over the channel if abba ddcd was received?
Q = [ 0.5   0  ]
    [ 0.5   0  ]
    [  0   0.5 ]
    [  0   0.5 ]
0000 1111 −→ abba ddcd

The Binary Symmetric Channel
Each symbol sent across a binary symmetric channel has a chance of being “flipped” to its counterpart (0 → 1; 1 → 0)
Inputs X = {0, 1}; Outputs Y = {0, 1}; Transition probabilities with P(flip) = f
Q = [ 1−f    f  ]
    [  f    1−f ]

The Binary Symmetric Channel
Each symbol sent across a binary symmetric channel has a chance of being “flipped” to its counterpart (0 → 1; 1 → 0)
Inputs X = {0, 1}; Outputs Y = {0, 1}; Transition probabilities with P(flip) = f
Q = [ 1−f    f  ]
    [  f    1−f ]
What was most likely transmitted over the channel if 0010 1001 was received, assuming f = 0.1 and P(x = 0) = P(x = 1) = 0.5?
−→ 0010 1001

The Binary Symmetric Channel
Each symbol sent across a binary symmetric channel has a chance of being “flipped” to its counterpart (0 → 1; 1 → 0)
Inputs X = {0, 1}; Outputs Y = {0, 1}; Transition probabilities with P(flip) = f
Q = [ 1−f    f  ]
    [  f    1−f ]
What was most likely transmitted over the channel if 0010 1001 was received, assuming f = 0.1 and P(x = 0) = P(x = 1) = 0.5?
0010 1001 −→ 0010 1001

Inferring the Input
Suppose we know P(x), the probability that x is transmitted.
Given a particular output y ∈ Y received over a channel Q, how likely was it that input x ∈ X was transmitted?

Inferring the Input
Suppose we know P(x), the probability that x is transmitted.
Given a particular output y ∈ Y received over a channel Q, how likely was
it that input x ∈ X was transmitted?
P(x|y) = P(y|x) P(x) / P(y) = P(y|x) P(x) / Σ_{x′∈X} P(y|x′) P(x′)
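As a sketch (the function name is mine; the indexing follows Qj,i = P(y = bj | x = ai) from the earlier slide), this posterior can be computed directly from the channel matrix and a prior:

```python
import numpy as np

def posterior(Q, p_x, j):
    """P(x = a_i | y = b_j) for every input a_i.

    Q[j, i] = P(y = b_j | x = a_i); p_x is the prior over inputs.
    """
    joint = Q[j, :] * p_x          # P(y = b_j | x) * P(x), elementwise over inputs
    return joint / joint.sum()     # normalise by P(y = b_j)

# Using the earlier 2-output, 3-input example channel with a uniform prior:
Q = np.array([[0.8, 0.5, 0.2],
              [0.2, 0.5, 0.8]])
print(posterior(Q, np.array([1/3, 1/3, 1/3]), j=0))   # P(x | y = b1)
```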

Inferring the Input: Example
Suppose P(x = 0) = P(x = 1) = 0.5. What is the probability that x = 0 was transmitted over a binary symmetric channel Q with f = 0.1, given that y = 0 was received?

Inferring the Input: Example
Suppose P(x = 0) = P(x = 1) = 0.5. What is the probability that x = 0 was transmitted over a binary symmetric channel Q with f = 0.1, given that y = 0 was received?
P(x = 0 | y = 0) = (0.9 × 0.5) / (0.9 × 0.5 + 0.1 × 0.5) = 0.9
Similarly, P(x = 1 | y = 1) = 0.9.

Inferring the Input: Example
Suppose P(x = 0) = P(x = 1) = 0.5. What is the probability that x = 0 was transmitted over a binary symmetric channel Q with f = 0.1, given that y = 0 was received?
P(x = 0 | y = 0) = (0.9 × 0.5) / (0.9 × 0.5 + 0.1 × 0.5) = 0.9
Similarly, P(x = 1 | y = 1) = 0.9.
What if P(x = 0) = 0.01?

Inferring the Input: Example
Suppose P(x = 0) = P(x = 1) = 0.5. What is the probability that x = 0 was transmitted over a binary symmetric channel Q with f = 0.1, given that y = 0 was received?
P(x = 0 | y = 0) = (0.9 × 0.5) / (0.9 × 0.5 + 0.1 × 0.5) = 0.9
Similarly, P(x = 1 | y = 1) = 0.9.
What if P(x = 0) = 0.01?
P(x = 0 | y = 0) = (0.9 × 0.01) / (0.9 × 0.01 + 0.1 × 0.99) ≈ 0.0833
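These two numbers can be checked directly (a minimal sketch; the helper name is just for illustration):

```python
f = 0.1

def p_x0_given_y0(p0):
    """P(x = 0 | y = 0) for a binary symmetric channel with flip probability f."""
    p1 = 1 - p0
    return (1 - f) * p0 / ((1 - f) * p0 + f * p1)

print(p_x0_given_y0(0.5))    # 0.9
print(p_x0_given_y0(0.01))   # ~0.0833
```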

The Z Channel
Symbols may be corrupted over the channel asymmetrically.
Inputs X = {0, 1}; Outputs Y = {0, 1}; Transition probabilities
Q = [ 1    f  ]
    [ 0   1−f ]

The Z Channel
Symbols may be corrupted over the channel asymmetrically.
Inputs X = {0, 1}; Outputs Y = {0, 1}; Transition probabilities
Q = [ 1    f  ]
    [ 0   1−f ]
Inferring the input: Clearly P(x = 1 | y = 1) = 1, but
P(x = 0 | y = 0) = P(y = 0 | x = 0) P(x = 0) / Σ_{x′∈X} P(y = 0 | x′) P(x′) = P(x = 0) / (P(x = 0) + f P(x = 1))
So P(x = 0 | y = 0) → 1 as f → 0, and goes to P(x = 0) as f → 1.
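A quick numerical check of these limits (a sketch; the default prior of 0.5 and the helper name are assumptions for illustration):

```python
def z_posterior(f, p0=0.5):
    """P(x = 0 | y = 0) for a Z channel with error probability f."""
    p1 = 1 - p0
    return p0 / (p0 + f * p1)

for f in (0.001, 0.5, 0.999):
    print(f, z_posterior(f))   # approaches 1 as f -> 0, approaches p0 as f -> 1
```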

The Binary Erasure Channel
We can model a channel which “erases” bits by letting one of the output symbols be the symbol ’?’ with associated probability f . The receiver knows which bits are erased.
Inputs X = {0, 1}; Outputs Y = {0, ?, 1}; Transition probabilities
1−f 0 Q=ff
0 1−f 0000 1111 −→ 00?0 ?11?

The Noisy Typewriter Channel
This channel simulates a noisy “typewriter”. Inputs and outputs are the 26 letters A through Z plus space. With probability 1/3 each, a letter is either unchanged, changed to the next letter, or changed to the previous letter.

The Noisy Typewriter Channel
This channel simulates a noisy “typewriter”. Inputs and outputs are the 26 letters A through Z plus space. With probability 1/3 each, a letter is either unchanged, changed to the next letter, or changed to the previous letter.
Inputs X = {A,B,…,Z, }; Outputs Y = {A,B,…,Z, }; Transition probabilities
1 1 0 0 ··· 0 1 3 3 3
AA BB CC DD
1110···00 3 3 3 
0 1 1 1 ··· 0 0 Q= 3 3 3 
. . . . … . .
100···11 333

The Noisy Typewriter Channel
This channel simulates a noisy “typewriter”. Inputs and outputs are the 26 letters A through Z plus space. With probability 1/3 each, a letter is either unchanged, changed to the next letter, or changed to the previous letter.
Inputs X = {A,B,…,Z, }; Outputs Y = {A,B,…,Z, }; Transition probabilities
1 1 0 0 ··· 0 1 3 3 3
AA BB CC DD
The transition matrix for this channel has a diagonal structure: all of the probability mass is concentrated around the diagonal.
1110···00 3 3 3 
0 1 1 1 ··· 0 0 Q= 3 3 3 
. . . . … . . 100···11

The Noisy Typewriter Channel

[Figure (MacKay, p. 148): summary of the model channels. Binary symmetric channel: X = {0,1}, Y = {0,1}, with P(y=0|x=0) = 1−f, P(y=1|x=0) = f, P(y=0|x=1) = f, P(y=1|x=1) = 1−f. Binary erasure channel: X = {0,1}, Y = {0,?,1}, with erasure probability f. Z channel: X = {0,1}, Y = {0,1}, where only a transmitted 1 can flip to 0, with probability f. Noisy typewriter: the letters A, B, …, Z, '-' are arranged in a circle; when the typist attempts to type B, what comes out is A, B or C with probability 1/3 each, and so forth, with the final letter '-' adjacent to the first letter A. Its transition matrix over A–Z and '-' is concentrated around the diagonal.]

1 Communication over Noisy Channels: Big Picture
2 Noisy Channels: Formally
3 Examples of Channels
4 Probability of Error

Communicating over Noisy Channels
Suppose we know we have to communicate over some channel Q and we want to build an encoder/decoder pair to reliably send a message s over Q.
sin −→ Encoder −→ x −→ Q −→ y −→ Decoder −→ sout

Communicating over Noisy Channels
Suppose we know we have to communicate over some channel Q and we want to build an encoder/decoder pair to reliably send a message s over Q.
sin −→ Encoder −→ x −→ Q −→ y −→ Decoder −→ sout
Reliability is measured via probability of error — that is, the probability of incorrectly decoding sout given sin as input:
P(sout ≠ sin) = Σ_s P(sout ≠ sin | sin = s) P(sin = s)

Probability of Error: Example
Let S = {a, b} and X = Y = {0, 1}
Assume encoder: a → 0; b → 1, and decoder: 0 → a; 1 → b.
Consider binary symmetric Q,
with f = 0.1
Q = [ 1−f    f  ]
    [  f    1−f ]

Probability of Error: Example
If base probabilities of symbol transmission are (pa, pb) = (0.5, 0.5),
P(sin ≠ sout) = P(sin = a, sout = b) + P(sin = b, sout = a)

Probability of Error: Example
If base probabilities of symbol transmission are (pa, pb) = (0.5, 0.5),
P(sin ≠ sout) = P(sin = a, sout = b) + P(sin = b, sout = a)
             = P(sout = b | sin = a) P(sin = a) + P(sout = a | sin = b) P(sin = b)

Probability of Error: Example
If base probabilities of symbol transmission are (pa, pb) = (0.5, 0.5),
P(sin ≠ sout) = P(sin = a, sout = b) + P(sin = b, sout = a)
             = P(sout = b | sin = a) P(sin = a) + P(sout = a | sin = b) P(sin = b)
             = P(y = 1 | x = 0) pa + P(y = 0 | x = 1) pb

Probability of Error: Example
If base probabilities of symbol transmission are (pa, pb) = (0.5, 0.5),
P(sin ≠ sout) = P(sin = a, sout = b) + P(sin = b, sout = a)
             = P(sout = b | sin = a) P(sin = a) + P(sout = a | sin = b) P(sin = b)
             = P(y = 1 | x = 0) pa + P(y = 0 | x = 1) pb
             = f

Probability of Error: Example
If base probabilities of symbol transmission are (pa, pb) = (0.5, 0.5),
P(sin ≠ sout) = P(sin = a, sout = b) + P(sin = b, sout = a)
             = P(sout = b | sin = a) P(sin = a) + P(sout = a | sin = b) P(sin = b)
             = P(y = 1 | x = 0) pa + P(y = 0 | x = 1) pb
             = f
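This P(error) = f result can be checked with a small Monte Carlo sketch (the encoder/decoder tables and function names are assumptions for illustration):

```python
import random

f = 0.1
ENC = {"a": 0, "b": 1}
DEC = {0: "a", 1: "b"}

def send(bit):
    """Pass one bit through a binary symmetric channel with flip probability f."""
    return bit ^ 1 if random.random() < f else bit

trials = 100_000
errors = sum(DEC[send(ENC[s])] != s
             for s in (random.choice("ab") for _ in range(trials)))
print(errors / trials)   # ~0.1 = f
```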

A Simple Coding Scheme
Suppose s ∈ {a, b} and we encode by a → 000 and b → 111.
To decode, we take a majority vote over the three received bits to determine s:
A = {000, 001, 010, 100} → a   and   B = {111, 110, 101, 011} → b

A Simple Coding Scheme
Suppose s ∈ {a, b} and we encode by a → 000 and b → 111.
To decode, we take a majority vote over the three received bits to determine s:
A = {000, 001, 010, 100} → a   and   B = {111, 110, 101, 011} → b
If the channel Q is binary symmetric with f = 0.1 again
P(sin ≠ sout) = P(y ∈ B | 000) pa + P(y ∈ A | 111) pb
             = [f^3 + 3f^2(1−f)] pa + [f^3 + 3f^2(1−f)] pb
             = f^3 + 3f^2(1−f) = 0.028

A Simple Coding Scheme
Suppose s ∈ {a, b} and we encode by a → 000 and b → 111.
To decode, we take a majority vote over the three received bits to determine s:
A = {000, 001, 010, 100} → a   and   B = {111, 110, 101, 011} → b
If the channel Q is binary symmetric with f = 0.1 again
P(sin ≠ sout) = P(y ∈ B | 000) pa + P(y ∈ A | 111) pb
             = [f^3 + 3f^2(1−f)] pa + [f^3 + 3f^2(1−f)] pb
             = f^3 + 3f^2(1−f) = 0.028
So the error has dropped from 0.1 to 0.028
but so has the rate: from 1 symbol/bit to 1/3 symbol/bit.
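Both the analytic value and a quick simulation of the majority-vote decoder can be sketched as follows (the helper names are mine):

```python
import random

f = 0.1

# Analytic error probability for the repetition code over a BSC:
# decoding fails when 2 or 3 of the 3 transmitted bits are flipped.
p_err = f**3 + 3 * f**2 * (1 - f)
print(p_err)                      # 0.028

def send3(bit):
    """Send one source bit three times through a BSC(f) and majority-decode."""
    received = [bit ^ 1 if random.random() < f else bit for _ in range(3)]
    return int(sum(received) >= 2)

trials = 100_000
errors = sum(send3(b) != b for b in (random.randint(0, 1) for _ in range(trials)))
print(errors / trials)            # ~0.028
```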

A Simple Coding Scheme
[Figure: probability of error vs. rate (0 to 1) for repetition codes, with the region of "more useful codes" marked]

A Simple Coding Scheme
[Figure repeated: probability of error vs. rate for repetition codes, with the region of "more useful codes" marked]

Can we make the error arbitrarily small without the rate going to zero?

Summary and Reading
Main Points:
Modelling Noisy Channels
- Noiseless, Non-overlapping, Symmetric, Z, Erasure
Simple Coding via Repetition
- Probability of Error vs Transmission Rate
Reading:
MacKay §9.1 – §9.5
Cover & Thomas §7.1 – §7.3
