
COMP2610/6261 – Information Theory Lecture 21: Hamming Codes & Coding Review
Research School of Computer Science
16 October, 2018

Today’s plan
Today's plan

The Noisy Channel Coding Theorem proves that there exist codes with rate R < C and arbitrarily low probability of error. But the proof was non-constructive: we used a random code in order to be able to actually calculate the error probability. What about constructive codes?

We will focus on linear codes and look at two simple linear codes:
  repetition codes
  Hamming codes
We will sketch what can be said about the rate and reliability of the latter.

1 Repetition Codes
2 The (7,4) Hamming code
  Coding
  Syndrome Decoding
  Multiple errors
  Error Probabilities
3 Coding: Review

Repetition Codes

Simplest channel code: add redundancy by repeating every bit of the message (say) 3 times:

  Source bit | Transmitted sequence
  0          | 000
  1          | 111

This repetition code is called R3.
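The R3 encode/decode cycle can be sketched in a few lines of Python (a minimal illustration; the function names are mine, not from the lecture):

```python
def encode_r3(bits):
    """Encode a bit sequence with the R3 repetition code: repeat each bit 3 times."""
    return [b for b in bits for _ in range(3)]

def decode_r3(received):
    """Decode by majority vote within each block of 3 received bits."""
    return [int(sum(received[i:i + 3]) >= 2) for i in range(0, len(received), 3)]

t = encode_r3([0, 0, 1, 0])          # -> [0,0,0, 0,0,0, 1,1,1, 0,0,0]
r = t[:]
r[4] ^= 1                            # noise flips one bit in the second block
assert decode_r3(r) == [0, 0, 1, 0]  # a single flip per block is corrected
```

Majority voting corrects any single flip within a block, but fails when two bits of the same block are flipped.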
Repetition Codes for the BSC

Example: On a binary symmetric channel with flip probability f, the receiver sees r = t + η, where η is a noise vector with p(ηi = 1) = f.

Example setting of η, and resulting message r:

  s   0   0   1   0   1   1   0
  t 000 000 111 000 111 111 000
  η 000 001 000 000 101 000 000
  r 000 001 111 000 010 111 000

Note that elements of η are not replicated like those of t: noise acts independently on every bit.

Beyond Repetition Codes

Goal: communication with small probability of error and high rate.
  Repetition codes introduce redundancy on a per-bit basis.
  Can we improve on this?

Idea: introduce redundancy to blocks of data instead.

Block Code: A block code is a rule for encoding a length-K sequence of source bits s into a length-N sequence of transmitted bits t.
  Introduce redundancy: N > K
  Focus on linear codes
We will introduce a simple type of block code called the (7,4) Hamming code

An Example: The (7, 4) Hamming Code

Consider K = 4, and a source message s = 1000.

The repetition code R2 produces
  t = 11000000
The (7,4) Hamming code produces
  t = 1000101

Redundancy, but not repetition.
How are these magic bits computed?


The (7,4) Hamming code: Coding

Consider K = 4, N = 7 and s = 1000.
The (7,4) Hamming code Coding
Copy the source bits into the first 4 target bits:

The (7,4) Hamming code Coding
Set parity-check bits so that the number of ones within each circle is even:
So we have s = 1000 −encoder→ t = 1000101

The (7,4) Hamming code Coding
Algebraically, we have set:

  ti = si  for i = 1, …, 4
  t5 = s1 ⊕ s2 ⊕ s3
  t6 = s2 ⊕ s3 ⊕ s4
  t7 = s1 ⊕ s3 ⊕ s4

where we use modulo-2 arithmetic
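The parity equations translate directly into code, since modulo-2 addition is XOR (a minimal sketch; the function name is mine):

```python
def hamming74_encode(s):
    """Encode 4 source bits into 7 transmitted bits ((7,4) Hamming code).

    The first 4 bits copy the source; the 3 parity bits follow the
    equations above, using XOR for modulo-2 addition.
    """
    s1, s2, s3, s4 = s
    return [s1, s2, s3, s4,
            s1 ^ s2 ^ s3,   # t5
            s2 ^ s3 ^ s4,   # t6
            s1 ^ s3 ^ s4]   # t7

assert hamming74_encode([1, 0, 0, 0]) == [1, 0, 0, 0, 1, 0, 1]  # s=1000 -> t=1000101
```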

The (7,4) Hamming code Coding
In matrix form:

  t = Gᵀs  with  Gᵀ = [ I4 ]  =  1 0 0 0
                      [ P  ]     0 1 0 0
                                 0 0 1 0
                                 0 0 0 1
                                 1 1 1 0
                                 0 1 1 1
                                 1 0 1 1

where s = [s1 s2 s3 s4]ᵀ.

G is called the Generator matrix of the code. The Hamming code is linear!

The (7,4) Hamming code: Codewords
Each (unique) sequence that can be transmitted is called a codeword.

Codeword examples:

  s    | Codeword t
  0010 | 0010111
  0110 | 0110001
  1010 | 1010010
For the (7,4) Hamming code we have a total of 16 codewords

The (7,4) Hamming code Codewords
Write Gᵀ = [G1· G2· G3· G4·], where each Gi· is a 7-dimensional bit vector (a column of Gᵀ).

Then the transmitted message is

  t = Gᵀs = s1G1· + s2G2· + s3G3· + s4G4·  (modulo 2)

so every codeword is a linear combination of the columns of Gᵀ.

The (7,4) Hamming code: Codewords
There are 2^7 possible 7-bit strings that can be received
There are 2^7 − 2^4 other bit strings that immediately imply corruption
If we see a codeword, does that imply no corruption?
Any two codewords differ in at least three bits
Each original bit belongs to at least two circles
Useful in constructing reliable decoders
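The claim that any two codewords differ in at least three bits can be checked by enumerating all 16 codewords (a sketch under the encoding equations above; names are mine):

```python
from itertools import product

def hamming74_encode(s):
    """(7,4) Hamming encoder, as defined by the parity equations above."""
    s1, s2, s3, s4 = s
    return (s1, s2, s3, s4, s1 ^ s2 ^ s3, s2 ^ s3 ^ s4, s1 ^ s3 ^ s4)

# All 2^4 = 16 codewords of the (7,4) Hamming code.
codewords = [hamming74_encode(s) for s in product([0, 1], repeat=4)]
assert len(set(codewords)) == 16

# Minimum Hamming distance between distinct codewords is 3, so any
# single-bit error leaves r closer to the true codeword than to any other.
dmin = min(sum(a != b for a, b in zip(c1, c2))
           for c1 in codewords for c2 in codewords if c1 != c2)
assert dmin == 3
```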


The (7,4) Hamming code: Decoding

We can encode a length-4 sequence s into a length-7 sequence t using 3 parity-check bits.

t can be corrupted by noise, which can flip any of the 7 bits (including the parity-check bits):

  t 1000101
  η 0100000
  r 1100101

How should we decode r?
  We could do this exhaustively using the 16 codewords.
  Assuming a BSC and uniform p(s): get the most probable explanation, i.e. find s such that ∥t(s) ⊖ r∥1 is minimal.
  We can get the most probable source vector in a more efficient way.
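The exhaustive approach can be sketched directly: try all 16 source messages and keep the one whose codeword is closest to r in Hamming distance (names are mine, not from the lecture):

```python
from itertools import product

def hamming74_encode(s):
    """(7,4) Hamming encoder from the parity equations above."""
    s1, s2, s3, s4 = s
    return (s1, s2, s3, s4, s1 ^ s2 ^ s3, s2 ^ s3 ^ s4, s1 ^ s3 ^ s4)

def decode_exhaustive(r):
    """Brute-force ML decoding on a BSC with uniform p(s): return the
    source s whose codeword t(s) is closest to r in Hamming distance."""
    return min(product([0, 1], repeat=4),
               key=lambda s: sum(a != b for a, b in zip(hamming74_encode(s), r)))

# t = 1000101 with bit 2 flipped, as in the slide above:
assert decode_exhaustive((1, 1, 0, 0, 1, 0, 1)) == (1, 0, 0, 0)
```

Syndrome decoding, introduced next, reaches the same answer without scanning all 16 codewords.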

The (7,4) Hamming code: Decoding Example 1

We have s = 1000 −encoder→ t = 1000101 −noise→ r = 1100101:

(1) Detect circles with wrong (odd) parity. What bit is responsible for this?
(2) Detect the culprit bit and flip it.

The decoded sequence is sˆ = 1000

The (7,4) Hamming code: Decoding Example 2

We have s = 1000 −encoder→ t = 1000101 −noise→ r = 1000001:

(1) Detect circles with wrong (odd) parity. What bit is responsible for this?
(2) Detect the culprit bit and flip it.

The decoded sequence is sˆ = 1000

The (7,4) Hamming code: Decoding Example 3

We have s = 1000 −encoder→ t = 1000101 −noise→ r = 1010101:

(1) Detect circles with wrong (odd) parity. What bit is responsible for this?
(2) Detect the culprit bit and flip it.

The decoded sequence is sˆ = 1000


The (7,4) Hamming code: Optimal Decoding Algorithm: Syndrome Decoding

Given r = r1, …, r7, assume a BSC with small noise level f:

1. Define the syndrome as the length-3 vector z that describes the pattern of violations of the parity bits r5, r6, r7.
   - zi = 1 when the ith parity bit does not match the parity of r
   - Flipping a single bit leads to a different syndrome
2. Check parity bits r5, r6, r7 and identify the syndrome.
3. Unflip the single bit responsible for this pattern of violations.
   - This syndrome could have been caused by other noise patterns

  z        | 000  | 001 | 010 | 011 | 100 | 101 | 110 | 111
  Flip bit | none | r7  | r6  | r4  | r5  | r1  | r2  | r3

The optimal decoding algorithm unflips at most one bit.
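The three steps can be sketched as follows (a minimal sketch; FLIP encodes the syndrome table above, and the function names are mine):

```python
# Syndrome table: z -> which bit (1-indexed) to unflip, None for no flip.
FLIP = {(0, 0, 0): None, (0, 0, 1): 7, (0, 1, 0): 6, (0, 1, 1): 4,
        (1, 0, 0): 5, (1, 0, 1): 1, (1, 1, 0): 2, (1, 1, 1): 3}

def syndrome(r):
    """Compare each received parity bit with the parity it should check."""
    z1 = r[0] ^ r[1] ^ r[2] ^ r[4]
    z2 = r[1] ^ r[2] ^ r[3] ^ r[5]
    z3 = r[0] ^ r[2] ^ r[3] ^ r[6]
    return (z1, z2, z3)

def syndrome_decode(r):
    """Unflip at most one bit, then read off the first 4 bits as s-hat."""
    r = list(r)
    bit = FLIP[syndrome(r)]
    if bit is not None:
        r[bit - 1] ^= 1          # unflip the culprit bit
    return r[:4]

assert syndrome_decode([1, 1, 0, 0, 1, 0, 1]) == [1, 0, 0, 0]  # Example 1
assert syndrome_decode([1, 0, 0, 0, 0, 0, 1]) == [1, 0, 0, 0]  # Example 2
assert syndrome_decode([1, 0, 1, 0, 1, 0, 1]) == [1, 0, 0, 0]  # Example 3
```

This needs only 3 parity checks and one table lookup, versus 16 distance computations for exhaustive decoding.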

The (7,4) Hamming code: Syndrome Decoding: Matrix Form

Recall that we just need to compare the expected parity bits with the actual ones we received:

  z1 = r1 ⊕ r2 ⊕ r3 ⊖ r5
  z2 = r2 ⊕ r3 ⊕ r4 ⊖ r6
  z3 = r1 ⊕ r3 ⊕ r4 ⊖ r7

but in modulo-2 arithmetic −1 ≡ 1, so we can replace ⊖ with ⊕. Thus

  z = Hr  with  H = [ P  I3 ] = 1 1 1 0 1 0 0
                                0 1 1 1 0 1 0
                                1 0 1 1 0 0 1

Homework: What is the syndrome for a codeword?
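The matrix form z = Hr (mod 2) can be sketched with plain Python lists, without spoiling the homework (the corrupted r below is from Example 1; the function name is mine):

```python
# Parity-check matrix H = [P | I3] of the (7,4) Hamming code.
H = [[1, 1, 1, 0, 1, 0, 0],
     [0, 1, 1, 1, 0, 1, 0],
     [1, 0, 1, 1, 0, 0, 1]]

def syndrome_matrix(r):
    """Compute z = H r, with all arithmetic modulo 2."""
    return [sum(h * ri for h, ri in zip(row, r)) % 2 for row in H]

# For the corrupted r = 1100101 from Example 1 the syndrome is 110,
# which the decoding table maps to flipping r2.
assert syndrome_matrix([1, 1, 0, 0, 1, 0, 1]) == [1, 1, 0]
```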


The (7,4) Hamming code:
Optimal Decoding Algorithm: Syndrome Decoding
When the noise level f on the BSC is small, it is likely that we see at most a single bit flip in a block of 7 transmitted bits.

The syndrome decoding method exactly recovers the source message in this case.

c.f. noise flipping one bit in the repetition code R3. But what happens if the noise flips more than one bit?

The (7,4) Hamming code: Decoding Example 4: Flipping 2 Bits

We have s = 1000 −encoder→ t = 1000101 −noise→ r = 1010100:

(1) Detect circles with wrong (odd) parity. What bit is responsible for this?
(2) Detect the culprit bit and flip it.

The decoded sequence is sˆ = 1110
  We have made 3 errors in the decoded 7-bit block, but only 2 involve the actual message

The (7,4) Hamming code: Decoding Exercises
[Mackay, Ex 1.5]: Decode the following sequences using the syndrome decoding for the (7,4) Hamming code:
(a) r = 1101011 → sˆ = ?
(b) r = 0110110 → sˆ = ?
(c) r = 0100111 → sˆ = ?
(d) r = 1111111 → sˆ = ?
Work out the answers on your own.

The (7,4) Hamming code: Solution

For each exercise we simply compute the syndrome and use the optimal decoding algorithm (table above) to determine which bit we should unflip.

(a) r = 1101011 → z1 = r1 ⊕ r2 ⊕ r3 ⊕ r5 = 0, z2 = r2 ⊕ r3 ⊕ r4 ⊕ r6 = 1, z3 = r1 ⊕ r3 ⊕ r4 ⊕ r7 = 1. Therefore z = 011, we unflip r4, sˆ = 1100
(b) r = 0110110 → z = 111, we unflip r3, sˆ = 0100
(c) r = 0100111 → z = 001, we unflip r7, sˆ = 0100
(d) r = 1111111 → z = 000, we don't unflip any bit, sˆ = 1111
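The four answers can be verified mechanically with the syndrome decoder (a self-contained sketch; the string-based interface is my own convenience):

```python
# Syndrome table from the decoding slide: z -> bit to unflip (1-indexed).
FLIP = {(0, 0, 0): None, (0, 0, 1): 7, (0, 1, 0): 6, (0, 1, 1): 4,
        (1, 0, 0): 5, (1, 0, 1): 1, (1, 1, 0): 2, (1, 1, 1): 3}

def syndrome_decode(r):
    """Decode a 7-character bit string; return the 4 decoded source bits."""
    r = [int(b) for b in r]
    z = (r[0] ^ r[1] ^ r[2] ^ r[4],
         r[1] ^ r[2] ^ r[3] ^ r[5],
         r[0] ^ r[2] ^ r[3] ^ r[6])
    bit = FLIP[z]
    if bit is not None:
        r[bit - 1] ^= 1
    return ''.join(map(str, r[:4]))

assert syndrome_decode('1101011') == '1100'  # (a)
assert syndrome_decode('0110110') == '0100'  # (b)
assert syndrome_decode('0100111') == '0100'  # (c)
assert syndrome_decode('1111111') == '1111'  # (d)
```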

The (7,4) Hamming code: Zero-Syndrome Noise Vectors
[Mackay, Ex 1.7] Find some noise vectors that give the all-zero syndrome (so that the optimal decoding algorithm will not correct them). How many of these vectors are there?

By definition, the all-zero syndrome means that the corresponding noise components must cancel out. For example, for the first component we have:

  z1 = r1 ⊕ r2 ⊕ r3 ⊕ r5 = t1 ⊕ t2 ⊕ t3 ⊕ t5 ⊕ η1 ⊕ η2 ⊕ η3 ⊕ η5.

But ti = si for i = 1, …, 4 and t5 = s1 ⊕ s2 ⊕ s3. Therefore

  z1 = 2s1 ⊕ 2s2 ⊕ 2s3 ⊕ η1 ⊕ η2 ⊕ η3 ⊕ η5 = η1 ⊕ η2 ⊕ η3 ⊕ η5.

Thus, we have:

  z1 = η1 ⊕ η2 ⊕ η3 ⊕ η5 = 0
  z2 = η2 ⊕ η3 ⊕ η4 ⊕ η6 = 0
  z3 = η1 ⊕ η3 ⊕ η4 ⊕ η7 = 0

which is equivalent to:

  η5 = η1 ⊕ η2 ⊕ η3
  η6 = η2 ⊕ η3 ⊕ η4
  η7 = η1 ⊕ η3 ⊕ η4

Solution (cont.)

η1, η2, η3 can be set arbitrarily (2³ = 8 possibilities), and these determine η5. For each such choice there are two possibilities for η4, which determines η6. Now all the variables are set, and η7 is fully determined by their values.

This is equivalent to having arbitrary settings for η1, η2, η3, η4, which gives 8 × 2 × 1 = 16 possible noise vectors that yield the all-zero syndrome. These exactly correspond to the 16 codewords of the (7,4) Hamming code.

The trivial noise vectors among them are η = 0000000 and η = 1111111.
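The counting argument can be confirmed by brute force: enumerate all 2^7 noise vectors, keep those with zero syndrome, and compare against the codeword set (a sketch; names are mine):

```python
from itertools import product

def syndrome(n):
    return (n[0] ^ n[1] ^ n[2] ^ n[4],
            n[1] ^ n[2] ^ n[3] ^ n[5],
            n[0] ^ n[2] ^ n[3] ^ n[6])

def encode(s):
    s1, s2, s3, s4 = s
    return (s1, s2, s3, s4, s1 ^ s2 ^ s3, s2 ^ s3 ^ s4, s1 ^ s3 ^ s4)

# Noise vectors with all-zero syndrome ...
zero_syndrome = {n for n in product([0, 1], repeat=7) if syndrome(n) == (0, 0, 0)}
assert len(zero_syndrome) == 16

# ... are exactly the 16 codewords of the (7,4) Hamming code,
# including the trivial vectors 0000000 and 1111111.
codewords = {encode(s) for s in product([0, 1], repeat=4)}
assert zero_syndrome == codewords
assert (0,) * 7 in zero_syndrome and (1,) * 7 in zero_syndrome
```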


The (7,4) Hamming code: Error Probabilities

Decoding Error: occurs if at least one of the decoded bits sˆi does not match the corresponding source bit si, for i = 1, …, 4.

  p(Block Error): pB = p(sˆ ≠ s)
  p(Bit Error):   pb = (1/K) Σk p(sˆk ≠ sk)
  Rate:           R = K/N = 4/7

What is the probability of block error for the (7,4) Hamming code with f = 0.1?

The (7,4) Hamming code: Leading-Term Error Probabilities

Block Error: this occurs when 2 or more bits in the block of 7 are flipped. We can approximate pB by its leading term:

  pB = Σ_{m=2}^{7} (7 choose m) f^m (1 − f)^(7−m)
     ≈ (7 choose 2) f² = 21 f²
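The sum and its leading-term approximation can be compared numerically for the f = 0.1 question above (a sketch using the standard library only):

```python
from math import comb

f = 0.1
# Exact probability that 2 or more of the 7 transmitted bits are flipped:
p_block = sum(comb(7, m) * f**m * (1 - f)**(7 - m) for m in range(2, 8))
# Leading-term approximation 21 f^2:
approx = comb(7, 2) * f**2
# At f = 0.1, p_block is about 0.15 while 21 f^2 = 0.21: the leading
# term is only a rough guide at this noise level, and improves as f shrinks.
```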

The (7,4) Hamming code: Leading-Term Error Probabilities

Bit Error: given that a block error occurs, the noise must corrupt 2 or more bits. The most probable case is when the noise corrupts exactly 2 bits, which induces 3 errors in the decoded vector:

  p(sˆi ≠ si) ≈ (3/7) pB  for i = 1, …, 7

All bits are equally likely to be corrupted (due to symmetry), so

  pb ≈ (3/7) pB ≈ 9 f²
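As a sanity check, a small Monte Carlo simulation (my own sketch, not from the lecture) estimates pb at f = 0.1; the leading-term prediction 9 f² = 0.09 turns out to be in the right ballpark:

```python
import random

# Syndrome table and encoder as defined earlier in the lecture.
FLIP = {(0, 0, 0): None, (0, 0, 1): 7, (0, 1, 0): 6, (0, 1, 1): 4,
        (1, 0, 0): 5, (1, 0, 1): 1, (1, 1, 0): 2, (1, 1, 1): 3}

def encode(s):
    s1, s2, s3, s4 = s
    return [s1, s2, s3, s4, s1 ^ s2 ^ s3, s2 ^ s3 ^ s4, s1 ^ s3 ^ s4]

def decode(r):
    r = list(r)
    z = (r[0] ^ r[1] ^ r[2] ^ r[4],
         r[1] ^ r[2] ^ r[3] ^ r[5],
         r[0] ^ r[2] ^ r[3] ^ r[6])
    if FLIP[z] is not None:
        r[FLIP[z] - 1] ^= 1
    return r[:4]

random.seed(0)
f, trials = 0.1, 100_000
bit_errors = 0
for _ in range(trials):
    s = [random.randint(0, 1) for _ in range(4)]
    r = [t ^ (random.random() < f) for t in encode(s)]   # BSC noise
    bit_errors += sum(a != b for a, b in zip(decode(r), s))
pb = bit_errors / (4 * trials)   # empirical bit-error rate, roughly 0.07
```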

What Can Be Achieved with Hamming Codes?

[Figure: pb versus Rate for the repetition codes R1, R5, the Hamming code H(7,4), and the BCH codes BCH(15,7), BCH(31,16), BCH(511,76), BCH(1023,101); more useful codes lie toward low pb and high rate.]

H(7,4) improves pb at a moderate rate R = 4/7. BCH codes are a generalization of Hamming codes.


Coding: Review

Communication over a Noisy Channel: The Big Picture (MacKay, 9.1)

[Figure: source coding wraps the channel in a compressor and decompressor; channel coding wraps the noisy channel itself.]

In Chapters 4-6, we discussed source coding with block codes, symbol codes and stream codes. We implicitly assumed that the channel from the compressor to the decompressor was noise-free. Real channels are noisy. We will now spend two chapters on the subject of noisy-channel coding: the fundamental possibilities and limitations of error-free communication through a noisy channel. The aim of channel coding is to make the noisy channel behave like a noiseless channel. We will assume that the data to be transmitted has been through a good compressor, so the bit stream has no obvious redundancy. The channel code, which makes the transmission, will put back redundancy of a special sort, designed to make the noisy received signal decodeable.

  Source Coding (for Compression) | Channel Coding (for Reliability)
  Shrink sequences                | Protect sequences
  Identify and remove redundancy  | Add known form of redundancy
  Size limited by entropy         | Rate limited by capacity
