
Australian National University
COMP2610/6261 Information Theory Assignment 2
Instructions
Submit a paper copy of your solution to the assignment box by 5pm on 19 October 2018. This is an individual assignment. Make sure your name, student ID number, and tutor group are clearly written on the first page of your submission.


The questions for students enrolled in COMP2610 or COMP6261 are identical, but the mark weighting differs (the notation [n marks / m marks] means n marks for students enrolled in COMP2610 and m marks for students enrolled in COMP6261).

Arithmetic and Stream Coding [40 marks / 40 marks]

1. Consider the ensemble X with alphabet A = {a, b, c, d} and uniform probabilities p = (1/4, 1/4, 1/4, 1/4).
(a) Compute the cumulative distribution function F for this ensemble and the binary interval corresponding to each of the symbols in A.
(b) Construct the Shannon-Fano-Elias code for X and encode the message abab.
(c) How does the length of a Huffman codeword for the same ensemble and message compare with the one you just determined? (A sketch for checking parts (a) and (b) numerically follows.)
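
The sketch below is illustrative only, not part of the assessed answer: it mirrors the Shannon-Fano-Elias construction from parts (a) and (b), assuming the standard codeword length ceil(log2(1/p(x))) + 1 bits; the helper name sfe_code is invented here.

    from math import ceil, log2

    def sfe_code(probs):
        # Shannon-Fano-Elias: the codeword for symbol x is the first
        # ceil(log2(1/p(x))) + 1 bits of Fbar(x) = F(x-) + p(x)/2.
        codes, F = {}, 0.0
        for sym, p in probs.items():
            Fbar, length = F + p / 2, ceil(log2(1 / p)) + 1
            bits, frac = "", Fbar
            for _ in range(length):          # binary expansion, truncated
                frac *= 2
                bit, frac = divmod(frac, 1)
                bits += str(int(bit))
            codes[sym] = bits
            F += p
        return codes

    codes = sfe_code({s: 0.25 for s in "abcd"})
    print(codes)                                  # three bits per symbol here
    print("".join(codes[s] for s in "abab"))      # the encoded message abab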
2. Consider the Dirichlet multinomial model with m = (1, 1, 1, 1) for the alphabet A = {a, b, c, d}. Compute the following probabilities under this model (a checking sketch follows the list):
(a) P(x = b)
(b) P(x = a | baaa)
(c) P(x = b | ac)
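
A quick way to check these three values by machine (a hint only; predictive is an invented name): under a Dirichlet-multinomial model with a symmetric parameter m, the predictive probability of a symbol is (count + m) / (n + m · |A|).

    from collections import Counter

    def predictive(symbol, context, alphabet="abcd", m=1.0):
        # Dirichlet-multinomial predictive probability with the same
        # parameter m for every symbol of the alphabet:
        # P(x = symbol | context) = (count(symbol) + m) / (n + m * |A|)
        counts = Counter(context)
        return (counts[symbol] + m) / (len(context) + m * len(alphabet))

    print(predictive("b", ""))      # part (a)
    print(predictive("a", "baaa"))  # part (b)
    print(predictive("b", "ac"))    # part (c)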
3. Using an arithmetic coder with the above Dirichlet multinomial model, code the sequence acb, showing the intervals and probabilities at each step.
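
A minimal sketch of the interval bookkeeping this part asks for, assuming the coder subdivides each interval in alphabet order (a, b, c, d) and updates the counts after each symbol; interval_trace is an illustrative name.

    from collections import Counter

    def interval_trace(message, alphabet="abcd", m=1.0):
        lo, hi = 0.0, 1.0          # current arithmetic-coding interval
        counts = Counter()
        for i, sym in enumerate(message):
            n, width = sum(counts.values()), hi - lo
            for s in alphabet:     # split the interval in alphabet order
                p = (counts[s] + m) / (n + m * len(alphabet))
                if s == sym:       # descend into this symbol's slice
                    hi = lo + width * p
                    break
                lo += width * p
            counts[sym] += 1       # adapt the model to the coded symbol
            print(f"step {i + 1}: '{sym}' p = {p:.4f} -> [{lo:.6f}, {hi:.6f})")

    interval_trace("acb")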
4. Consider an ensemble X with a uniform probability distribution over an alphabet comprising the capital letters A through Z, a space character, and the punctuation characters '.', ',', '?', '!', and '"'. Suppose we wanted to compress the message x = AAA . . . A consisting of 1000 repetitions of the character A.
Estimate (or determine exactly) how many bits would be required to code the message x using:
(a) A Huffman code for X.
(b) An adaptive arithmetic code for X with a Dirichlet multinomial model with:
i. uniform model parameters m = (1, 1, . . . , 1);
ii. model parameters m = (1000, 1, 1, . . . , 1).
(A sketch for estimating both counts follows.)
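
A rough, hedged estimate of both quantities (the "+ 2" bits are the usual terminal overhead of an arithmetic coder, and arithmetic_bits is an invented helper): with 32 equiprobable symbols a Huffman code assigns exactly 5 bits to every symbol, while the adaptive coder needs about -log2 P(x) bits in total.

    from math import ceil, log2

    ALPHABET_SIZE = 32   # A-Z, space, and the five punctuation characters

    def arithmetic_bits(n, m_sym, m_total):
        # Total code length for n repeats of one symbol under an adaptive
        # Dirichlet-multinomial model: roughly -log2 P(message) + 2 bits,
        # where P(message) is the product of the predictive probabilities.
        log2p = sum(log2((i + m_sym) / (i + m_total)) for i in range(n))
        return ceil(-log2p) + 2

    print(ceil(log2(ALPHABET_SIZE)) * 1000)        # (a) Huffman: 5 bits/symbol
    print(arithmetic_bits(1000, 1.0, 32.0))        # (b) i.  m = (1, 1, ..., 1)
    print(arithmetic_bits(1000, 1000.0, 1031.0))   # (b) ii. m = (1000, 1, ..., 1)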
Noisy-Channel Coding [40 marks / 20 marks]
Let X = {a, b} and Y = {a, b} be the input and output alphabets for the following two channels:

Q = [ 1   0.5 ]        Q′ = [ 0.5   0 ]
    [ 0   0.5 ]             [ 0.5   1 ]

(a) Describe the behaviour of Q and Q′ when each of the symbols a and b is used as input.
(b) For an arbitrary distribution p = (1 − p, p) with p ∈ [0, 1] over X, write down the following expressions for Q:
i. The probabilities P(y = a) and P(y = b) in terms of p.
ii. The entropy H(Y) in terms of the probability p = P(x = b).
iii. The mutual information I(X; Y) in terms of p.
(c) Using the previous results, or some other argument, determine the channel capacity of Q. (A numerical check is sketched after part (f).)
(d) Argue why the channel Q′ must have the same capacity as Q.
(e) Suppose you used the channels Q and Q′ to send messages by first flipping a fair coin and sending a symbol through Q if it landed heads and through Q′ if it landed tails.
i. Construct a matrix Q∗ that represents the channel defined by this process.
ii. Determine the capacity of this new channel Q∗. (Note: you do not have to derive this from the definition of capacity; you can appeal to a result discussed in lectures or the textbook.)
(f) Describe a way of using the channels Q and Q′ to communicate without error.
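
A minimal numerical cross-check for parts (b) and (c), a sketch only: it assumes, as in lectures, that each column of Q gives P(y | x) for one input symbol (a then b), and mutual_information is an invented name.

    from math import log2

    def H2(q):
        # binary entropy in bits
        return 0.0 if q in (0.0, 1.0) else -q * log2(q) - (1 - q) * log2(1 - q)

    def mutual_information(p):
        # Input distribution (1 - p, p) with p = P(x = b).  Under Q above,
        # P(y = b) = p/2 and H(Y | X) = (1 - p) * 0 + p * H2(1/2) = p, so
        # I(X; Y) = H(Y) - H(Y | X) = H2(p/2) - p.
        return H2(p / 2) - p

    # crude grid search for the capacity-achieving p = P(x = b)
    best_I, best_p = max((mutual_information(k / 10000), k / 10000)
                         for k in range(10001))
    print(best_I, best_p)   # compare with your analytic answer to (c)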
Combining Noisy Channels [20 marks / 40 marks]

1. You are put in charge of optimising communications over a complex noisy channel Q that takes as input one of 1,000 symbols from X and outputs symbols from Y, which also contains 1,000 symbols. A salesman calls you up offering a function f : Y → Y that can be added to Q to form a new channel Q′ with increased capacity, by mapping each y that Q outputs to y′ = f(y).
(a) By appealing to the definition of channel capacity and an inequality from the lectures, explain why the salesperson must be wrong.
(b) The salesperson responds by saying, “Oh! Did I say the function was f : Y → Y? I meant f : X → X.” Appealing to the definition of capacity again, carefully explain why using such a function to pre-process symbols before sending them through the channel Q will also not increase its capacity.
2. Suppose we have two arbitrary noisy channels Q and Q′, where channel Q has inputs X = {a1, . . . , aN} and outputs Y = {b1, . . . , bM}, while channel Q′ has inputs X′ = {a′1, . . . , a′N′} and outputs Y′ = {b′1, . . . , b′M′}. We will assume that the input alphabets of the two channels share no common symbols, and likewise for the output alphabets; that is, X ∩ X′ = ∅ and Y ∩ Y′ = ∅.
The product channel Q ⊗ Q′ uses Q and Q′ to send and receive pairs of symbols in parallel: its input alphabet is the product alphabet V = X × X′ = {(a1, a′1), (a1, a′2), . . . , (aN, a′N′)} (with N × N′ elements) and its output alphabet is W = Y × Y′ = {(b1, b′1), . . . , (bM, b′M′)} (with M × M′ elements). When P(y = b | x = a) and P(y′ = b′ | x′ = a′) are the transition probabilities in Q and Q′ respectively, the transition probabilities in Q ⊗ Q′ are given by their product; that is,

P(w = (b, b′) | v = (a, a′)) = P(y = b | x = a) · P(y′ = b′ | x′ = a′)

for all (a, a′) ∈ X × X′ and (b, b′) ∈ Y × Y′.
The sum channel Q ⊕ Q′ has input alphabet V = X ∪ X′ and output alphabet
W = Y ∪ Y′ and sends symbols from X through Q and symbols from X ′ through Q′. The transition probabilities for Q ⊕ Q′ are therefore
P(y=b|x=a) ifw=b∈Y andv=a∈X P(w|v)=P(y′ =b′|x′ =a′) ifw=b′ ∈Y′ andv=a′ ∈X′
0 otherwise.
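
A small sketch of both constructions (illustrative only, with columns indexed by inputs and rows by outputs), demonstrated on the matrices from part (a) below so a hand computation can be checked:

    import numpy as np

    # Convention, as in lectures: columns are inputs and rows are outputs,
    # so Q[j, i] = P(y = b_j | x = a_i).

    def product_channel(Q, Qp):
        # Pairs are sent in parallel and the transition probabilities
        # multiply, which is exactly the Kronecker product of the matrices.
        return np.kron(Q, Qp)

    def sum_channel(Q, Qp):
        # Disjoint alphabets: each symbol goes through "its own" channel,
        # so the transition matrix is block diagonal.
        M, N = Q.shape
        Mp, Np = Qp.shape
        W = np.zeros((M + Mp, N + Np))
        W[:M, :N] = Q
        W[M:, N:] = Qp
        return W

    Q  = np.array([[1.0, 0.25], [0.0, 0.75]])   # channels from part (a)
    Qp = np.array([[1/3, 2/3], [2/3, 1/3]])
    print(sum_channel(Q, Qp))
    print(product_channel(Q, Qp))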
(a) For input alphabets X = {a, b} and X′ = {c, d}, and output alphabets Y = {1, 2} and Y′ = {3, 4}, compute the sum and product channels for

Q = [ 1   1/4 ]        Q′ = [ 1/3   2/3 ]
    [ 0   3/4 ]             [ 2/3   1/3 ]

That is, write down the input and output alphabets and transition matrices for both the sum channel Q ⊕ Q′ and the product channel Q ⊗ Q′.
(b) Prove that, for arbitrary channels Q and Q′, the capacity of Q ⊗ Q′ is the sum of the capacities of Q and Q′; that is, show

C(Q ⊗ Q′) = C(Q) + C(Q′).
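
One standard route, sketched here as a hint only (the identity below assumes the pair of inputs is chosen independently; a full proof must also argue that the optimum has this form):

    \begin{align*}
      I(V;W) &= H(W) - H(W \mid V) \\
             &= H(Y) + H(Y') - H(Y \mid X) - H(Y' \mid X') \\
             &= I(X;Y) + I(X';Y').
    \end{align*}
    % H(W | V) always splits because the two channels act independently;
    % for the converse, use H(W) <= H(Y) + H(Y') for any input distribution.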
(c) (Challenging!) Prove that for arbitrary channels Q and Q′,

C(Q ⊕ Q′) = log2(2^C + 2^C′),

where C = C(Q) and C′ = C(Q′) are the capacities of Q and Q′, respectively.
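
A hint for the achievability direction (a sketch under the assumption that the input falls in X with probability λ; because the output alphabets are disjoint, the output reveals which channel was used):

    \[
      I(V;W) = H_2(\lambda) + \lambda\, I(X;Y) + (1-\lambda)\, I(X';Y').
    \]
    % Optimising the per-channel input distributions and then choosing
    % \lambda^* = 2^C / (2^C + 2^{C'}) gives exactly \log_2(2^C + 2^{C'}).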
