
AUSTRALIAN NATIONAL UNIVERSITY COMP2610/COMP6261
Information Theory, Semester 2 2022
Assignment 1
Due Date: Monday 29 August 2022, 5:00 pm. Assignment 1 is worth 10% of the course mark.

Instructions:
1. The mark for each question is indicated next to the question. For questions where you are asked to prove results, if you cannot prove an earlier part, you may still attempt subsequent parts of the question assuming the truth of the earlier part.
2. COMP2610 students:
• Answer Questions 1-3 and Section I of Question 4. You will be marked out of 100.
• You are not expected to answer Question 5 and Section II of Question 4.
3. COMP6261 students:
• Answer Questions 2-5. You will be marked out of 100.
• You are not expected to answer Question 1.
Submission:
1. Submit your assignment together with a cover page as a single PDF on Wattle.
2. Clearly mention whether you are a COMP2610 student or COMP6261 student in the cover page.
3. The deadline for the assignment is absolute. Submissions after the deadline receive zero marks (100% penalty), unless you are ill (in which case you will need to present a doctor’s certificate) or have undergone severe trauma of some kind.
4. All assignments must be done individually. Plagiarism is a university offence and will be dealt with according to university procedures: http://academichonesty.anu.edu.au/UniPolicy.html.

Question 1 [25 marks total]
**Only COMP2610 students are expected to attempt this question.
Suppose only two ANU residences, B (Burton from Garran Hall) and J (John from ), compete in a series of 5 races. The series plays out as follows:
• Each race has exactly one winner (i.e. residences cannot tie).
• In every race, both residences have an equal probability of winning.
• Once one residence wins 3 races, no further races are conducted.
Let 𝑋 denote the random variable whose outcomes X are all possible strings representing the winners of the races, i.e., X = {𝐵𝐵𝐵, 𝐽𝐽𝐽, 𝐵𝐽𝐵𝐵, . . . , 𝐵𝐽𝐵𝐽𝐵}. Let 𝑌 denote a random variable representing the total number of races conducted in the series. Calculate each of the following, showing all your working.
(a) The information in seeing the sequence 𝐵𝐽𝐵𝐽𝐵. That is, compute h(𝑋 = 𝐵𝐽𝐵𝐽𝐵).
(b) The probability of Burton winning the series, if winning three races represents winning the series.
(e) 𝐻(𝑋|𝑌 = 4)
(f) 𝐻(𝑋|𝑌)
(g) 𝐻(𝑌|𝑋). Provide an intuitive explanation of your answer.
(h) Ordinarily, the winner of the series is simply the residence that wins the last race. However, if the residence who wins the last race in the series is different to the one who wins the race before the last race (i.e., the second last race), then officials are suspicious of some form of race fixing. In such a series, there is a 50% chance of officials declaring the series invalid, and neither residence is declared winner; the other 50% of the time, the officials decide the series is legitimate, and declare the winner as usual. Let 𝑊 be a random variable denoting the identity of the winning residence, with possible outcomes W = {𝐴, 𝐵, 𝑁}, for the winner being respectively team A, team B, or neither. Compute 𝐻(𝑊).
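For intuition, the outcome space X and its probabilities can be sanity-checked with a short enumeration. The sketch below is illustrative only (the function and variable names are my own, not part of the required working): it builds every valid series string and its probability, given that each race is an independent fair coin.

```python
import math

def series_outcomes():
    """Enumerate best-of-5 outcome strings; the series stops once a side wins 3 races."""
    outcomes = {}

    def extend(seq, b_wins, j_wins):
        if b_wins == 3 or j_wins == 3:
            outcomes[seq] = 0.5 ** len(seq)  # every race is an independent fair 50/50
            return
        extend(seq + "B", b_wins + 1, j_wins)
        extend(seq + "J", b_wins, j_wins + 1)

    extend("", 0, 0)
    return outcomes

probs = series_outcomes()
print(len(probs))                   # → 20 distinct series
print(sum(probs.values()))          # → 1.0
print(-math.log2(probs["BJBJB"]))   # → 5.0 bits, i.e. h(X = BJBJB)
```

Enumerating like this also gives the distribution of the series length 𝑌 (group the outcomes by string length), which is useful for checking the conditional-entropy parts.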

Question 2 [25 marks total]
**All students are expected to attempt this question.
(a) Let 𝑋 be a random variable with 𝑚 different outcomes X = {1, 2, · · · , 𝑚} and corresponding nonzero probabilities {𝑞1, 𝑞2, · · · , 𝑞𝑚} arranged in decreasing order. Now define a new random variable 𝑌 with 𝑚 + 1 different outcomes and corresponding probabilities {𝑞1/5, 𝑞2/5, · · · , 𝑞𝑚−1/5, 𝑞𝑚/5, 4/5}. Find the entropy 𝐻(𝑌) in terms of 𝐻(𝑋).
(b) Consider a fair coin flip. What is the mutual information between the top side and the
bottom side of the coin?
(c) A 6-sided fair die is rolled. What is the mutual information between the top side and the front face (the side most facing you)?
(d) Show that the entropy of the probability distribution (𝑝1, · · · , 𝑝𝑖, · · · , 𝑝𝑗, · · · , 𝑝𝑚) is less than the entropy of the distribution (𝑝1, · · · , (𝑝𝑖 + 𝑝𝑗)/2, · · · , (𝑝𝑖 + 𝑝𝑗)/2, · · · , 𝑝𝑚).
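A numerical illustration of part (d): replacing two probabilities by their common average increases entropy. This is a minimal sketch with a made-up example distribution (the helper `H` and the numbers are my own, not from the question):

```python
import math

def H(ps):
    """Shannon entropy in bits; zero-probability terms contribute nothing."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

p = [0.5, 0.3, 0.2]
q = [0.5, 0.25, 0.25]   # p_2 and p_3 replaced by their average
print(H(p) < H(q))       # → True
```

The proof itself should argue this in general, e.g. via concavity of entropy or the fact that averaging is a doubly-stochastic mixing operation; the code only checks one instance.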
Question 3 [35 marks total]
**All students are expected to attempt this question.
A standard deck of cards contains 4 suits — ♥,♦,♣,♠ (“hearts”, “diamonds”, “clubs”, “spades”) — each with 12 values — 2,3,4,5,6,7,8,9,10,J,Q,K (The 𝐽,𝑄,𝐾 are called “Jack”, “Queen”, “King”). Each card has a colour: hearts and diamonds are coloured red; clubs and spades are black. Cards with values 10, J, Q, K are called face cards.
Each of the 48 cards in a deck is identified by its value 𝑉 and suit 𝑆 and denoted 𝑉𝑆. For example, 2♥, J♣, and 7♠ are the “two of hearts”, “Jack of clubs”, and “7 of spades”, respectively. The variable 𝐶 will be used to denote a card’s colour. Let 𝑓 = 1 if a card is a face card and 𝑓 = 0 otherwise.
1. A card is drawn at random from a thoroughly shuffled deck. Calculate:
(a) The information in observing a red non-face card, i.e., h(𝐶 = red, 𝑓 = 0).
(b) The conditional information in observing a King given a face card was drawn, i.e.,
h(𝑉 = 𝐾| 𝑓 = 1).
(c) The entropies 𝐻(𝑆) and 𝐻(𝑉,𝑆).
(d) The mutual information between 𝑉 and 𝑆, i.e., 𝐼(𝑉; 𝑆).
(e) The mutual information between the value and colour of a card, i.e., 𝐼(𝑉;𝐶).
2. Now consider that 16 cards are removed from a standard deck: All 12 ♠s; the 2♥, 3♥, 4♥, and 5♥.
(a) Calculate the entropies 𝐻(𝑆) and 𝐻(𝑉,𝑆). HINT: Express 𝐻(𝑉,𝑆) in terms of 𝐻(𝑉|𝑆).
(b) Calculate 𝐼(𝑉; 𝑆). Explain why it is different to the 𝐼(𝑉; 𝑆) when a card is drawn at random from a standard deck of 48 cards (i.e. prior to the removal of 16 cards).
(c) Calculate 𝐼(𝑉; 𝑆|𝐶).
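Quantities like 𝐼(𝑉; 𝑆) can be checked mechanically from a joint distribution table once you have worked it out by hand. A generic sketch (the function and the toy tables are my own; it deliberately does not encode the card probabilities for you):

```python
import math
from collections import defaultdict

def mutual_information(joint):
    """I(U;V) in bits from a dict {(u, v): probability}; entries must sum to 1."""
    pu, pv = defaultdict(float), defaultdict(float)
    for (u, v), p in joint.items():
        pu[u] += p
        pv[v] += p
    return sum(p * math.log2(p / (pu[u] * pv[v]))
               for (u, v), p in joint.items() if p > 0)

# Sanity checks: an independent table gives I = 0; a perfectly correlated one gives 1 bit.
indep = {(u, v): 0.25 for u in "ab" for v in "xy"}
corr = {("a", "x"): 0.5, ("b", "y"): 0.5}
print(mutual_information(indep), mutual_information(corr))  # → 0.0 1.0
```

For part 2, build the joint table over the remaining 32 cards (uniform over the cards that survive removal) and feed it to the same function.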

Question 4
Let 𝑌 be a random variable with possible outcomes 0, 1, and 𝑝(𝑌 = 1) = 1/2. Let 𝑋 be a random variable with possible outcomes X = {𝑎, 𝑏, 𝑐}. Define
p = (𝑝(𝑋 = 𝑎|𝑌 = 1), 𝑝(𝑋 = 𝑏|𝑌 = 1), 𝑝(𝑋 = 𝑐|𝑌 = 1))
q = (𝑝(𝑋 = 𝑎|𝑌 = 0), 𝑝(𝑋 = 𝑏|𝑌 = 0), 𝑝(𝑋 = 𝑐|𝑌 = 0)).
Suppose that
p = (3/5, 1/5, 1/5)
q = (1/5, 3/5, 1/5).
Section I [15 marks total]
**All students are expected to attempt this section.
(a) Using the definition of mutual information, show that for any choice of p, q,
𝐼(𝑋;𝑌) = (1/2)𝐷KL(p||m) + (1/2)𝐷KL(q||m),
where m = (p + q)/2.
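The identity in part (a) can be verified numerically for the given p and q; the ½ weights come from 𝑝(𝑌 = 1) = 1/2, which also makes m the marginal of 𝑋. A sketch for sanity-checking only (helper names are my own):

```python
import math

def kl(p, q):
    """D_KL(p || q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [3/5, 1/5, 1/5]   # p(X = . | Y = 1)
q = [1/5, 3/5, 1/5]   # p(X = . | Y = 0)
m = [(pi + qi) / 2 for pi, qi in zip(p, q)]  # marginal of X, since p(Y = 1) = 1/2

# I(X;Y) computed directly from the joint p(x, y) = p(x | y) p(y)
I = sum(0.5 * cond[i] * math.log2(cond[i] / m[i])
        for cond in (p, q) for i in range(3))
print(abs(I - (0.5 * kl(p, m) + 0.5 * kl(q, m))) < 1e-12)  # → True
```

This only confirms the algebra for one choice of p and q; the question asks for a proof for arbitrary p, q.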
(b) Compute 𝐼(𝑋;𝑌).

Section II [15 marks total]
**Only COMP 6261 students are expected to attempt this section.
(c) Let 𝑍 be a random variable with outcomes 𝑥 ∈ X, such that for all 𝑥 ∈ X and 𝑦 ∈ {0, 1}, 𝑝(𝑍 = 𝑥|𝑌 = 𝑦) = 𝑝(𝑋 = 𝑥|𝑌 = 1 − 𝑦). Using part (a), compute 𝐼(𝑍;𝑌). Explain intuitively the relation of your answer to part (a).
(d) Suppose p and q are as in part (a). Using part (b), or otherwise, give an example of a random variable 𝑍 with possible outcomes X satisfying 𝐼(𝑋;𝑌) > 𝐼(𝑍;𝑌). Explain your answer in terms of the data-processing inequality.
(e) Suppose p and q are as in part (a). Give an example of a random variable 𝑍 with possible outcomes 𝑥 ∈ X satisfying 𝐼(𝑋;𝑌) < 𝐼(𝑍;𝑌). Explain your answer in terms of the data-processing inequality.

Question 5 [10 marks total]
**Only COMP6261 students are expected to attempt this question.
Suppose 𝑋 is a real-valued random variable with 𝜇 = 𝐸(𝑋) = 0.
(a) Show that for any 𝑡 > 0 and 𝑎 ≤ 𝑋 ≤ 𝑏,
𝐸(𝑒^{𝑡𝑋}) ≤ 𝑒^{𝑔(𝑡(𝑏−𝑎))}, where 𝑔(𝑢) = log(1 − 𝛾 + 𝛾𝑒^𝑢) − 𝛾𝑢, with 𝛾 = 𝑎/(𝑎 − 𝑏).
(Hint: write 𝑋 as a convex combination of 𝑎 and 𝑏, where the convex weighting parameter depends upon 𝑋. Exploit the convexity of the function 𝑥 → 𝑒𝑡𝑥 and the fact that inequalities are preserved upon taking expectations of both sides, since expectations are integrals.)
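The bound in part (a) is tight for the extreme two-point distribution supported on {𝑎, 𝑏} with mean zero, which makes it easy to sanity-check numerically. A sketch under that assumption (the specific values of 𝑎, 𝑏, 𝑡 are my own):

```python
import math

a, b, t = -1.0, 2.0, 0.7
gamma = a / (a - b)                              # P(X = b) for the mean-zero law on {a, b}
assert abs((1 - gamma) * a + gamma * b) < 1e-12  # confirms E(X) = 0

mgf = (1 - gamma) * math.exp(t * a) + gamma * math.exp(t * b)  # E(e^{tX})
u = t * (b - a)
g = math.log(1 - gamma + gamma * math.exp(u)) - gamma * u

print(mgf <= math.exp(g) + 1e-12)   # part (a): E(e^{tX}) <= e^{g(t(b-a))} — → True
print(g <= u * u / 8)               # part (b): g(u) <= u^2/8 — → True
```

For this two-point law the first inequality holds with equality (up to floating-point error), which matches the hint: the convex-combination bound is exact when all mass sits at the endpoints 𝑎 and 𝑏.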
(b) By using Taylor’s theorem, show that for all 𝑢 > 0,
𝑔(𝑢) ≤ 𝑢²/8.
Furthermore, show that
𝐸(𝑒^{𝑡𝑋}) ≤ 𝑒^{𝑡²(𝑏−𝑎)²/8}.
