
COMP2610 COMP6261 Information Theory Tutorial 2: Entropy and Information
Robert 2, 2018
1. Let X be a random variable with possible outcomes 1, 2, 3. Let the probabilities of the outcomes be

   p_X(1) = θ/2,   p_X(2) = θ/2,   p_X(3) = 1 − θ,


for some parameter θ ∈ (0, 1).
Suppose we see N observations of the random variable, x_1, . . . , x_N. Let n_i denote the number of times that we observe the outcome X = i, i.e.

   n_i = Σ_{k=1}^{N} 1[x_k = i],   where 1[x_k = i] equals 1 if x_k = i and 0 otherwise.
(a) Write down the likelihood function of θ given the observations x_1, . . . , x_N, in terms of n_1, n_2, n_3.
(b) Suppose the observations are

   3, 3, 1, 2, 3, 2, 2, 1, 3, 1.

Compute the maximum likelihood estimate of θ. Hint: Compute the log-likelihood function, and check when its derivative is zero.
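A small Python sketch for checking the answer to part (b), assuming the parameterisation p_X(1) = p_X(2) = θ/2 and p_X(3) = 1 − θ given above: it counts the outcomes and evaluates the log-likelihood on a grid of θ values.

import numpy as np

# Grid-search check of the MLE in Question 1 (sketch only; assumes
# p_X(1) = p_X(2) = theta/2 and p_X(3) = 1 - theta).
obs = [3, 3, 1, 2, 3, 2, 2, 1, 3, 1]
n1, n2, n3 = (obs.count(i) for i in (1, 2, 3))

def log_likelihood(theta):
    # log L(theta) = (n1 + n2) log(theta/2) + n3 log(1 - theta)
    return (n1 + n2) * np.log(theta / 2) + n3 * np.log(1 - theta)

grid = np.linspace(0.001, 0.999, 999)
print(grid[np.argmax(log_likelihood(grid))])   # close to (n1 + n2) / len(obs)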
2. Consider the following joint distribution over (X, Y):

   p_{X,Y}      X = 1    X = 2    X = 3    X = 4
   Y = 1          0        0       1/8      1/8
   Y = 2         1/8      1/16     1/16      0
   Y = 3         1/8      1/8       0        0
   Y = 4          0       1/16     1/16     1/8
(a) Show that X and Y are not statistically independent. Hint: You need only show that for at least one specific (x, y) pair, p_{X,Y}(x, y) ≠ p_X(x) p_Y(y).
(b) Compute the following quantities:
    (i) H(X)   (ii) H(Y)   (iii) H(X | Y)   (iv) H(Y | X)   (v) H(X, Y)   (vi) I(X; Y).
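A short Python sketch that can be used to check part (b) numerically, assuming the joint table above with rows indexed by Y and columns by X; all entropies are in bits.

import numpy as np

# Entropies and mutual information for the joint table in Question 2.
# P[y-1, x-1] = p_{X,Y}(x, y); rows are Y = 1..4, columns are X = 1..4.
P = np.array([[0,    0,    1/8,  1/8 ],
              [1/8,  1/16, 1/16, 0   ],
              [1/8,  1/8,  0,    0   ],
              [0,    1/16, 1/16, 1/8 ]])

def H(p):
    p = np.asarray(p).ravel()
    p = p[p > 0]                      # treat 0 log 0 as 0
    return -np.sum(p * np.log2(p))    # entropy in bits

pX, pY = P.sum(axis=0), P.sum(axis=1)     # marginals of X and Y
H_X, H_Y, H_XY = H(pX), H(pY), H(P)
print(H_X, H_Y)                 # (i) H(X), (ii) H(Y)
print(H_XY - H_Y, H_XY - H_X)   # (iii) H(X|Y), (iv) H(Y|X)
print(H_XY)                     # (v) H(X, Y)
print(H_X + H_Y - H_XY)         # (vi) I(X; Y)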

3. A standard deck of cards contains 4 suits ♥, ♦, ♣, ♠ (hearts, diamonds, clubs, spades), each with 13 values A, 2, 3, 4, 5, 6, 7, 8, 9, 10, J, Q, K. The A, J, Q, K are called Ace, Jack, Queen, King. Each card has a colour: hearts and diamonds are coloured red; clubs and spades are black. Cards with values J, Q, K are called face cards.
Each of the 52 cards in a deck is identified by its value v and suit s and denoted vs. For example, 2♥, J♣, and 7♠ are the two of hearts, the Jack of clubs, and the 7 of spades, respectively. The variable c will be used to denote a card's colour. Let f = 1 if a card is a face card and f = 0 otherwise.
A card is drawn at random from a thoroughly shuffled deck. Calculate:
(a) The information h(c = red, v = K) in observing a red King.
(b) The conditional information h(v = K | f = 1) in observing a King given that a face card was drawn.
(c) The entropies H(S) and H(V, S).
(d) The mutual information I(V; S) between V and S.
(e) The mutual information I(V; C) between the value and colour of a card, using the last result and the data processing inequality.
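A Python sketch for checking parts (a) to (e) by enumerating the 52-card deck; the helper h computes the Shannon information content −log2 p in bits.

import math

# Enumerate a standard deck and evaluate the quantities in Question 3.
values = ['A','2','3','4','5','6','7','8','9','10','J','Q','K']
suits  = ['hearts', 'diamonds', 'clubs', 'spades']
deck   = [(v, s) for v in values for s in suits]
red    = {'hearts', 'diamonds'}
face   = {'J', 'Q', 'K'}

def h(p):                                   # information content in bits
    return -math.log2(p)

n_red_kings = sum(1 for v, s in deck if v == 'K' and s in red)
print(h(n_red_kings / 52))                  # (a) h(c = red, v = K)

n_face  = sum(1 for v, s in deck if v in face)
n_kings = sum(1 for v, s in deck if v == 'K')
print(h(n_kings / n_face))                  # (b) h(v = K | f = 1)

# (c) S is uniform over 4 suits; (V, S) is uniform over the 52 cards.
print(math.log2(4), math.log2(52))          # H(S), H(V, S)

# (d) I(V; S) = H(V) + H(S) - H(V, S); value and suit are independent.
print(math.log2(13) + math.log2(4) - math.log2(52))

# (e) C is a function of S, so 0 <= I(V; C) <= I(V; S) by data processing.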
4. Recall that for a random variable X, its variance is

   Var[X] = E[X²] − (E[X])².

Using Jensen's inequality, show that the variance must always be non-negative.
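A quick numerical illustration (not a proof): for samples drawn from arbitrary distributions, the plug-in quantity E[X²] − (E[X])² never comes out negative, consistent with applying Jensen's inequality to the convex function f(x) = x².

import numpy as np

# Empirical check that E[X^2] - (E[X])^2 >= 0 on a few random samples.
rng = np.random.default_rng(0)
for _ in range(5):
    x = rng.uniform(-5, 5) + rng.uniform(0.1, 3) * rng.standard_normal(1000)
    print(np.mean(x ** 2) - np.mean(x) ** 2 >= 0)   # always True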
5. Let X and Y be independent random variables with possible outcomes 0, 1, each having a Bernoulli distribution with parameter 1/2, i.e.

   p_X(0) = p_X(1) = 1/2 and p_Y(0) = p_Y(1) = 1/2.
(a) Compute I(X; Y).
(b) Let Z = X ⊕ Y. Compute I(X; Y | Z).
(c) Do the above quantities contradict the data-processing inequality? Explain your answer.
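A brute-force Python check for this question, assuming Z = X ⊕ Y as in part (b); it enumerates the joint distribution of (X, Y, Z) and computes both mutual informations from joint and marginal entropies.

import numpy as np
from itertools import product

# Joint distribution of (X, Y, Z) for independent fair bits X, Y and Z = X XOR Y.
P = {(x, y, x ^ y): 0.25 for x, y in product([0, 1], repeat=2)}

def H(dist):
    return -sum(p * np.log2(p) for p in dist.values() if p > 0)

def marginal(keep):
    out = {}
    for (x, y, z), p in P.items():
        key = tuple(v for v, name in zip((x, y, z), 'xyz') if name in keep)
        out[key] = out.get(key, 0.0) + p
    return out

# (a) I(X; Y) = H(X) + H(Y) - H(X, Y) -- should be 0 bits.
print(H(marginal('x')) + H(marginal('y')) - H(marginal('xy')))
# (b) I(X; Y | Z) = H(X, Z) + H(Y, Z) - H(X, Y, Z) - H(Z) -- should be 1 bit.
print(H(marginal('xz')) + H(marginal('yz')) - H(P) - H(marginal('z')))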
6. Consider a discrete variable X taking on values from the set 𝒳. Let p_i be the probability of each state, with i = 1, . . . , |𝒳|. Denote the vector of probabilities by p. We saw in lectures that the entropy of X satisfies

   H(X) ≤ log |𝒳|,

with equality if and only if p_i = 1/|𝒳| for all i, i.e. p is uniform. Prove the above statement using Gibbs' inequality, which says that

   Σ_i p_i log (p_i / q_i) ≥ 0

for any probability distributions p, q over |𝒳| outcomes, with equality if and only if p = q.
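A numerical sketch that checks both statements on randomly drawn distributions: the entropy never exceeds log2 |𝒳|, and the Gibbs sum Σ_i p_i log2(p_i / q_i) is never negative. This only illustrates the inequalities; the proof itself should follow from Gibbs' inequality with a suitable choice of q.

import numpy as np

# Check H(p) <= log2 |X| and sum_i p_i log2(p_i / q_i) >= 0 on random p, q.
rng = np.random.default_rng(0)
K = 8                                    # |X| = 8 outcomes (arbitrary choice)
for _ in range(5):
    p = rng.dirichlet(np.ones(K))        # random distribution p
    q = rng.dirichlet(np.ones(K))        # random distribution q
    H_p   = -np.sum(p * np.log2(p))
    gibbs = np.sum(p * np.log2(p / q))
    print(H_p <= np.log2(K), gibbs >= 0)   # both should print True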
