COMP2610/COMP6261 Tutorial 6 Sample Solutions: Source Coding
Young Lee
Tutors:
Week 8 (25th – 29th Sep), Semester 2, 2017

1. We know that X is a binomial random variable with parameters n and p. Thus,
$$p(X \ge n-1) = p(X = n-1) + p(X = n) = n \cdot p^{n-1}(1-p) + p^n.$$

By Markov's inequality,

$$p(X \ge n-1) \le \frac{E[X]}{n-1} = \frac{n \cdot p}{n-1}.$$

When n = 2, the exact probability is

$$p(X \ge 1) = 2p(1-p) + p^2 = p(2-p),$$

and the bound from Markov's inequality is $p(X \ge 1) \le 2p$. The difference between the two is

$$2p - (2p - p^2) = p^2.$$

Thus, for the Markov bound to be within 0.01 of the exact probability, we need $p^2 \le 0.01$, i.e. $p \le 0.1$.
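As a numerical sanity check (not part of the original solution), the exact tail probability and the Markov bound can be compared directly; this sketch uses the general n, p setting above and then specializes to n = 2.

```python
from math import comb

def exact_tail(n, p):
    """Exact P(X >= n-1) for X ~ Binomial(n, p)."""
    return comb(n, n - 1) * p ** (n - 1) * (1 - p) + p ** n

def markov_bound(n, p):
    """Markov bound: P(X >= n-1) <= E[X] / (n-1) = n*p / (n-1)."""
    return n * p / (n - 1)

# For n = 2 the gap between the bound and the exact value is exactly p**2,
# so at p = 0.1 the gap is 0.01, matching the boundary case in the text.
n, p = 2, 0.1
gap = markov_bound(n, p) - exact_tail(n, p)
```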
2. (a) We need to code all sequences of length 100 with three or fewer 1's. There is 1 sequence with zero 1's. There
are 100 sequences with one 1 (one sequence for each possible position in which the 1 appears). Similarly,
there are $\binom{100}{2} = 4950$ and $\binom{100}{3} = 161700$ sequences with two and three 1's respectively. In total we have

$$1 + \binom{100}{1} + \binom{100}{2} + \binom{100}{3} = 1 + 100 + 4950 + 161700 = 166751.$$

If we want a uniform code over this number of elements, we need $\lceil \log_2(166751) \rceil = 18$ bits.
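The count and the resulting code length can be verified in a couple of lines:

```python
from math import comb, ceil, log2

# Number of length-100 binary sequences containing at most three 1's:
# 1 + C(100,1) + C(100,2) + C(100,3).
total = sum(comb(100, k) for k in range(4))

# Bits needed for a uniform (fixed-length) code over these sequences.
bits = ceil(log2(total))
```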
(b) We need to find the probability of observing more than three 1's. Let K be the number of 1's observed. Then

$$P(K > 3) = 1 - P(K \le 3) = 1 - P(K=0) - P(K=1) - P(K=2) - P(K=3)$$
$$= 1 - 0.995^{100} - 100 \times 0.995^{99} \times 0.005 - \binom{100}{2} \times 0.995^{98} \times 0.005^2 - \binom{100}{3} \times 0.995^{97} \times 0.005^3 \approx 0.00167.$$
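This tail probability is easy to evaluate exactly from the binomial pmf:

```python
from math import comb

# P(K > 3) for K ~ Binomial(100, 0.005): one minus the first four pmf terms.
n, p = 100, 0.005
p_le_3 = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(4))
p_gt_3 = 1 - p_le_3
```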
(c) The number of 1's, K, is a binomial random variable with probability of success 0.005. It has expectation E[K] = 100 × 0.005 = 0.5 and variance V[K] = 100 × 0.005 × 0.995 = 0.4975. By Chebyshev's inequality we have
$$P(K \ge 4) \le P(|K - 0.5| \ge 3.5) = P(|K - E[K]| \ge 3.5) \le \frac{V[K]}{3.5^2} = \frac{0.4975}{12.25} \approx 0.0406.$$
This is much larger than the probability we found in part (b); the bound is off by a factor of around 20.
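The Chebyshev computation can be checked numerically; the exact tail value 0.00167 used for the comparison is the one computed in part (b).

```python
# Chebyshev: P(|K - E[K]| >= a) <= Var[K] / a**2, here with a = 3.5.
var_k = 100 * 0.005 * 0.995      # variance of Binomial(100, 0.005)
bound = var_k / 3.5**2           # Chebyshev bound on P(K >= 4)

# Compare with the exact tail probability from part (b).
ratio = bound / 0.00167
```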
3. (a) Using the definition of entropy,

$$H(X) = 0.4 \log_2 \frac{1}{0.4} + 0.6 \log_2 \frac{1}{0.6} = 0.971 \text{ bits}.$$

i. Recall that

$$T_{N\beta} := \left\{ x : \left| -\frac{1}{N} \log_2 P(x) - H(X) \right| < \beta \right\},$$

so we are looking for k (the number of 1's) such that

$$0.871 = H(X) - 0.1 < -\frac{1}{N} \log_2 P(x),$$

where x is any sequence with k ones.
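A short numerical sketch of this part, assuming (as the numbers above suggest) a Bernoulli source with p(1) = 0.4 and β = 0.1; the block length N = 100 is an assumption, as it is not stated in this excerpt.

```python
from math import log2

p1, N, beta = 0.4, 100, 0.1      # p(1) = 0.4 from the entropy line; N = 100 assumed

# Source entropy, approximately 0.971 bits.
H = p1 * log2(1 / p1) + (1 - p1) * log2(1 / (1 - p1))

def info_rate(k):
    """-(1/N) log2 P(x) for a length-N sequence x containing k ones."""
    return -(k * log2(p1) + (N - k) * log2(1 - p1)) / N

# Values of k whose sequences fall in the typical set T_{N,beta}.
typical_ks = [k for k in range(N + 1) if abs(info_rate(k) - H) < beta]
```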