University, Apr 18, 2022
Due on Apr 27, 2022
Advanced Probability & Statistics for Engineers 18-665/465
Homework 8


Instructions
• There are 12 problems in this HW. Please return solutions for ONLY 8 of them that you choose. We will distribute solutions for all problems.
• The homework is due at 11:59 PM local time on April 27, 2022.
• Please submit your homework to Gradescope and label the corresponding pages of each problem. Please be sure to do this correctly; otherwise, your homework may not be graded correctly.
• Please show and justify all the steps.
• It is recommended that you use a scanner to scan your solutions. If you are planning to use your
phone’s camera, please use a Document Scanning application, e.g., iOS Notes, CamScanner, etc.
Problem 1 [Revisiting HW4 – Digital Communication with Gaussian Noise]
Let Y = X + B, where B is independent of X and

    B = −1 with probability p,
    B = +1 with probability 1 − p.
X is a Gaussian random variable with X ∼ N(μx, σx²). This set-up arises in digital communication, where B is the encoding of the transmitted bit (e.g., B = −1 when the 0-bit is transmitted and B = +1 when the 1-bit is transmitted) and X represents the channel noise.
In HW4, you derived the MMSE estimate of X given the observation of Y . In doing so, you effectively had to decide, based on observing Y = y, whether B = +1 or B = −1.
Consider this set-up again, this time formulating it as a Hypothesis Testing problem:
    H0 : B = −1 versus H1 : B = +1
Design the Bayes test with uniform costs and calculate the corresponding probability of error.
Compare your answer with the result of HW4.
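As a sanity check (not a substitute for the requested derivation), the set-up above can be simulated: implement the MAP rule implied by uniform costs and estimate its error rate by Monte Carlo. The values of μx, σx and p below are hypothetical choices for illustration.

```python
import math
import random

# Hypothetical parameter values for illustration only.
mu_x, sigma_x, p = 0.0, 1.0, 0.3  # P(B = -1) = p

def normal_pdf(y, mu, sigma):
    return math.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bayes_decision(y):
    # Uniform costs => MAP rule: decide B = +1 iff p1 f(y|H1) >= p0 f(y|H0).
    # Under H0, Y ~ N(mu_x - 1, sigma_x^2); under H1, Y ~ N(mu_x + 1, sigma_x^2).
    score_h1 = (1 - p) * normal_pdf(y, mu_x + 1, sigma_x)
    score_h0 = p * normal_pdf(y, mu_x - 1, sigma_x)
    return +1 if score_h1 >= score_h0 else -1

random.seed(0)
trials = 100_000
errors = 0
for _ in range(trials):
    b = -1 if random.random() < p else +1
    y = b + random.gauss(mu_x, sigma_x)
    if bayes_decision(y) != b:
        errors += 1
print(f"empirical error = {errors / trials:.4f}")
```

The empirical error should match the probability of error you compute analytically for the same parameter values.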
Problem 2 [Randomized Response for Local Differential Privacy]
One of the hottest fields in ML these days is privacy-preserving inference and data analytics. A special (user-friendly) case of this problem is known as local differential privacy, where “even if an adversary has access to the personal responses of an individual in the database, the adversary will still be unable to learn too much about the user’s personal data.” A simple way to achieve (a certain level of) local differential privacy is a method known as Randomized Response.
To motivate this, consider the problem of estimating the income distribution of N individuals in a privacy-preserving manner, where the income of each individual can take one of 100 possible values {v1, . . . , v100}. Let θ denote the income of a specific user.
Assume that a data aggregator approaches each individual and asks what their income is. According to the randomized-response scheme, instead of sending their income (say, θ), each user sends the aggregator a random response X, where
    X = θ with probability γ + (1 − γ)/100,
    X = vj with probability (1 − γ)/100, for each vj ∈ {v1, . . . , v100} − {θ}.
Put differently, the users send their actual income only with probability γ and with probability 1 − γ they send a random income value from {v1, . . . , v100} each with equal probability.
The value of γ is what adjusts the trade-off between privacy of users and the ability of the aggregator to learn (or, estimate) the income distribution of this group of people; e.g., if γ is close to zero then user data will be almost perfectly private but the aggregator will not be able to make an accurate prediction, and if γ is close to one, then user privacy will be very low but the aggregator will be able to make accurate predictions of the income distribution. Think about what values of γ you would be comfortable with if you are participating in this process. (To make things more interesting, you can think about this set-up for other types of information that you might want to keep private).
Let’s analyze this trade-off more carefully. For a given γ, assume that the randomized-response scheme is employed over a very large population, so that the aggregator managed to learn the income distribution perfectly (e.g., the aggregator could be using an ML estimator). Let pj denote the fraction of users whose income equals vj, for each j ∈ {1, . . . , 100}. After learning this distribution, the aggregator can try to predict the income of each individual based on their randomized response X, e.g., by constructing a Hypothesis Testing problem. Formally, the aggregator observes X and tries to decide which of the 100 Hypotheses, H1, . . . , H100, is true, where Hj : the income of the user equals vj.
Derive the decision rule that leads to minimum probability of error (i.e., that maximizes the chances that the aggregator correctly “guesses” the income of each user). Calculate the resulting probability of error and discuss how the value of γ changes this. Assuming that p1 = · · · = p100, are you still comfortable with the γ value you initially selected?
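A quick simulation of the randomized-response channel can help build intuition for the privacy/utility trade-off. The value of γ and the concrete value set below are assumed example choices.

```python
import random

# Assumed example choices: gamma = 0.5 and integers 1..100 standing in for v1..v100.
values = list(range(1, 101))
gamma = 0.5

def randomize(theta):
    # Report the truth w.p. gamma; otherwise report a uniform draw over all 100
    # values. Net effect: P(X = theta) = gamma + (1 - gamma)/100, as in the problem.
    if random.random() < gamma:
        return theta
    return random.choice(values)

random.seed(1)
theta = 42
n = 100_000
frac_truth = sum(randomize(theta) == theta for _ in range(n)) / n
print(f"fraction of truthful-looking reports = {frac_truth:.3f} "
      f"(expected {gamma + (1 - gamma) / 100:.3f})")
```

Varying gamma in this sketch shows how quickly the reports become informative (or uninformative) about θ.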

Problem 3
Suppose X is a random variable that, under hypothesis H0, has PDF

    fX(x | H0) = (2/3)(x + 1) for 0 ≤ x ≤ 1, and 0 otherwise,

and, under hypothesis H1, has PDF

    fX(x | H1) = 1 for 0 ≤ x ≤ 1, and 0 otherwise.
(a) Find the Bayes rule and minimum Bayes risk for testing H0 versus H1 with uniform costs (i.e., C00 = C11 = 0 and C10 = C01 = 1) and equal priors (i.e., p0 = p1 = 1/2).
(b) Find the Neyman-Pearson rule for false-alarm probability α ∈ (0, 1). Calculate the corresponding detection probability.
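A plain-Python numerical check can confirm that both PDFs above are valid and that the likelihood ratio is monotone, which is the structural fact both parts rely on.

```python
# Midpoint-sum sanity check of the two densities and their likelihood ratio.
def f0(x):
    return (2 / 3) * (x + 1) if 0 <= x <= 1 else 0.0

def f1(x):
    return 1.0 if 0 <= x <= 1 else 0.0

n = 100_000
dx = 1.0 / n
area0 = sum(f0((i + 0.5) * dx) for i in range(n)) * dx
area1 = sum(f1((i + 0.5) * dx) for i in range(n)) * dx
print(round(area0, 6), round(area1, 6))  # both should be ~1.0

# L(x) = f1(x)/f0(x) = 3 / (2 (x + 1)) is strictly decreasing on [0, 1], so both
# the Bayes test and the NP test reduce to thresholding x itself.
lr = lambda x: f1(x) / f0(x)
print(lr(0.0), lr(1.0))  # 1.5 and 0.75
```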
Problem 4

Let X be a discrete random variable that is known to follow a Geometric distribution with parameter θ, i.e.,

    PX(x | θ) = P(X = x | θ) = θ(1 − θ)^(x−1), x = 1, 2, . . . .

Here, θ is a hidden parameter with two possible values:

    H0 : θ = θ0 versus H1 : θ = θ1.
Suppose that we observe X = x and are asked to predict whether H0 or H1 is true.
(a) Find the decision rule that minimizes the probability of error in terms of the prior probabilities p0, p1. Calculate the corresponding probability of error.
(b) Find the decision rule that maximizes the probability of detection (i.e., minimizes the probability of miss) under the constraint that the size (i.e., false alarm probability) of the test is less than or equal to α; i.e., derive the Neyman-Pearson test. Calculate the corresponding detection probability.
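The key structural fact for both parts is that the geometric likelihood ratio is monotone in x; a short numerical sketch illustrates this. The values of θ0 and θ1 below are assumed for illustration (the problem leaves them symbolic).

```python
# Likelihood-ratio sketch for the geometric model; theta0, theta1 are assumed
# example values.
theta0, theta1 = 0.5, 0.2

def pmf(x, theta):
    return theta * (1 - theta) ** (x - 1)

def lr(x):
    return pmf(x, theta1) / pmf(x, theta0)

# Since (1 - theta1)/(1 - theta0) > 1 when theta1 < theta0, the LR is monotone
# increasing in x, so both tests reduce to comparing x against a threshold.
print([round(lr(x), 3) for x in range(1, 6)])  # [0.4, 0.64, 1.024, 1.638, 2.621]
```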

Problem 5
Repeat Problem 4(a) when our observation contains N iid samples of the random variable X. Namely, let X1, . . . , XN be iid Geometric random variables with parameter θ, where θ ∈ {θ0, θ1}. Calculate the corresponding probability of error and compare with the result of Problem 4(a).
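For N iid geometric samples, the likelihood ratio depends on the data only through the total sum; the sketch below checks this numerically (θ0, θ1 are again assumed example values).

```python
import math

# For N iid geometric samples the log-likelihood ratio is a function of
# (N, sum(x_i)) only; theta values are assumed for illustration.
theta0, theta1 = 0.5, 0.2

def log_lr(xs):
    n, s = len(xs), sum(xs)
    return (n * math.log(theta1 / theta0)
            + (s - n) * math.log((1 - theta1) / (1 - theta0)))

# Two data sets with the same length and total give the same log-LR:
print(log_lr([1, 4]), log_lr([2, 3]))
```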
Problem 6

(a) Consider the hypothesis pair

    H0 : X = N versus H1 : X = N + S,

where N and S are independent random variables, each Exponential(1), i.e.,

    fN(z) = e^(−z) for z ≥ 0 (and 0 for z < 0),
    fS(s) = e^(−s) for s ≥ 0 (and 0 for s < 0).

Put differently, the observation is X = N + θS, where the hidden parameter θ ∈ {0, 1}, and H0 : θ = 0, H1 : θ = 1. Find the likelihood ratio between H0 and H1.

(b) Derive the α-level Neyman-Pearson test in (a).

Problem 7

Suppose we have an observation X = x and binary hypotheses described by the following pair:

    fX(x | H0) = 1 − |x| for |x| ≤ 1, and 0 for |x| > 1,
    fX(x | H1) = (2 − |x|)/4 for |x| ≤ 2, and 0 for |x| > 2.
Assume that the costs are given by

    C01 = 2C10 > 0, C00 = C11 = 0.

Find the Bayesian test of H0 versus H1 in terms of the prior probabilities p0, p1.
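Both densities in Problem 7 can be sanity-checked numerically before deriving the test; the midpoint-sum sketch below uses only plain Python.

```python
# Midpoint-sum check that both densities integrate to 1.
def f0(x):
    return max(0.0, 1 - abs(x))

def f1(x):
    return (2 - abs(x)) / 4 if abs(x) <= 2 else 0.0

n = 200_000
dx = 4.0 / n
area0 = area1 = 0.0
for i in range(n):
    x = -2 + (i + 0.5) * dx
    area0 += f0(x) * dx
    area1 += f1(x) * dx
print(round(area0, 4), round(area1, 4))  # both ~1.0

# Note: for 1 < |x| <= 2 we have f0(x) = 0 < f1(x), so any reasonable test
# decides H1 there; on |x| <= 1 the ratio f1/f0 grows with |x|.
```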

Problem 8
Find the Neyman-Pearson test of H0 versus H1 with false-alarm probability α in Problem 7. Find the corresponding power of the test.
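As a numerical aid (not the requested derivation), the sketch below evaluates the operating characteristics of the rule “decide H1 when |x| > t”, a natural candidate form here since f1/f0 grows with |x|; the threshold value is an assumed example.

```python
# Numeric sketch of P_F and P_D for the rule "decide H1 when |x| > t".
def f0(x):
    return max(0.0, 1 - abs(x))

def f1(x):
    return (2 - abs(x)) / 4 if abs(x) <= 2 else 0.0

def tail(f, t, lo=-2.0, hi=2.0, n=100_000):
    # P(|X| > t) under density f, by a midpoint sum.
    dx = (hi - lo) / n
    return sum(f(lo + (i + 0.5) * dx) * dx
               for i in range(n) if abs(lo + (i + 0.5) * dx) > t)

t = 0.5  # assumed example threshold
pf, pd = tail(f0, t), tail(f1, t)
print(round(pf, 3), round(pd, 3))  # matches (1 - t)^2 and (2 - t)^2 / 4
```

Comparing these numbers against your closed-form α and power expressions is a useful consistency check.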
Problem 9

Suppose we observe a random variable Y given by
Y = N + θλ
where θ is either 0 or 1, λ is a fixed number between 0 and 2, and where N is a random variable that has uniform density on the interval (−1, 1). We wish to decide between the hypotheses
    H0 : θ = 0 versus H1 : θ = 1
(a) Find the Neyman-Pearson decision rule for false-alarm probability α.
(b) Find the power of the Neyman-Pearson decision rule as a function of the false-alarm probability and the parameter λ. Plot PD vs. α.
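A Monte Carlo sketch of one plausible threshold form can be checked against your answer. Here t = 1 − 2α is chosen so that P(Y > t | H0) = α; λ and α below are assumed example values, and the edge cases of the full NP derivation are not handled.

```python
import random

# Assumed example values; the rule "decide H1 when Y > t" is a candidate form,
# not a verified derivation of the NP test for all (lam, alpha).
lam, alpha = 1.0, 0.1
t = 1 - 2 * alpha  # under H0, Y ~ Uniform(-1, 1), so P(Y > t) = alpha

random.seed(2)
n = 200_000
pf = sum(random.uniform(-1, 1) > t for _ in range(n)) / n          # H0: Y = N
pd = sum(random.uniform(-1, 1) + lam > t for _ in range(n)) / n    # H1: Y = N + lam
print(round(pf, 3), round(pd, 3))
```

Re-running with several (λ, α) pairs gives the data points for the requested PD vs. α plot.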
Problem 10
Consider the following pair of hypotheses concerning a sequence Y1, Y2, . . . , Yn of independent random variables
    H0 : Yk ∼ N(μ0, σ0²), k = 1, . . . , n versus H1 : Yk ∼ N(μ1, σ1²), k = 1, . . . , n,

where μ0, μ1, σ0² and σ1² are known constants. Suppose that we observe Y1 = y1, Y2 = y2, . . . , Yn = yn.

(a) Show that the likelihood ratio can be expressed as a function of the parameters μ0, μ1, σ0², σ1², and the quantities Σ_{k=1}^n yk² and Σ_{k=1}^n yk.

(b) Derive the Neyman-Pearson test for the two cases (μ0 = μ1, σ1² > σ0²) and (σ0² = σ1², μ1 > μ0).
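The claim in part (a) can be verified numerically: expand the per-sample log-likelihood ratio and regroup by Σyk² and Σyk. The parameter values and data below are assumed for illustration (s0, s1 denote standard deviations).

```python
import math

# Check that the log-LR is a function of sum(y_k) and sum(y_k^2) only.
mu0, s0, mu1, s1 = 0.0, 1.0, 1.0, 2.0  # assumed illustration values

def log_lr_direct(ys):
    return sum(math.log(s0 / s1)
               - (y - mu1) ** 2 / (2 * s1 ** 2)
               + (y - mu0) ** 2 / (2 * s0 ** 2) for y in ys)

def log_lr_sums(n, sum_y, sum_y2):
    a = 1 / (2 * s0 ** 2) - 1 / (2 * s1 ** 2)      # coefficient of sum(y^2)
    b = mu1 / s1 ** 2 - mu0 / s0 ** 2              # coefficient of sum(y)
    c = mu0 ** 2 / (2 * s0 ** 2) - mu1 ** 2 / (2 * s1 ** 2) + math.log(s0 / s1)
    return a * sum_y2 + b * sum_y + n * c

ys = [0.3, -1.2, 2.5, 0.0]  # assumed example data
print(log_lr_direct(ys))
print(log_lr_sums(len(ys), sum(ys), sum(y * y for y in ys)))  # same value
```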

Problem 11
Suppose X1, X2, . . . are iid Poisson rvs with unknown parameter θ, i.e., for all i we have

    P(Xi = k) = e^(−θ) θ^k / k!,   k = 0, 1, . . . .
Using N observations from this random variable, i.e., observing X1 = x1, X2 = x2,. . ., XN = xN , find the maximum-likelihood (ML) estimate of θ. Compute the bias and variance of your estimate.
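A numeric check of the candidate ML estimate: the Poisson log-likelihood should peak at the sample mean. The data below are assumed example observations; this verifies the candidate rather than deriving it.

```python
import math

# Check that the Poisson log-likelihood peaks at the sample mean.
xs = [2, 0, 3, 1, 4]  # assumed example observations

def log_lik(theta):
    return sum(-theta + x * math.log(theta) - math.log(math.factorial(x))
               for x in xs)

theta_hat = sum(xs) / len(xs)  # candidate ML estimate: the sample mean
eps = 1e-4
assert log_lik(theta_hat) > log_lik(theta_hat - eps)
assert log_lik(theta_hat) > log_lik(theta_hat + eps)
print(theta_hat)  # 2.0
```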
Problem 12
Suppose we observe a sequence Y1, Y2, . . . , Yn given by
    Yk = Nk + θ sk,   k = 1, . . . , n,
where N = (N1, . . . , Nn)T is a zero-mean Gaussian random vector with covariance matrix Σ > 0; s1, s2, . . . , sn is a known signal sequence; and θ is a (real) nonrandom parameter.
(a) Find the maximum-likelihood estimate of the parameter θ.

(b) Compute the bias and variance of your estimate.
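A candidate form worth checking is the generalized least-squares estimate θ̂ = (sᵀ Σ⁻¹ y) / (sᵀ Σ⁻¹ s). The sketch below specializes, as an assumption, to a diagonal Σ so that no matrix library is needed; all numbers are hypothetical, and the noiseless case is used so the estimate can be checked exactly.

```python
# GLS-form sketch with a diagonal Sigma (assumed); all values hypothetical.
# theta_hat = sum(s_k * y_k / var_k) / sum(s_k^2 / var_k).
s = [1.0, 2.0, -1.0]
var = [1.0, 0.5, 2.0]      # diagonal entries of Sigma
theta_true = 0.7
noise = [0.0, 0.0, 0.0]    # noiseless case: the estimate should be exact
y = [nk + theta_true * sk for nk, sk in zip(noise, s)]

num = sum(sk * yk / v for sk, yk, v in zip(s, y, var))
den = sum(sk * sk / v for sk, v in zip(s, var))
theta_hat = num / den
print(theta_hat)  # recovers 0.7 in the noiseless case
```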
