This is a problem from the ISI MStat Entrance Examination, 2019. It primarily tests one's familiarity with the size and power of a test, and the ability to condition on an event properly.
Let Z be a random variable with probability density function
\( f(z)=\frac{1}{2} e^{-|z- \mu|} , z \in \mathbb{R} \) with parameter \( \mu \in \mathbb{R} \). Suppose we observe \( X = \max(0, Z) \).
(a) Find the constant \( c \) such that the test that "rejects when \( X>c \)" has size 0.05 for the null hypothesis \( H_0 : \mu=0 \).
(b) Find the power of this test against the alternative hypothesis \( H_1: \mu =2 \).
And believe me, as Joe Blitzstein says, "Conditioning is the soul of statistics."
(a) If you know what the size of a test means, then you can easily write down the condition mentioned in part (a) in mathematical terms.
It simply means \( P_{H_0}(X>c)=0.05 \).
Now, under \( H_0 \), \( \mu=0 \).
So, we have the pdf of Z as \( f(z)=\frac{1}{2} e^{-|z|} \)
As the support of Z is \( \mathbb{R} \), we can partition it into the events \( \{Z \ge 0\} \) and \( \{Z < 0\} \).
Now, let's condition based on this partition. So, we have:
\( P_{H_0}(X > c)=P_{H_0}(X>c , Z \ge 0)+ P_{H_0}(X>c, Z<0) =P_{H_0}(X>c , Z \ge 0) =P_{H_0}(Z > c) \)
Do you understand the last equality? (Try to convince yourself why; note that \( X = 0 \) whenever \( Z < 0 \), and that \( c \) must be positive here since \( X \ge 0 \).)
So, \( P_{H_0}(X >c)=P_{H_0}(Z > c)=\int_{c}^{\infty} \frac{1}{2} e^{-|z|} dz = \frac{1}{2}e^{-c} \), since \( c > 0 \).
Equating \( \frac{1}{2}e^{-c} \) with 0.05, we get \( c= \ln{10} \).
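As a quick sanity check (not part of the original solution), here is a short simulation sketch in Python, assuming numpy is available: draw Z from a standard Laplace distribution, form \( X = \max(0, Z) \), and estimate the rejection probability at \( c = \ln 10 \). The estimate should hover around 0.05.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
z = rng.laplace(loc=0.0, scale=1.0, size=n)   # Z ~ Laplace(mu = 0), density (1/2) e^{-|z|}
x = np.maximum(0.0, z)                        # X = max(0, Z)
c = np.log(10)
print("estimated size:", np.mean(x > c))      # should be close to 0.05
```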
(b) The second part is mere calculation once you already know the value of \( c \).
The power of the test against \( H_1 \) is given by:
\( P_{H_1}(X>\ln{10})=P_{H_1}(Z > \ln{10})=\int_{\ln{10}}^{\infty} \frac{1}{2} e^{-|z-2|} dz = \int_{\ln{10}}^{\infty} \frac{1}{2} e^{-(z-2)} dz = \frac{e^2}{20} \approx 0.37 \), where we used the fact that \( \ln{10} \approx 2.3 > 2 \), so \( |z-2| = z-2 \) throughout the range of integration.
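Again, purely as an optional check of the arithmetic (an addition to the solution, using the same numpy-based setup as above), one can simulate Z with \( \mu = 2 \) and compare the empirical rejection rate with \( \frac{e^2}{20} \approx 0.3695 \):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10**6
z = rng.laplace(loc=2.0, scale=1.0, size=n)   # Z ~ Laplace(mu = 2)
x = np.maximum(0.0, z)                        # X = max(0, Z)
print("estimated power:", np.mean(x > np.log(10)))
print("exact value:    ", np.exp(2) / 20)     # e^2 / 20 ≈ 0.3695
```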
The pdf occurring in this problem is an example of a Laplace distribution. Look it up if you are not familiar with it, and go through its properties.
Suppose you have a random variable \( V \) which follows the Exponential distribution with mean 1.
Let \( I \) be a Bernoulli(\( \frac{1}{2} \)) random variable. It is given that \( I \) and \( V \) are independent.
Can you find a function \( h \) (which is also a random variable), \( h=h(I,V) \) (a continuous function of \( I \) and \( V \)), such that \( h \) has the standard Laplace distribution?
Take $h = (2I-1) V.$ Since $I$ can take the values $0$ or $1$ and $V \geq 0,$ $h$ can take any real number as its value. Let $x \geq 0.$ Then, using the independence of the random variables $I$ and $V$ (and noting that $h = -V \leq 0 \leq x$ whenever $I = 0$), we get \begin{align*} \Bbb P (h \leq x) & = \Bbb P(I = 0) + \Bbb P(I=1, V \leq x) \\ & = \frac 1 2 + \Bbb P(I = 1) \Bbb P(V \leq x) \\ & = \frac 1 2 + \frac 1 2 (1 - e^{-x}) \\ & = 1 - \frac 1 2 e^{-x} \end{align*} Now if $x < 0,$ then again by the independence of $I$ and $V$ we get \begin{align*} \Bbb P(h \leq x) & = \Bbb P (I = 0, V \geq -x) \\ & = \Bbb P (I=0) \Bbb P(V \geq -x) \\ & = \frac 1 2 (1 - \Bbb P(V \leq -x)) \\ & = \frac 1 2 (1 - (1-e^x)) \\ & = \frac 1 2 e^x \end{align*} Thus the random variable $h$ follows the standard Laplace distribution.
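Here is a minimal simulation sketch (an addition, not part of the comment above), assuming numpy is available, that checks this claim empirically: draw $V$ and $I$ independently, form $h = (2I-1)V$, and compare the empirical CDF with the standard Laplace CDF derived above.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10**6
v = rng.exponential(scale=1.0, size=n)        # V ~ Exponential with mean 1
i = rng.integers(0, 2, size=n)                # I ~ Bernoulli(1/2)
h = (2 * i - 1) * v                           # the proposed transformation

def laplace_cdf(x):
    # standard Laplace CDF: (1/2) e^x for x < 0, and 1 - (1/2) e^{-x} for x >= 0
    return np.where(x < 0, 0.5 * np.exp(x), 1.0 - 0.5 * np.exp(-x))

for x in (-2.0, -0.5, 0.0, 1.0, 2.0):
    print(x, np.mean(h <= x), laplace_cdf(x))  # empirical vs. theoretical CDF
```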
Great Work Arnab. Stay Tuned!