This is a very beautiful sample problem from ISI MStat PSB 2012 Problem 3, based on finding the distribution of a random variable. Let's give it a try!
Let \(X_{1}\) and \(X_{2}\) be i.i.d. exponential random variables with mean \(\lambda>0\). Let \(Y_{1}=X_{1}-X_{2}\) and \(Y_{2}=R X_{1}-(1-R) X_{2}\), where \(R\) is a Bernoulli random variable with parameter \(1/2\) and is independent of \(X_{1}\) and \(X_{2}\).
(a) Show that \(Y_{1}\) and \(Y_{2}\) have the same distribution.
(b) Obtain the common density function.
Cumulative Distribution Function
Bernoulli distribution
Exponential Distribution
Let the cumulative distribution function of \( Y_{1} \) be
\(F_{Y_{1}}(y_{1})=P(Y_{1} \leq y_{1})=P(X_{1}-X_{2} \leq y_{1}) \), \( y_1 \in \mathbb{R}\)
\( =P(X_{1} \leq y_{1}+X_{2})\)
Now, conditioning on \(X_{2}=x_{2}\), the event \(\{X_{1} \le y_{1}+x_{2}\}\) has positive probability only when \(y_{1}+x_{2} \ge 0\), i.e. \(x_{2} \ge -y_{1}\).
Now, if \(y_{1} \ge 0\) then,
\(P(X_{1} \le y_{1}+X_{2}) =\int_{0}^{\infty} P(X_{1} \le y_{1}+x_{2}) \lambda e^{-\lambda x_{2}} \,d x_{2} \)
=\( \int_{0}^{\infty} \int_{0}^{y_{1}+x_{2}} \lambda e^{-\lambda x_{1}} \times \lambda e^{-\lambda x_{2}} \,d x_{1} \,d x_{2} \)
=\( \int_{0}^{\infty} \lambda e^{-\lambda x_{2}} (1-e^{-\lambda (y_{1}+x_{2})}) \,d x_{2} \)
=\( \int_{0}^{\infty} \lambda e^{-\lambda x_{2}} d x_{2}-\int_{0}^{\infty} \lambda e^{-\lambda (y_{1}+2 x_{2})} d x_{2} \)
=\( 1-\frac{e^{-\lambda y_{1}}}{2} \)
Now, if \( y_{1} < 0\) then,
\(P(X_{1} \leq y_{1}+X_{2}) =\int_{-y_{1}}^{\infty} \int_{0}^{y_{1}+x_{2}} \lambda e^{-\lambda x_{1}} \times \lambda e^{-\lambda x_{2}} \,d x_{1} \,d x_{2} \)
\(=\int_{-y_{1}}^{\infty} \lambda e^{-\lambda x_{2}}(1-e^{-\lambda(y_{1}+x_{2})}) \,d x_{2} \)
\(=\lambda \int_{-y_{1}}^{\infty} e^{-\lambda x_{2}} \,d x_{2}-\int_{-y_{1}}^{\infty} \lambda e^{-\lambda(y_{1}+2 x_{2})} \,d x_{2} \)
\(=e^{\lambda y_{1}}-\frac{e^{-\lambda y_{1}}}{2} \times e^{2 \lambda y_{1}} \)
\(=\frac{e^{\lambda y_{1}}}{2}\)
Therefore, \(F_{Y_{1}}(y_{1}) = \begin{cases} 1-\frac{e^{-\lambda y_{1}}}{2} & \text{ if } y_{1} \ge 0 \\ \frac{e^{\lambda y_{1}}}{2} & \text{ if } y_{1}<0 \end{cases}\)
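The two case computations above can be cross-checked symbolically. Below is a minimal sketch assuming SymPy is available; the symbol names and the substitution \( y_{1} = -a \) in the second case are mine.

```python
import sympy as sp

lam, x1, x2 = sp.symbols('lambda x1 x2', positive=True)

# Case y1 >= 0: integrate the joint density lambda*exp(-lambda*x1) * lambda*exp(-lambda*x2)
# over the region {0 <= x1 <= y1 + x2, x2 >= 0}
y1 = sp.Symbol('y1', positive=True)
inner = sp.integrate(lam * sp.exp(-lam * x1), (x1, 0, y1 + x2))
case_pos = sp.integrate(inner * lam * sp.exp(-lam * x2), (x2, 0, sp.oo))
print(sp.simplify(case_pos))        # expected: 1 - exp(-lambda*y1)/2

# Case y1 < 0: write y1 = -a with a > 0, so the outer integral runs over x2 >= a
a = sp.Symbol('a', positive=True)
inner = sp.integrate(lam * sp.exp(-lam * x1), (x1, 0, x2 - a))
case_neg = sp.integrate(inner * lam * sp.exp(-lam * x2), (x2, a, sp.oo))
print(sp.simplify(case_neg))        # expected: exp(-lambda*a)/2, i.e. exp(lambda*y1)/2
```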
Next, the cumulative distribution function of \( Y_{2} \) is \( F_{Y_{2}}(y_{2})=P(Y_{2} \le y_{2}) \), \( y_2 \in \mathbb{R}\)
=\( P(Y_{2} \le y_{2} \mid R=1) P(R=1)+P(Y_{2} \le y_{2} \mid R=0) P(R=0) \)
\(=P(X_{1} \le y_{2}) \times \frac{1}{2}+P(-X_{2} \le y_{2}) \times \frac{1}{2} \)
= \( \begin{cases} \frac{1}{2} [F_{X_{1}}(y_{2})+1] & \text{ if } y_{2} \ge 0 \\ \frac{1}{2} [1-F_{X_{2}}(-y_{2})] & \text{ if } y_{2}<0 \end{cases}\)
=\( \begin{cases} 1-\frac{e^{-\lambda y_{2}}}{2} & \text{ if } y_{2} \ge 0 \\ \frac{e^{\lambda y_{2}}}{2} & \text{ if } y_{2}<0 \end{cases}\)
since the CDF of an exponential random variable \(X\) with density \( \lambda e^{-\lambda x}\) is \( 1-e^{-\lambda x}\), \( x \ge 0\) (and here \(P(-X_{2} \le y_{2})=1\) for \(y_{2} \ge 0\), while \(P(X_{1} \le y_{2})=0\) for \(y_{2}<0\)).
Thus \(Y_{1}\) and \(Y_{2}\) have the same distribution.
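Part (a) can also be checked by simulation. Below is a minimal Monte Carlo sketch assuming NumPy; the rate \( \lambda = 2 \), the sample size, and the grid of evaluation points are arbitrary choices of mine.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, n = 2.0, 200_000                       # arbitrary rate and sample size for the check

x1 = rng.exponential(scale=1/lam, size=n)   # exponential with density lam*exp(-lam*x)
x2 = rng.exponential(scale=1/lam, size=n)
r = rng.integers(0, 2, size=n)              # Bernoulli(1/2), independent of x1, x2

y1 = x1 - x2
y2 = r * x1 - (1 - r) * x2

# compare the empirical CDFs of Y1 and Y2 with the derived piecewise formula
grid = np.linspace(-2, 2, 9)
F = np.where(grid >= 0, 1 - np.exp(-lam * grid) / 2, np.exp(lam * grid) / 2)
for t, f in zip(grid, F):
    print(f"t={t:+.2f}  F_Y1_hat={np.mean(y1 <= t):.4f}  "
          f"F_Y2_hat={np.mean(y2 <= t):.4f}  F_derived={f:.4f}")
```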
(b) \( f_{Y_{1}}(y_{1})=\begin{cases} \frac{d}{d y_{1}}(1-\frac{e^{-\lambda y_{1}}}{2}) & \text{ if } y_{1} \ge 0 \\ \frac{d}{d y_{1}}(\frac{e^{\lambda y_{1}}}{2}) & \text{ if } y_{1}<0 \end{cases} \)
= \(\begin{cases} \frac{\lambda e^{-\lambda y_{1}}}{2} & \text{ if } y_{1} \ge 0 \\ \frac{\lambda e^{\lambda y_{1}}}{2} & \text{ if } y_{1}<0 \end{cases} \)
Similarly for \(Y_2\), since it has the same distribution function.
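Note that the two branches combine into \( f(y) = \frac{\lambda}{2} e^{-\lambda |y|} \), which is the Laplace (double exponential) density with location \(0\) and scale \(1/\lambda\). A short numerical cross-check, sketched under the assumption that SciPy is available (the rate \( \lambda = 2 \) is arbitrary):

```python
import numpy as np
from scipy.stats import laplace

lam = 2.0                                        # arbitrary rate for the check
y = np.linspace(-3, 3, 7)

f_derived = lam / 2 * np.exp(-lam * np.abs(y))   # the piecewise density written in one line
f_laplace = laplace(loc=0, scale=1/lam).pdf(y)   # Laplace(0, 1/lambda) density

print(np.allclose(f_derived, f_laplace))         # True
```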
If \( \theta \sim U(0, 2\pi) \), then find the distribution of \( \sin(\theta + \theta_{0}) \), where \( \theta_{0} \in (0, 2\pi) \).
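Before attempting the calculation, you can explore the answer empirically. Here is a minimal simulation sketch assuming NumPy; the particular value of \( \theta_{0} \) is an arbitrary choice of mine, and the code only tabulates the empirical CDF rather than solving the problem.

```python
import numpy as np

rng = np.random.default_rng(0)
theta0 = 1.0                                    # any fixed value in (0, 2*pi)
theta = rng.uniform(0, 2 * np.pi, size=200_000)
z = np.sin(theta + theta0)

# empirical CDF of sin(theta + theta0) at a few points in (-1, 1)
for t in np.linspace(-0.9, 0.9, 7):
    print(f"t={t:+.2f}  F_hat={np.mean(z <= t):.4f}")
```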