
ISI MStat PSB 2006 Problem 8 | Bernoullian Beauty

This is a very beautiful sample problem from ISI MStat PSB 2006 Problem 8. It is based on the basic idea of Maximum Likelihood Estimators, but with a bit of thinking. Give it a thought!

Problem– ISI MStat PSB 2006 Problem 8

Let $(X_1,Y_1),\ldots,(X_n,Y_n)$ be a random sample from the discrete distribution with joint probability mass function

$f_{X,Y}(x,y) = \begin{cases} \frac{\theta}{4} & (x,y)=(0,0) \text{ or } (1,1) \\ \frac{2-\theta}{4} & (x,y)=(0,1) \text{ or } (1,0) \end{cases}$

with $0 \le \theta \le 2$. Find the maximum likelihood estimator of $\theta$.

Prerequisites

Maximum Likelihood Estimators

Indicator Random Variables

Bernoulli Trials

Solution :

This is a very beautiful problem, not very difficult; its beauty is hidden in its simplicity. Let's explore!

Observe that the given pmf, taken at face value, does not take us anywhere, so we should think out of the box. But before going out of the box, let's collect what's in the box!

So, from the given pmf we get $P(\text{the pair is of the form } (0,0) \text{ or } (1,1))=2\times \frac{\theta}{4}=\frac{\theta}{2}$.

Similarly, $P(\text{the pair is of the form } (0,1) \text{ or } (1,0))=2\times \frac{2-\theta}{4}=\frac{2-\theta}{2}=1-\frac{\theta}{2}$.

So, clearly this is pushing us towards Bernoulli trials, isn't it!

So, let's treat the pairs that match, i.e. $x=y$, as our success, and the other possibilities as failure; then our success probability is $\frac{\theta}{2}$, where $0\le \theta \le 2$. So, if $S$ is the number of successful pairs in our given sample of size $n$, then it is evident that $S \sim \text{Binomial}(n, \frac{\theta}{2})$.

So, now it is simplified by all means, and we know the MLE of the success probability in a binomial is the proportion of successes in the sample.

Hence, $\frac{\hat{\theta}_{MLE}}{2}= \frac{s}{n}$, where $s$ is the number of pairs in our sample with $X_i=Y_i$.

So, $\hat{\theta}_{MLE}=\frac{2\,(\text{number of pairs in the sample of the form } (0,0) \text{ or } (1,1))}{n}$.

Hence, we are done !!
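As a sanity check (not part of the original solution), one can confirm the closed form $\hat{\theta}_{MLE}=\frac{2s}{n}$ numerically by maximizing the likelihood $(\frac{\theta}{4})^s(\frac{2-\theta}{4})^{n-s}$ over a grid; the function names below are illustrative:

```python
import numpy as np

# Likelihood of the sample: (theta/4)^s * ((2 - theta)/4)^(n - s),
# where s = number of matched pairs (x = y) out of n.
def closed_form_mle(s, n):
    return 2 * s / n

def grid_search_mle(s, n, grid_size=200001):
    # search the interior of the parameter space [0, 2]
    theta = np.linspace(1e-9, 2 - 1e-9, grid_size)
    log_lik = s * np.log(theta / 4) + (n - s) * np.log((2 - theta) / 4)
    return theta[np.argmax(log_lik)]

for s, n in [(0, 5), (3, 10), (7, 10), (5, 5)]:
    assert abs(closed_form_mle(s, n) - grid_search_mle(s, n)) < 1e-3
```

The grid maximizer agrees with $\frac{2s}{n}$ for every sample configuration, including the boundary cases $s=0$ and $s=n$.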

Food For Thought

Say $X$ and $Y$ are two independent exponential random variables with means $\mu$ and $\lambda$ respectively. But you observe two other variables, $Z$ and $W$, such that $Z=\min(X,Y)$ and $W$ takes the value $1$ when $Z=X$ and $0$ otherwise. Can you find the MLEs of the parameters?

Give it a try !!


ISI MStat PSB 2013 Problem 7 | Bernoulli interferes Normally

This is a very simple and beautiful sample problem from ISI MStat PSB 2013 Problem 7. It is mainly based on simple hypothesis testing with normal variables, just modified with a Bernoulli random variable. Try it!

Problem– ISI MStat PSB 2013 Problem 7

Suppose $X_1$ and $X_2$ are two independent and identically distributed random variables following the $N(\theta, 1)$ distribution. Further, consider a Bernoulli random variable $V$ with $P(V=1)=\frac{1}{4}$, independent of $X_1$ and $X_2$. Define $X_3$ as,

$X_3 = \begin{cases} X_1 & \text{when } V=0 \\ X_2 & \text{when } V=1 \end{cases}$

For testing $H_o: \theta= 0$ against $H_1: \theta=1$, consider the test:

Reject $H_o$ if $\frac{X_1+X_2+X_3}{3} >c$.

Find $c$ such that the test has size $0.05$.

Prerequisites

Normal Distribution

Simple Hypothesis Testing

Bernoulli Trials

Solution :

This problem is simple enough; the only trick is to observe that the test rule is based on three random variables, $X_1, X_2$ and $X_3$, but $X_3$ in turn depends on the Bernoulli variable $V$.

So, here it is given that we reject $H_o$ at size $0.05$ if $\frac{X_1+X_2+X_3}{3}> c$, where $c$ satisfies,

$P_{H_o}(\frac{X_1+X_2+X_3}{3}>c)=0.05$

So, using the law of total probability, as $X_3$ is conditioned on $V$,

$P_{H_o}(X_1+X_2+X_3>3c|V=0)P(V=0)+P_{H_o}(X_1+X_2+X_3>3c|V=1)P(V=1)=0.05$

$\Rightarrow P_{H_o}(2X_1+X_2>3c)\frac{3}{4}+P_{H_o}(X_1+2X_2>3c)\frac{1}{4}=0.05$ [remember, $X_1$ and $X_2$ are independent of $V$].

Now, under $H_o$, $2X_1+X_2 \sim N(0,5)$ and $X_1+2X_2 \sim N(0,5)$ (variance $4+1=5$),

So, both terms reduce to $P(N(0,5)>3c)=0.05$; the rest is quite easy to figure out, which I leave as an exercise!
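For completeness, here is a minimal numeric check (standard library only, not part of the original solution): under $H_o$ both linear combinations are $N(0,5)$, so the size condition reads $P(N(0,5) > 3c) = 0.05$, i.e. $3c = \sqrt{5}\, z_{0.95}$.

```python
from math import sqrt
from statistics import NormalDist

# Size condition: P(N(0,5) > 3c) = 0.05  =>  3c = sqrt(5) * z_{0.95}
z = NormalDist().inv_cdf(0.95)   # upper 5% point of the standard normal
c = sqrt(5) * z / 3
print(round(c, 3))               # about 1.226
```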

Food For Thought

Let's end this discussion with some exponentials.

Suppose $X_1,X_2,\ldots,X_n$ is a random sample from an $\text{Exponential}(\theta)$ population and $Y_1,Y_2,\ldots,Y_m$ is another random sample from an $\text{Exponential}(\mu)$ population. Now you are to test $H_o: \theta=\mu$ against $H_1: \theta \neq \mu$.

Can you show that the test can be based on a statistic $T= \frac{\sum X_i}{\sum X_i +\sum Y_i}$?

What distribution do you think $T$ should follow under the null hypothesis? Think it over!


ISI MStat PSB 2005 Problem 2 | Calculating probability using Binomial Distribution

This is a very beautiful sample problem from ISI MStat PSB 2005 Problem 2 based on finding probability using the binomial distribution. Let’s give it a try !!

Problem– ISI MStat PSB 2005 Problem 2

Let $X$ and $Y$ be independent random variables with X having a binomial distribution with parameters 5 and $1 / 2$ and $Y$ having a binomial distribution with parameters 7 and $1 / 2 .$ Find the probability that $|X-Y|$ is even.

Prerequisites

Binomial Distribution

Binomial Expansion

Parity Check

Solution :

Given $X \sim$ Bin(5,1/2) and $Y \sim$ Bin(7,1/2) , and they are independent .

Now, we have to find $P(|X-Y| \text{ is even})$.

$|X-Y|$ is even if both $X$ and $Y$ are even or both are odd.

Therefore, $P(|X-Y| \text{ is even})=P(X \text{ even}, Y \text{ even}) + P(X \text{ odd}, Y \text{ odd})$.

$P(X \text{ even}, Y \text{ even}) = ( {5 \choose 0} {(\frac{1}{2})}^5 + {5 \choose 2} {(\frac{1}{2})}^5 + {5 \choose 4} {(\frac{1}{2})}^5 )( {7 \choose 0} {(\frac{1}{2})}^7 + {7 \choose 2} {(\frac{1}{2})}^7 + \cdots + {7 \choose 6} {(\frac{1}{2})}^7)$

=$({(\frac{1}{2})}^5 \times \frac{2^5}{2})({(\frac{1}{2})}^7 \times \frac{2^7}{2})$ [using $\sum_{k \text{ even}} {n \choose k} = 2^{n-1}$]

= $\frac{1}{4}$

Similarly, one can find $P(X \text{ odd}, Y \text{ odd})$, which also comes out to be $\frac{1}{4}$.

Hence, $P(|X-Y| \text{ is even}) = \frac{1}{4}+\frac{1}{4} = \frac{1}{2}$.
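One can also brute-force this probability exactly (a check of the above, not part of the original solution), summing the joint pmf over pairs of equal parity:

```python
from fractions import Fraction
from math import comb

# P(|X - Y| even) for independent X ~ Bin(5, 1/2), Y ~ Bin(7, 1/2):
# sum the joint pmf over all (x, y) with x and y of the same parity.
p = Fraction(0)
for x in range(6):
    for y in range(8):
        if (x - y) % 2 == 0:
            p += Fraction(comb(5, x), 2**5) * Fraction(comb(7, y), 2**7)

assert p == Fraction(1, 2)
```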

Food For Thought

Try to find $P(X-Y \text{ is odd})$ under the same conditions as given in the above problem.


Restricted Maximum Likelihood Estimator |ISI MStat PSB 2012 Problem 9

This is a very beautiful sample problem from ISI MStat PSB 2012 Problem 9. It's about restricted MLEs and how they differ from unrestricted ones; if you miss the delicacies, you may miss the differences too. Try it! But be careful.

Problem– ISI MStat PSB 2012 Problem 9

Suppose $X_1$ and $X_2$ are i.i.d. Bernoulli random variables with parameter $p$, where it is known that $\frac{1}{3} \le p \le \frac{2}{3}$. Find the maximum likelihood estimator $\hat{p}$ of $p$ based on $X_1$ and $X_2$.

Prerequisites

Bernoulli trials

Restricted Maximum Likelihood Estimators

Real Analysis

Solution :

This problem seems quite simple, and it is, if and only if one observes the subtle details. Let's think about the unrestricted MLE of $p$.

Let the unrestricted MLE of $p$ (i.e. when $0\le p \le 1$) based on $X_1$ and $X_2$ be $p_{MLE}$; then $p_{MLE}=\frac{X_1+X_2}{2}$ (How??)

Now let's see the contradictions which may occur if we don't modify $p_{MLE}$ to $\hat{p}$ (as asked).

See that if our sample comes out such that $X_1=X_2=0$ or $X_1=X_2=1$, then $p_{MLE}$ will be $0$ and $1$ respectively, values the actual parameter $p$ can never take here!! So, $p_{MLE}$ needs serious improvement!

To modify $p_{MLE}$, let's observe the log-likelihood function of the Bernoulli based on the two samples.

$\log L(p|x_1,x_2)=(x_1+x_2)\log p +(2-x_1-x_2)\log (1-p)$

Now make two observations: when $X_1=X_2=0$ (i.e. $p_{MLE}=0$), then $\log L(p|x_1,x_2)=2\log (1-p)$; see that $\log L(p|x_1,x_2)$ decreases as $p$ increases, hence under the given restriction the log-likelihood is maximized when $p$ is least, i.e. $\hat{p}=\frac{1}{3}$.

Similarly, when $p_{MLE}=1$ (i.e. when $X_1=X_2=1$), for the log-likelihood function to be maximum, $p$ has to be maximum, i.e. $\hat{p}=\frac{2}{3}$.

So, to modify $p_{MLE}$ to $\hat{p}$, we develop a linear relationship between $p_{MLE}$ and $\hat{p}$ (linear because the relationship between $p$ and $p_{MLE}$ is linear). So, $(p_{MLE}, \hat{p})$ lies on the line joining the points $(0,\frac{1}{3})$ (when $p_{MLE}= 0$, $\hat{p}=\frac{1}{3}$) and $(1,\frac{2}{3})$. Hence the line is,

$\frac{\hat{p}-\frac{1}{3}}{p_{MLE}-0}=\frac{\frac{2}{3}-\frac{1}{3}}{1-0}$

$\hat{p}=\frac{1}{3}+\frac{p_{MLE}}{3}=\frac{2+X_1+X_2}{6}$ is the required restricted MLE.

Hence the solution concludes.
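A small check of the closed form (the grid search and function name are mine, not part of the solution): for each possible sample, maximize the Bernoulli log-likelihood over the restricted range $[\frac{1}{3}, \frac{2}{3}]$ and compare with $\frac{2+x_1+x_2}{6}$.

```python
import numpy as np

# Restricted MLE by grid search over 1/3 <= p <= 2/3.
p_grid = np.linspace(1/3, 2/3, 100001)

def restricted_mle(x1, x2):
    s = x1 + x2  # number of successes in the sample of size 2
    log_lik = s * np.log(p_grid) + (2 - s) * np.log(1 - p_grid)
    return p_grid[np.argmax(log_lik)]

for x1 in (0, 1):
    for x2 in (0, 1):
        assert abs(restricted_mle(x1, x2) - (2 + x1 + x2) / 6) < 1e-4
```

Note that for $x_1 + x_2 = 1$ the unrestricted maximizer $\frac{1}{2}$ already lies inside the restriction, and the grid search recovers it.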

Food For Thought

Can you find the conditions under which Maximum Likelihood Estimators are also unbiased estimators of the parameter? For which distributions do you think this condition holds true? Are they also Minimum Variance Unbiased Estimators?

Can you give some examples where MLEs are not unbiased? Even if they are not unbiased, are they sufficient?


ISI MStat PSB 2012 Problem 3 | Finding the Distribution of a Random Variable

This is a very beautiful sample problem from ISI MStat PSB 2012 Problem 3 based on finding the distribution of a random variable . Let’s give it a try !!

Problem– ISI MStat PSB 2012 Problem 3

Let $X_{1}$ and $X_{2}$ be i.i.d. exponential random variables with mean $\lambda>0$. Let $Y_{1}=X_{1}-X_{2}$ and $Y_{2}=R X_{1}-(1-R) X_{2}$, where $R$ is a Bernoulli random variable with parameter $1/2$ and is independent of $X_{1}$ and $X_{2}$.
(a) Show that $Y_{1}$ and $Y_{2}$ have the same distribution.
(b) Obtain the common density function.

Prerequisites

Cumulative Distribution Function

Bernoulli distribution

Exponential Distribution

Solution :

The cumulative distribution function of $Y_{1}$ is (throughout we write the exponential density as $\lambda e^{-\lambda x}$, $x \ge 0$, treating $\lambda$ as the rate)

$F_{Y_{1}}(y_{1})=P(Y_{1} \leq y_{1})=P(X_{1}-X_{2} \leq y_{1})=P(X_{1} \leq y_{1}+X_{2})$, $y_1 \in \mathbb{R}$.

The inner event is nonempty only when $y_{1}+x_{2} \ge 0$, i.e. $x_{2} \ge -y_{1}$.

Now, if $y_{1} \ge 0$, then

$P(X_{1} \le y_{1}+X_{2}) =\int_{0}^{\infty} P(X_{1} \le y_{1}+x_{2})\, \lambda e^{-\lambda x_{2}}\, d x_{2}$

$=\int_{0}^{\infty} \int_{0}^{y_{1}+x_{2}} \lambda e^{-\lambda x_{1}} \times \lambda e^{-\lambda x_{2}}\, d x_{1}\, d x_{2}$

$=\int_{0}^{\infty} \lambda e^{-\lambda x_{2}} (1-e^{-\lambda (y_{1}+x_{2})})\, d x_{2}$

$=\int_{0}^{\infty} \lambda e^{-\lambda x_{2}}\, d x_{2}-\int_{0}^{\infty} \lambda e^{-\lambda (y_{1}+2 x_{2})}\, d x_{2}$

$=1-\frac{e^{-\lambda y_{1}}}{2}$

Now, if $y_{1} < 0$, then

$P(X_{1} \leq y_{1}+X_{2}) =\int_{-y_{1}}^{\infty} \int_{0}^{y_{1}+x_{2}} \lambda e^{-\lambda x_{1}} \times \lambda e^{-\lambda x_{2}}\, d x_{1}\, d x_{2}$

$=\int_{-y_{1}}^{\infty} \lambda e^{-\lambda x_{2}}(1-e^{-\lambda(y_{1}+x_{2})})\, d x_{2}$

$=\int_{-y_{1}}^{\infty} \lambda e^{-\lambda x_{2}}\, d x_{2}-\int_{-y_{1}}^{\infty} \lambda e^{-\lambda(y_{1}+2 x_{2})}\, d x_{2}$

$=e^{\lambda y_{1}}-\frac{e^{\lambda y_{1}}}{2}$

$=\frac{e^{\lambda y_{1}}}{2}$

Therefore, $F_{Y_{1}}(y_{1}) = \begin{cases} 1-\frac{e^{-\lambda y_{1}}}{2} & \text{if } y_{1} \ge 0 \\ \frac{e^{\lambda y_{1}}}{2} & \text{if } y_{1}<0 \end{cases}$

The cumulative distribution function of $Y_{2}$ is, for $y_2 \in \mathbb{R}$,

$F_{Y_{2}}(y_{2})=P(Y_{2} \le y_{2})$

$=P(Y_{2} \le y_{2} \mid R=1) P(R=1)+P(Y_{2} \le y_{2} \mid R=0) P(R=0)$

$=P(X_{1} \le y_{2}) \times \frac{1}{2}+P(-X_{2} \le y_{2}) \times \frac{1}{2}$

$= \begin{cases} \frac{1}{2} [F_{X_{1}}(y_{2})+1] & \text{if } y_{2} \ge 0 \\ \frac{1}{2} [1-F_{X_{2}}(-y_{2})] & \text{if } y_{2}<0 \end{cases}$

$=\begin{cases} 1-\frac{e^{-\lambda y_{2}}}{2} & \text{if } y_{2} \ge 0 \\ \frac{e^{\lambda y_{2}}}{2} & \text{if } y_{2} < 0 \end{cases}$

since the cdf of an exponential random variable $X$ is $1-e^{-\lambda x}$, $x \ge 0$.

Thus $Y_{1}$ and $Y_{2}$ have the same distribution.
(b) $f_{Y_{1}}(y_{1})=\begin{cases} \frac{d}{d y_{1}}(1-\frac{e^{-\lambda y_{1}}}{2}) & \text{if } y_{1} \ge 0 \\ \frac{d}{d y_{1}}(\frac{e^{\lambda y_{1}}}{2}) & \text{if } y_{1}<0 \end{cases}$

$= \begin{cases} \frac{\lambda e^{-\lambda y_{1}}}{2} & \text{if } y_{1} \ge 0 \\ \frac{\lambda e^{\lambda y_{1}}}{2} & \text{if } y_{1}<0 \end{cases}$

The density of $Y_2$ is the same.
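A quick Monte Carlo check of part (a) (not part of the original solution; the choice $\lambda = 1$ and the sample size are arbitrary, and $\lambda$ is treated as the rate, matching the density used above):

```python
import numpy as np

# Simulate Y1 = X1 - X2 and Y2 = R*X1 - (1-R)*X2 and compare their
# empirical cdfs with the closed form derived in the solution.
rng = np.random.default_rng(0)
lam, n = 1.0, 200_000
x1 = rng.exponential(1 / lam, n)
x2 = rng.exponential(1 / lam, n)
r = rng.integers(0, 2, n)          # Bernoulli(1/2)
y1 = x1 - x2
y2 = r * x1 - (1 - r) * x2

def cdf(y):  # closed-form cdf from the solution
    return np.where(y >= 0, 1 - np.exp(-lam * y) / 2, np.exp(lam * y) / 2)

for t in (-1.0, 0.0, 0.5, 2.0):
    assert abs((y1 <= t).mean() - cdf(t)) < 0.01
    assert abs((y2 <= t).mean() - cdf(t)) < 0.01
```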

Food For Thought

If $\theta \sim U(0, 2 \pi )$, then find the distribution of $\sin(\theta + {\theta}_{0})$, where ${\theta}_{0} \in (0,2 \pi)$.


Intertwined Conditional Probability | ISI MStat 2016 PSB Problem 4

This is an interesting problem intertwining conditional probability and Bernoulli random variables, which gives a sweet and sour taste to Problem 4 of ISI MStat 2016 PSB.

Problem

Let $X, Y,$ and $Z$ be three Bernoulli $\left(\frac{1}{2}\right)$ random variables such that $X$ and $Y$ are independent, $Y$ and $Z$ are independent, and $Z$ and $X$ are independent.
(a) Show that $\mathrm{P}(X Y Z=0) \geq \frac{3}{4}$.
(b) Show that if equality holds in (a), then $$Z= \begin{cases} 1 & \text { if } X=Y, \\ 0 & \text { if } X \neq Y\\ \end{cases}$$

Prerequisites

• Principle of Inclusion and Exclusion $|A \cup B \cup C|=|A|+|B|+|C|-|A \cap B|-|A \cap C|-|B \cap C|+|A \cap B \cap C|$
• Basic Probability Theory
• Conditional Probability
• $abc = 0$ iff $a= 0$ or $b= 0$ or $c = 0$.
• $\cup$ = or; $\cap$ = and

Solution

(a)

$P(XYZ = 0) = P( \{ X = 0\} \cup \{Y = 0\} \cup \{Z = 0\})$

$$= P(X = 0) + P(Y = 0) + P(Z= 0) - P(\{ X = 0\} \cap \{Y = 0\}) - P(\{Y = 0\} \cap \{Z= 0\}) - P(\{X = 0\} \cap \{Z= 0\}) + P(\{X = 0\} \cap \{Y = 0\} \cap \{Z= 0\}).$$

We use the fact that $X$ and $Y$ are independent, $Y$ and $Z$ are independent, and $Z$ and $X$ are independent.

$$= P(X = 0) + P(Y = 0) + P(Z= 0) - P(X = 0)P(Y = 0) - P(Y = 0)P(Z= 0) - P(X = 0)P(Z= 0) + P(X = 0,Y = 0,Z= 0).$$

$X, Y,$ and $Z$ are three Bernoulli $\left(\frac{1}{2}\right)$ random variables, so each of the first three terms is $\frac{1}{2}$ and each pairwise term is $\frac{1}{4}$. Hence,

$P(XYZ = 0) = \frac{3}{4} + P({X = 0},{Y = 0},{Z= 0}) \geq \frac{3}{4}$.

(b)

$P(XYZ = 0) = \frac{3}{4} \iff P({X = 0},{Y = 0},{Z= 0}) = 0$.

Now, this is just a logical game with conditional probability. By pairwise independence, each pair-event such as $\{Y = 0, Z = 0\}$ has probability $\frac{1}{4}$, and since the triple intersection has probability $0$, all of that mass sits on the other value of the third variable.

$P({X = 0} |{Y = 0},{Z= 0}) = 0 \Rightarrow P({Z= 0} |{Y = 0},{X = 1}) = 1$.

$P({Y = 0} |{X = 0},{Z= 0}) = 0 \Rightarrow P({Z= 0} |{X = 0},{Y = 1}) = 1$.

$P({Z = 0} |{X = 0},{Y= 0}) = 0 \Rightarrow P({Z = 1} |{X = 0},{Y= 0}) = 1$.

$P( Z = 0) = P({X = 1},{Y = 0},{Z= 0}) + P({X = 0},{Y = 1},{Z= 0}) + P({X = 1},{Y = 1},{Z= 0}) + P({X = 0},{Y = 0},{Z= 0})$

$= \frac{1}{4} + \frac{1}{4} + P({X = 1},{Y = 1},{Z= 0})$.

Now, $Z$ is a Bernoulli $\left(\frac{1}{2}\right)$ random variable. So, $P(Z = 0) =\frac{1}{2}$ $\Rightarrow P({X = 1},{Y = 1},{Z= 0}) = 0 \Rightarrow P({Z = 0} | {Y = 1},{X= 1}) = 0$.

$P({Z= 0} |{Y = 0},{X = 1}) = 1$.

$P({Z= 0} |{X = 0},{Y = 1}) = 1$.

$P({Z = 1} |{X = 0},{Y= 0}) = 1$.

$P({Z = 1} | {Y = 1},{X= 1}) = 1$.

Hence, $$Z= \begin{cases} 1 & \text { if } X=Y, \\ 0 & \text { if } X \neq Y\\ \end{cases}$$.
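The construction can be verified by exhaustive enumeration (a sketch; the helper names are mine): with $X, Y$ independent Bernoulli$(\frac{1}{2})$ and $Z = \mathbb{1}\{X = Y\}$, the three variables are pairwise independent and the bound in (a) is attained.

```python
from itertools import product
from fractions import Fraction

# Joint law: X, Y independent Bernoulli(1/2), Z = 1 exactly when X = Y.
joint = {}
for x, y in product((0, 1), repeat=2):
    z = 1 if x == y else 0
    joint[(x, y, z)] = Fraction(1, 4)

def prob(event):
    return sum(p for xyz, p in joint.items() if event(*xyz))

# the bound in (a) is attained
assert prob(lambda x, y, z: x * y * z == 0) == Fraction(3, 4)
# each variable is Bernoulli(1/2)
for i in range(3):
    assert prob(lambda x, y, z, i=i: (x, y, z)[i] == 1) == Fraction(1, 2)
# pairwise independence: every pair event has probability 1/4
for i, j in ((0, 1), (1, 2), (0, 2)):
    for a, b in product((0, 1), repeat=2):
        pij = prob(lambda x, y, z, i=i, j=j, a=a, b=b:
                   (x, y, z)[i] == a and (x, y, z)[j] == b)
        assert pij == Fraction(1, 4)
```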