This is a really beautiful sample problem from ISI MStat PSB 2008 Problem 10. It is based on testing a simple hypothesis, and it shows how a single observation can make life simple. Go for it!
Consider a population with three kinds of individuals, labelled 1, 2 and 3. Suppose the proportions of individuals of the three types are given by \(f(k, \theta)\), \(k = 1, 2, 3\), where \(0 < \theta < 1\):
\(f(k, \theta) = \begin{cases} {\theta}^2 & k=1 \\ 2\theta(1-\theta) & k=2 \\ (1-\theta)^2 & k=3 \end{cases}\)
Let \(X_1, X_2, \ldots, X_n\) be a random sample from this population. Find the most powerful test for testing \(H_0 : \theta = \theta_0\) versus \(H_1 : \theta = \theta_1\) \((\theta_0 < \theta_1 < 1)\).
Binomial Distribution.
Neyman-Pearson Lemma.
Test function and power function.
Hypothesis Testing.
This is quite a beautiful problem, once you observe it closely. Here the distribution of \(X\) may seem non-standard, but if one looks at the distribution of \(Y = X - 1\) (say) instead of \(X\), one finds that \( Y \sim \text{Binomial}(2, 1-\theta) \).
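A quick numerical sanity check of this observation (just a sketch; \(\theta = 0.3\) is an arbitrary choice): the pmf of \(Y = X - 1\) should coincide with the Binomial\((2, 1-\theta)\) pmf.

```python
from math import comb

def f(k, theta):
    # pmf of X as given in the problem: k = 1, 2, 3
    return {1: theta**2, 2: 2*theta*(1-theta), 3: (1-theta)**2}[k]

def binom_pmf(y, n, p):
    # standard binomial pmf: C(n, y) * p^y * (1-p)^(n-y)
    return comb(n, y) * p**y * (1-p)**(n-y)

theta = 0.3  # arbitrary value in (0, 1)
for k in (1, 2, 3):
    # Y = X - 1 should match Binomial(2, 1 - theta) at y = k - 1
    assert abs(f(k, theta) - binom_pmf(k - 1, 2, 1 - theta)) < 1e-12
```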
So now let \(p = 1 - \theta\); then \(0 < p < 1\). Also let \(p_0 = 1 - \theta_0\) and \(p_1 = 1 - \theta_1\).
Since \(\theta_0 < \theta_1\), we have \(p_0 > p_1\), and our hypotheses reduce to
\( H_0 : p = p_0\) versus \(H_1 : p = p_1\), where \(1 > p_0 > p_1\).
So, under \(H_0\), the joint pmf (of the \(Y_i = X_i - 1\)) is \( f_0(\vec{y}) = \prod_{i=1}^n {2 \choose y_i} p_0^{y_i}(1-p_0)^{2-y_i}\), where \(y_i = x_i - 1,\ i = 1, \ldots, n\),
and under \(H_1\), the joint pmf is \( f_1(\vec{y}) = \prod_{i=1}^n {2 \choose y_i} p_1^{y_i}(1-p_1)^{2-y_i}\), where \(y_i = x_i - 1,\ i = 1, \ldots, n\).
Now we can apply the widely used Neyman-Pearson Lemma and end up with
\(\lambda(\vec{y}) = \frac{f_1(\vec{y})}{f_0(\vec{y})} = \frac{\prod_{i=1}^{n} {2 \choose y_i} p_1^{y_i} (1-p_1)^{2-y_i}}{\prod_{i=1}^n {2 \choose y_i} p_0^{y_i} (1-p_0)^{2-y_i}} = \left(\frac{p_1}{p_0}\right)^{\sum y_i} \left(\frac{1-p_1}{1-p_0}\right)^{2n - \sum y_i}\).
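Since \(p_1 < p_0\), both factors are decreasing in \(t = \sum y_i\) (the first has base less than 1, the second has base greater than 1 with a decreasing exponent). A small sketch verifying this monotonicity numerically; the values \(n = 5\), \(p_0 = 0.6\), \(p_1 = 0.4\) are hypothetical:

```python
def likelihood_ratio(t, n, p0, p1):
    # lambda as a function of t = sum(y_i), as derived above:
    # (p1/p0)^t * ((1-p1)/(1-p0))^(2n - t)
    return (p1 / p0)**t * ((1 - p1) / (1 - p0))**(2*n - t)

n, p0, p1 = 5, 0.6, 0.4  # hypothetical values with p0 > p1
vals = [likelihood_ratio(t, n, p0, p1) for t in range(2*n + 1)]
# with p1 < p0 the ratio is strictly decreasing in t, so the event
# lambda > k is equivalent to sum(y_i) < c for some cutoff c
assert all(a > b for a, b in zip(vals, vals[1:]))
```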
Now we define a test function \(\phi(\vec{x}) = \begin{cases} 1 & \lambda^*(\vec{x}) > k \\ 0 & \lambda^*(\vec{x}) \le k \end{cases}\) for some positive constant \(k\),
where \(\lambda(\vec{y}) = \lambda^*(\vec{x})\) and \(\vec{x} = (X_1, \ldots, X_n)\).
So our test rule is: reject \(H_0\) if \(\phi(\vec{x}) = 1\), and we choose \(k\) such that for a given level \(\alpha\), \(0 < \alpha < 1\),
\(E_{H_0}(\phi(\vec{x})) \le \alpha\),
with power function \( \beta(\theta) = E_{\theta}(\phi(\vec{x})) \). Can you find the more explicit condition equivalent to \( \lambda^*(\vec{x}) \le k \)? Try it!
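To make the test concrete, note that under \(H_0\) the statistic \(T = \sum Y_i\) follows Binomial\((2n, p_0)\). Assuming the rejection region takes the form \(T \le c\) (which the monotonicity of \(\lambda\) in \(\sum y_i\) suggests), a sketch of how one might compute the cutoff, attained size and power numerically; the numbers \(n = 10\), \(p_0 = 0.6\), \(p_1 = 0.4\), \(\alpha = 0.05\) are hypothetical:

```python
from math import comb

def binom_cdf(c, n, p):
    # P(T <= c) for T ~ Binomial(n, p)
    return sum(comb(n, t) * p**t * (1 - p)**(n - t) for t in range(c + 1))

def mp_test(n, p0, p1, alpha):
    # Under H0, T = sum(Y_i) ~ Binomial(2n, p0).  Assuming the MP test
    # rejects for small T, pick the largest cutoff c whose size is still
    # <= alpha (the non-randomized version of the test).
    c = max(t for t in range(2*n + 1) if binom_cdf(t, 2*n, p0) <= alpha)
    size = binom_cdf(c, 2*n, p0)    # attained level E_{H0}(phi)
    power = binom_cdf(c, 2*n, p1)   # beta at theta_1, i.e. P_{H1}(reject)
    return c, size, power

# hypothetical numbers: n = 10 observations, p0 = 0.6, p1 = 0.4, alpha = 0.05
c, size, power = mp_test(10, 0.6, 0.4, 0.05)
assert size <= 0.05 < power
```

Since \(p_1 < p_0\), small values of \(T\) are more likely under \(H_1\), so the power exceeds the size, as one would expect of the most powerful test.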
Suppose \(\theta_0 \le \theta_1\). Can you verify that for any constant \(c\), \(P_{\theta_1}(X > c) \le P_{\theta_0}(X > c)\)? Can you generalize the situation: what kind of distribution must \(X\) follow? Think it over until we meet again!
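A quick numerical check of this stochastic ordering (the values \(\theta_0 = 0.3\), \(\theta_1 = 0.7\) are arbitrary choices with \(\theta_0 \le \theta_1\)): since a larger \(\theta\) shifts mass toward \(X = 1\), the survival function \(P_{\theta}(X > c)\) decreases in \(\theta\).

```python
def surv(c, theta):
    # P_theta(X > c) for the three-point distribution in the problem
    pmf = {1: theta**2, 2: 2*theta*(1-theta), 3: (1-theta)**2}
    return sum(p for k, p in pmf.items() if k > c)

theta0, theta1 = 0.3, 0.7  # arbitrary values with theta0 <= theta1
for c in (0, 1, 2, 3):
    # larger theta puts more mass on small values of X,
    # so the survival function is smaller under theta1
    assert surv(c, theta1) <= surv(c, theta0)
```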