
ISI MStat PSB 2009 Problem 6 | abNormal MLE of Normal

This is a very beautiful sample problem from ISI MStat PSB 2009 Problem 6. It is based on the ideas of Restricted Maximum Likelihood Estimators and Mean Squared Error. Give it a try!

Problem: ISI MStat PSB 2009 Problem 6


Suppose X_1, \ldots, X_n are i.i.d. N(\theta,1), \theta_0 \le \theta \le \theta_1, where \theta_0 < \theta_1 are two specified numbers. Find the MLE of \theta and show that it is better than the sample mean \bar{X} in the sense of having smaller mean squared error.

Prerequisites


Maximum Likelihood Estimators

Normal Distribution

Mean Squared Error

Solution:

This is a very interesting problem! We all know that if the condition "\theta_0 \le \theta \le \theta_1, for some specified numbers \theta_0 < \theta_1" had not been given, then the MLE would simply have been \bar{X}=\frac{1}{n}\sum_{k=1}^n X_k, the sample mean. But due to the restriction on \theta, things get interestingly complicated.

So, to simplify a bit, let's write the likelihood function of \theta given the sample \vec{X}=(X_1,\ldots,X_n)':

L(\theta|\vec{X})=\left(\frac{1}{\sqrt{2\pi}}\right)^n \exp\left(-\frac{1}{2}\sum_{k=1}^n(X_k-\theta)^2\right), when \theta_0 \le \theta \le \theta_1. Now, taking the natural log of both sides and differentiating, we find that

\frac{d\ln L(\theta|\vec{X})}{d\theta}= \sum_{k=1}^n (X_k-\theta) = n(\bar{X}-\theta).

Now, verify that if \bar{X} < \theta_0, this derivative is negative throughout [\theta_0, \theta_1], so L(\theta|\vec{X}) is a decreasing function of \theta there, and the likelihood attains its maximum at \theta_0 itself. Similarly, when \theta_0 \le \bar{X} \le \theta_1, the likelihood attains its maximum at \bar{X}; and when \bar{X} > \theta_1, the likelihood function is increasing on [\theta_0, \theta_1], so the maximum is attained at \theta_1.

Hence the Restricted Maximum Likelihood Estimator of \theta, say \hat{\theta}_{RML}, is

\hat{\theta}_{RML} = \begin{cases} \theta_0 & \bar{X} < \theta_0 \\ \bar{X} & \theta_0 \le \bar{X} \le \theta_1 \\ \theta_1 & \bar{X} > \theta_1 \end{cases}
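
In code, this estimator is one line. Here is a minimal Python sketch (our own illustration, not part of the exam solution; the data and the interval [0, 1] are assumed for demonstration):

```python
import numpy as np

# Minimal sketch (not from the original solution): the restricted MLE
# is the sample mean clipped to the interval [theta_0, theta_1].
def restricted_mle(x, theta_0, theta_1):
    """Restricted MLE of theta for i.i.d. N(theta, 1) data,
    with theta constrained to [theta_0, theta_1]."""
    return np.clip(x.mean(), theta_0, theta_1)

# Illustrative values (assumed, not from the problem):
rng = np.random.default_rng(0)
x = rng.normal(loc=0.3, scale=1.0, size=20)      # true theta = 0.3
print(restricted_mle(x, theta_0=0.0, theta_1=1.0))
```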

Now we check that \hat{\theta}_{RML} is a better estimator than \bar{X} in terms of Mean Squared Error (MSE).

Writing f_{\bar{X}} for the density of \bar{X} (which is N(\theta, \frac{1}{n})), we have

MSE_{\theta}(\bar{X})=E_{\theta}(\bar{X}-\theta)^2=\int_{-\infty}^{\infty} (x-\theta)^2 f_{\bar{X}}(x)\,dx

=\int_{-\infty}^{\theta_0} (x-\theta)^2 f_{\bar{X}}(x)\,dx+\int_{\theta_0}^{\theta_1} (x-\theta)^2 f_{\bar{X}}(x)\,dx+\int_{\theta_1}^{\infty} (x-\theta)^2 f_{\bar{X}}(x)\,dx

\ge \int_{-\infty}^{\theta_0} (\theta_0-\theta)^2 f_{\bar{X}}(x)\,dx+\int_{\theta_0}^{\theta_1} (x-\theta)^2 f_{\bar{X}}(x)\,dx+\int_{\theta_1}^{\infty} (\theta_1-\theta)^2 f_{\bar{X}}(x)\,dx

=E_{\theta}(\hat{\theta}_{RML}-\theta)^2=MSE_{\theta}(\hat{\theta}_{RML}),

where the inequality holds because \theta \in [\theta_0, \theta_1]: for x < \theta_0 we have (x-\theta)^2 \ge (\theta_0-\theta)^2, and for x > \theta_1 we have (x-\theta)^2 \ge (\theta_1-\theta)^2.
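
The inequality can also be seen numerically. Here is a hedged Monte Carlo sketch (our own illustration; n = 10 and [\theta_0, \theta_1] = [0, 1] are assumed values): the clipped estimator should never show a larger MSE, with the biggest gains when \theta sits near a boundary.

```python
import numpy as np

# Monte Carlo sketch (illustrative only; n, the interval, and the theta
# grid are assumed values): compare the MSE of the sample mean with the
# MSE of the clipped (restricted MLE) estimator on [theta_0, theta_1].
rng = np.random.default_rng(1)
theta_0, theta_1, n, reps = 0.0, 1.0, 10, 100_000

for theta in [0.0, 0.25, 0.5, 0.75, 1.0]:
    # reps replications of the sample mean of n i.i.d. N(theta, 1) draws
    xbar = rng.normal(loc=theta, scale=1.0, size=(reps, n)).mean(axis=1)
    mse_mean = np.mean((xbar - theta) ** 2)                        # ~ 1/n
    mse_rml = np.mean((np.clip(xbar, theta_0, theta_1) - theta) ** 2)
    print(f"theta={theta:.2f}  MSE(mean)={mse_mean:.4f}  MSE(RML)={mse_rml:.4f}")
```

Clipping helps most at the endpoints \theta = \theta_0 and \theta = \theta_1, where roughly half of the sample means fall outside the interval and get pulled back toward the truth.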

Hence proved!


Food For Thought

Now, can you find an unbiased estimator for \theta^2? Okay, now it's quite easy, right? But is the estimator you are thinking of the best unbiased estimator? Calculate its variance and also check whether the variance attains the Cramér-Rao Lower Bound.

Give it a try! You may need the help of Stein's Identity.
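
As a numerical warm-up (our own sketch; \theta = 0.7 and n = 25 are assumed values): since E(\bar{X}^2)=\theta^2+\frac{1}{n}, one natural unbiased candidate is \bar{X}^2-\frac{1}{n}, and a quick simulation can confirm its unbiasedness, leaving the variance and the Cramér-Rao comparison to you.

```python
import numpy as np

# Simulation sketch for the Food for Thought (theta and n are assumed
# values). Since E(Xbar^2) = theta^2 + 1/n, the statistic Xbar^2 - 1/n
# is one natural unbiased candidate for theta^2; whether it is best,
# and whether its variance attains the Cramer-Rao lower bound, is
# left as the exercise asks.
rng = np.random.default_rng(2)
theta, n, reps = 0.7, 25, 200_000

xbar = rng.normal(loc=theta, scale=1.0, size=(reps, n)).mean(axis=1)
estimates = xbar ** 2 - 1.0 / n
print("target theta^2:", theta ** 2)        # 0.49
print("mean estimate :", estimates.mean())  # close to 0.49 if unbiased
```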


Similar Problems and Solutions



ISI MStat PSB 2008 Problem 10
Outstanding Statistics Program with Applications

Subscribe to Cheenta at Youtube

