
# Restricted Maximum Likelihood Estimator | ISI MStat PSB 2012 Problem 9

This is a very beautiful sample problem from ISI MStat PSB 2012 Problem 9. It is about restricted MLEs and how they differ from unrestricted ones; if you miss the delicacies, you may miss the differences too. Try it, but be careful!

## Problem: ISI MStat PSB 2012 Problem 9

Suppose \(X_1\) and \(X_2\) are i.i.d. Bernoulli random variables with parameter \(p\), where it is known that \(\frac{1}{3} \le p \le \frac{2}{3}\). Find the maximum likelihood estimator of \(p\) based on \(X_1\) and \(X_2\).

### Prerequisites

Bernoulli trials

Restricted Maximum Likelihood Estimators

Real Analysis

## Solution

This problem seems quite simple, and it is, if and only if one observes the subtle details. Let's first think about the unrestricted MLE of \(p\).

Let the unrestricted MLE of \(p\) (i.e. when \(0 \le p \le 1\)) based on \(X_1\) and \(X_2\) be \(\hat{p}\); then \(\hat{p} = \frac{X_1 + X_2}{2} = \bar{X}\). (How??)

Now let's see the contradictions which may occur if we don't modify \(\hat{p}\) to a restricted MLE \(\hat{p}^*\) (as the problem asks).

See that if our sample comes out such that \(X_1 = X_2 = 0\) or \(X_1 = X_2 = 1\), then \(\hat{p}\) will be \(0\) and \(1\) respectively, whereas \(p\), the actual parameter, takes neither the value \(0\) nor \(1\), since \(\frac{1}{3} \le p \le \frac{2}{3}\). So \(\hat{p}\) needs serious improvement!

To modify \(\hat{p}\), let's observe the log-likelihood function of the Bernoulli based on the two samples: \(\log L(p) = (X_1 + X_2)\log p + (2 - X_1 - X_2)\log(1-p)\). Now make two observations. When \(X_1 + X_2 = 0\) (i.e. \(\bar{X} = 0\)), \(\log L(p) = 2\log(1-p)\), which decreases as \(p\) increases; hence, under the given restriction, the log-likelihood is maximized when \(p\) is least, i.e. \(\hat{p}^* = \frac{1}{3}\).

Similarly, when \(X_1 + X_2 = 2\) (i.e. \(\bar{X} = 1\)), \(\log L(p) = 2\log p\), so for the log-likelihood to be maximum, \(p\) has to be maximum, i.e. \(\hat{p}^* = \frac{2}{3}\). (And when \(X_1 + X_2 = 1\), the unrestricted maximizer \(\hat{p} = \frac{1}{2}\) already lies inside \([\frac{1}{3}, \frac{2}{3}]\), so \(\hat{p}^* = \frac{1}{2}\).)
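The two boundary cases can be checked numerically. The sketch below (an illustration, not part of the exam solution) maximizes the Bernoulli log-likelihood over a grid on the restricted range \([1/3, 2/3]\) and confirms that the maximizer sits at the left endpoint when both observations are \(0\) and at the right endpoint when both are \(1\):

```python
import math

def log_lik(p, x1, x2):
    """Bernoulli log-likelihood of p given two observations x1, x2."""
    s = x1 + x2
    return s * math.log(p) + (2 - s) * math.log(1 - p)

# Grid over the restricted parameter range [1/3, 2/3].
grid = [1/3 + k * (1/3) / 1000 for k in range(1001)]

# Case X1 + X2 = 0: log L(p) = 2*log(1-p) is decreasing in p,
# so the maximizer should be the left endpoint p = 1/3.
best0 = max(grid, key=lambda p: log_lik(p, 0, 0))

# Case X1 + X2 = 2: log L(p) = 2*log(p) is increasing in p,
# so the maximizer should be the right endpoint p = 2/3.
best2 = max(grid, key=lambda p: log_lik(p, 1, 1))

print(best0, best2)  # approximately 1/3 and 2/3
```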

So, to write \(\hat{p}^*\) compactly in terms of \(\hat{p}\), we can develop a linear relationship between \(\hat{p}^*\) and \(\bar{X}\) (linear because the relationship between \(\hat{p}\) and \(\bar{X}\) is linear). The point \((\bar{X}, \hat{p}^*)\) lies on the line joining \((0, \frac{1}{3})\) (since when \(\bar{X} = 0\), \(\hat{p}^* = \frac{1}{3}\)) and \((1, \frac{2}{3})\). Hence the line is \(\hat{p}^* = \frac{1}{3} + \frac{\bar{X}}{3} = \frac{1 + \bar{X}}{3}\), and this \(\hat{p}^*\) is the required restricted MLE.
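As a sanity check on the closed form (again a sketch, not part of the original solution), the sample mean \(\bar{X}\) can only take the values \(0\), \(\frac{1}{2}\), and \(1\), so we can compare \(\frac{1+\bar{X}}{3}\) against a brute-force maximization of the log-likelihood over \([1/3, 2/3]\) for every possible sample:

```python
import math

def log_lik(p, s):
    """Bernoulli log-likelihood with s = X1 + X2 successes in 2 trials."""
    return s * math.log(p) + (2 - s) * math.log(1 - p)

# Fine grid over the restricted range [1/3, 2/3].
grid = [1/3 + k * (1/3) / 10000 for k in range(10001)]

for s in (0, 1, 2):                  # all possible values of X1 + X2
    xbar = s / 2
    closed_form = (1 + xbar) / 3     # the restricted MLE derived above
    brute = max(grid, key=lambda p: log_lik(p, s))
    assert abs(brute - closed_form) < 1e-3, (s, brute, closed_form)
```

The agreement holds only because \(\bar{X}\) is discrete here; for a continuous estimator, the restricted MLE would instead be the truncation \(\max(\frac{1}{3}, \min(\frac{2}{3}, \bar{X}))\), which coincides with the linear form at these three points.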

Hence the solution concludes.

## Food For Thought

Can you find the conditions under which maximum likelihood estimators are also unbiased estimators of the parameter? For which distributions do you think this condition holds? Are they also Minimum Variance Unbiased Estimators?

Can you give some examples where MLEs are not unbiased? Even if they are not unbiased, are they sufficient?


