This is a very simple sample problem from ISI MStat PSB 2018 Problem 9. It is mainly based on ordinary least squares estimation and maximum likelihood estimation of regression parameters. Try it!
Suppose $\{y_i : 1 \le i \le n\}$ satisfies the regression model
$$ y_i = \alpha + \beta x_i + \epsilon_i, \qquad i = 1, 2, \ldots, n, $$
where $\{x_i : 1 \le i \le n\}$ are fixed constants and $\{\epsilon_i : 1 \le i \le n\}$ are i.i.d. $N(0, \sigma^2)$ errors, where $\alpha$, $\beta$ and $\sigma^2 > 0$ are unknown parameters.
(a) Let $\tilde{\alpha}$ denote the least squares estimate of $\alpha$ obtained assuming $\beta = 5$. Find the mean squared error (MSE) of $\tilde{\alpha}$ in terms of the model parameters.
(b) Obtain the maximum likelihood estimator of this MSE.
Prerequisites:
Normal Distribution
Ordinary Least Squares Estimates
Maximum Likelihood Estimates
Solution:
This problem is simple enough. The given model is
$$ y_i = \alpha + \beta x_i + \epsilon_i, \qquad i = 1, 2, \ldots, n. $$
The scenario is even simpler here, since it is given that $\beta = 5$, so our model reduces to
$$ y_i = \alpha + 5x_i + \epsilon_i, \qquad i = 1, 2, \ldots, n, $$
where the $\epsilon_i$'s are i.i.d. $N(0, \sigma^2)$.
Now we know that the Ordinary Least Squares (OLS) estimate of $\alpha$ is $\hat{\alpha} = \bar{y} - \hat{\beta}\bar{x}$ (How??), where $\hat{\beta}$ is, in general, the OLS estimate of $\beta$. But here $\beta = 5$ is known, so
$$ \tilde{\alpha} = \bar{y} - 5\bar{x}. $$
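To answer the "How??" above, here is a quick sketch: with $\beta = 5$ treated as known, the least squares criterion depends on $\alpha$ alone, and setting its derivative to zero gives the estimate directly.
$$ S(\alpha) = \sum_{i=1}^{n} (y_i - \alpha - 5x_i)^2, \qquad \frac{dS}{d\alpha} = -2\sum_{i=1}^{n} (y_i - \alpha - 5x_i) = 0 \;\Longrightarrow\; n\alpha = \sum_{i=1}^{n} y_i - 5\sum_{i=1}^{n} x_i \;\Longrightarrow\; \tilde{\alpha} = \bar{y} - 5\bar{x}. $$
Since $\frac{d^2S}{d\alpha^2} = 2n > 0$, this is indeed the minimizer.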
Again,
$$ E(\tilde{\alpha}) = E(\bar{y}) - 5\bar{x} = \alpha + \beta\bar{x} - 5\bar{x} = \alpha + (\beta - 5)\bar{x}, $$
hence $\tilde{\alpha}$ is a biased estimator for $\alpha$ with $\text{Bias}(\tilde{\alpha}) = (\beta - 5)\bar{x}$.
So, the Mean Squared Error (MSE) of $\tilde{\alpha}$ is
$$ MSE(\tilde{\alpha}) = E(\tilde{\alpha} - \alpha)^2 = Var(\tilde{\alpha}) + \big(\text{Bias}(\tilde{\alpha})\big)^2 = \frac{\sigma^2}{n} + (\beta - 5)^2\bar{x}^2 $$
[as $Var(\tilde{\alpha}) = Var(\bar{y}) = \frac{\sigma^2}{n}$ follows clearly from the model, and the $x_i$'s are non-stochastic].
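For completeness, here is the bias-variance decomposition used above; the cross term vanishes because $E(\tilde{\alpha}) - \alpha$ is a constant, and the variance comes from the errors alone since the $x_i$'s are fixed.
$$ E(\tilde{\alpha} - \alpha)^2 = E\big[(\tilde{\alpha} - E\tilde{\alpha}) + (E\tilde{\alpha} - \alpha)\big]^2 = Var(\tilde{\alpha}) + \big(\text{Bias}(\tilde{\alpha})\big)^2, \qquad Var(\tilde{\alpha}) = Var(\bar{y} - 5\bar{x}) = Var(\bar{y}) = \frac{1}{n^2}\sum_{i=1}^{n} Var(\epsilon_i) = \frac{\sigma^2}{n}. $$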
(b) The last part follows directly from the note I provided at the end of part (a), that is,
$$ MSE(\tilde{\alpha}) = \frac{\sigma^2}{n} + (\beta - 5)^2\bar{x}^2, $$
and we have to find the Maximum Likelihood Estimators of $\sigma^2$ and $\beta$ and then use the invariance property of the MLE (plugging them into the MSE obtained in (a)). I leave it as an exercise!! Finish it yourself!
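A minimal sketch of the invariance step, assuming you have already maximized the likelihood of the full model (that part is still left to you): if $\hat{\beta}_{MLE}$ and $\hat{\sigma}^2_{MLE}$ denote the MLEs of $\beta$ and $\sigma^2$ under the model with $\alpha$, $\beta$ and $\sigma^2$ all unknown, then by invariance the MLE of the MSE is obtained by plugging them in,
$$ \widehat{MSE}(\tilde{\alpha}) = \frac{\hat{\sigma}^2_{MLE}}{n} + \big(\hat{\beta}_{MLE} - 5\big)^2\bar{x}^2. $$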
Food for Thought:
Suppose you don't even know the value of $\beta$. What will be the MSE of $\tilde{\alpha}$ in that case?
Also, find the OLS estimate of $\beta$ (you have already done it for $\alpha$), and then find the MLEs of all of $\alpha$, $\beta$ and $\sigma^2$. Are the OLS estimates identical to the MLEs you obtained? Which assumption induces this coincidence? What do you think!!