This is a very beautiful sample problem from ISI MStat PSB 2004 Problem 6. It's a very simple problem, and its simplicity is its beauty. Fun to think about, so go for it!
Let $X_1, X_2, X_3$ and $X_4$ be four uncorrelated random variables with
$E(X_i) = \mu, \quad Var(X_i) = \sigma^2, \quad i = 1, 2, 3, 4,$
where $\mu$ and $\sigma$ $(>0)$ are unknown parameters. Find the values of $c_1, c_2, c_3$ and $c_4$ for which $\sum_{i=1}^{4} c_iX_i$ is unbiased for $\mu$ and has least variance.
Prerequisites:
Unbiased estimators
Minimum-variance estimators
Cauchy–Schwarz inequality
This is a very simple and cute problem; just do as it says...
For $\sum_{i=1}^{4} c_iX_i$ to be an unbiased estimator of $\mu$, it must satisfy
$E\left(\sum_{i=1}^{4} c_iX_i\right) = \sum_{i=1}^{4} c_i\,E(X_i) = \mu\sum_{i=1}^{4} c_i = \mu,$
so, $\sum_{i=1}^{4} c_i = 1$. ......................(1)
So, we have to find $c_1, c_2, c_3$ and $c_4$ such that (1) is satisfied. But hold on, there are some other conditions too.
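Condition (1) is easy to sanity-check numerically. A minimal sketch, assuming independent normal $X_i$ as a convenient special case of uncorrelated variables; the value $\mu = 5$ and the weight vectors are illustrative choices of mine, not part of the problem:

```python
import numpy as np

# Illustrating condition (1): with E(X_i) = mu, the statistic sum c_i X_i has
# expectation mu * sum(c_i), so it is unbiased for mu exactly when sum(c_i) = 1.
# Independent N(mu, 1) draws with mu = 5 are illustrative, not given in the problem.
rng = np.random.default_rng(1)
mu = 5.0
X = rng.normal(mu, 1.0, size=(1_000_000, 4))

c_unbiased = np.array([0.4, 0.3, 0.2, 0.1])   # sum = 1, so unbiased
c_biased = np.array([0.5, 0.5, 0.5, 0.5])     # sum = 2, so biased

print((X @ c_unbiased).mean())   # close to mu = 5
print((X @ c_biased).mean())     # close to 2 * mu = 10
```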
Again, since the given estimator must also have minimum variance, let's calculate the variance of $\sum_{i=1}^{4} c_iX_i$. Since the $X_i$ are uncorrelated, the cross terms vanish, and
$Var\left(\sum_{i=1}^{4} c_iX_i\right) = \sum_{i=1}^{4} c_i^2\,Var(X_i) = \sigma^2\sum_{i=1}^{4} c_i^2.$ ...............................................(2)
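Formula (2) can also be verified by simulation. A quick sketch, again assuming independent normals as a special case of uncorrelated variables; $\mu = 2$, $\sigma = 3$ and the weights are made-up illustrative values:

```python
import numpy as np

# Sanity check of (2): for uncorrelated X_i with common variance sigma^2,
# Var(sum c_i X_i) = sigma^2 * sum(c_i^2).  Independent N(mu, sigma) draws
# with mu = 2, sigma = 3 are illustrative, not part of the problem.
rng = np.random.default_rng(42)
mu, sigma = 2.0, 3.0
c = np.array([0.1, 0.2, 0.3, 0.4])            # weights with sum(c) = 1

X = rng.normal(mu, sigma, size=(1_000_000, 4))
est = X @ c

print(est.var())                    # empirical variance of the estimator
print(sigma**2 * np.sum(c**2))      # formula (2): 9 * 0.3 = 2.7
```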
So, for minimum variance, $\sum_{i=1}^{4} c_i^2$ must be minimum in (2).
So, we must find $c_1, c_2, c_3$ and $c_4$ such that (1) is satisfied and $\sum_{i=1}^{4} c_i^2$ in (2) is minimum. That is, we are minimizing $\sum_{i=1}^{4} c_i^2$ when it is given that $\sum_{i=1}^{4} c_i = 1$.
What do you think should be our technique for minimizing $\sum_{i=1}^{4} c_i^2$?
For me, the beauty of the problem is hidden in this part, the minimizing of the variance. Can't we think of the Cauchy–Schwarz inequality to find the minimum of $\sum_{i=1}^{4} c_i^2$?
So, using the Cauchy–Schwarz inequality, we have
$1 = \left(\sum_{i=1}^{4} c_i \cdot 1\right)^2 \le \left(\sum_{i=1}^{4} c_i^2\right)\left(\sum_{i=1}^{4} 1^2\right) = 4\sum_{i=1}^{4} c_i^2,$
i.e. $\sum_{i=1}^{4} c_i^2 \ge \frac{1}{4}$. ...........(3) [since $\sum_{i=1}^{4} c_i = 1$, by (1)]
Now, since $\sum_{i=1}^{4} c_i^2$ is to be minimum, the equality in (3) must hold, i.e.
$\sum_{i=1}^{4} c_i^2 = \frac{1}{4}$.
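The bound in (3) can also be probed numerically: draw random weight vectors, rescale each so it sums to 1, and watch $\sum_{i=1}^{4} c_i^2$ never fall below $\frac{1}{4}$. A small sketch (random search only; it illustrates the bound, it does not prove it):

```python
import numpy as np

# Empirical probe of (3): among weight vectors rescaled to sum to 1,
# sum(c_i^2) never falls below 1/4, and equal weights attain exactly 1/4.
rng = np.random.default_rng(0)

C = rng.uniform(-1.0, 1.0, size=(100_000, 4))
C = C / C.sum(axis=1, keepdims=True)     # rescale each row so sum(c) = 1
vals = (C ** 2).sum(axis=1)

print(vals.min())                        # never below 0.25
print(np.sum(np.full(4, 0.25) ** 2))     # equal weights give exactly 0.25
```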
And we know the equality condition of the Cauchy–Schwarz inequality: it holds when $c_i = k$ (say) for $i = 1, 2, 3, 4$, where $k$ is some constant.
Again, since $\sum_{i=1}^{4} c_i = 4k = 1$ by (1), we must have $k = \frac{1}{4}$, i.e. $c_1 = c_2 = c_3 = c_4 = \frac{1}{4}$, so the minimum-variance unbiased choice is just the sample mean $\bar{X} = \frac{1}{4}\sum_{i=1}^{4} X_i$. Hence the solution concludes.
Let's deal with some more inequalities and behave Normal!
Using Chebyshev's inequality we can find a trivial upper bound for $P(|Z| \ge t)$, where $Z \sim N(0,1)$ and $t > 0$ (really!! what's the bound?). But what about some non-trivial bounds, sharper ones perhaps!! Can you show the following:
$P(|Z| \ge t) \le \sqrt{\frac{2}{\pi}}\,\frac{e^{-t^2/2}}{t}$ for all $t > 0$?
Also, verify that this upper bound is sharper than the trivial upper bound one can obtain.
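To see the comparison concretely, the exact two-sided normal tail $P(|Z| \ge t) = \mathrm{erfc}(t/\sqrt{2})$ can be computed against both bounds. A small sketch (the function names are mine, chosen for this illustration):

```python
import math

# Compare, for a few values of t, the exact two-sided normal tail
# P(|Z| >= t) = erfc(t / sqrt(2)) with the trivial Chebyshev bound 1/t^2
# and the sharper bound sqrt(2/pi) * exp(-t^2/2) / t.
def exact_tail(t):
    return math.erfc(t / math.sqrt(2))

def chebyshev_bound(t):
    return 1.0 / t**2

def sharper_bound(t):
    return math.sqrt(2.0 / math.pi) * math.exp(-t**2 / 2.0) / t

for t in [0.5, 1.0, 2.0, 3.0]:
    print(t, exact_tail(t), sharper_bound(t), chebyshev_bound(t))
# For every t > 0: exact_tail(t) <= sharper_bound(t) < chebyshev_bound(t).
```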