
ISI MStat PSB 2006 Problem 9 | Consistency and MVUE

This is a fairly simple problem from ISI MStat PSB 2006 Problem 9. It is based on point estimation: finding a consistent estimator and a minimum variance unbiased estimator, and recognizing the subtle relation between the two. Go for it!

Problem: ISI MStat PSB 2006 Problem 9


Let \(X_1,X_2,\ldots\) be i.i.d. random variables with density \(f_{\theta}(x), \ x \in \mathbb{R} \), \(\theta \in (0,1)\) being the unknown parameter. Suppose that there exists an unbiased estimator \(T\) of \(\theta\) based on a sample of size 1, i.e. \(E_{\theta}(T(X_1))=\theta \). Assume that \(Var(T(X_1))< \infty \).

(a) Find an estimator \(V_n\) for \(\theta\) based on \(X_1,X_2,\ldots,X_n\) such that \(V_n\) is consistent for \(\theta\).

(b) Let \(S_n\) be the MVUE (minimum variance unbiased estimator) of \(\theta\) based on \(X_1,X_2,\ldots,X_n\). Show that \(\lim_{n\to\infty}Var(S_n)=0\).

Prerequisites


Consistent estimators

Minimum Variance Unbiased Estimators

Rao-Blackwell Theorem

Solution:

Often, problems on estimation seem a bit complicated and we feel directionless, but in most cases it is beneficial to simply go with the flow.

Here, it is given that \(T\) is an unbiased estimator of \(\theta \) based on one observation, and we are to find a consistent estimator for \(\theta \) based on a sample of size \(n\). First, we should ask: what is required of an estimator to be consistent? The following two conditions together are sufficient (a short justification is sketched after the list).

  • The estimator \(V_n\) has to be asymptotically unbiased for \(\theta \), i.e. \(\lim_{n \uparrow \infty} E_{\theta}(V_n)=\theta \).
  • The variance of \(V_n\) must converge to \(0\) as \(n\) grows large, i.e. \(\lim_{n \uparrow \infty}Var_{\theta}(V_n)=0 \).
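
Why do these two conditions suffice? Here is a minimal sketch via Chebyshev's inequality (a standard argument, spelled out for completeness rather than taken from the original solution): for any \(\epsilon > 0\),

\( P_{\theta}(|V_n - \theta| > \epsilon) \le \frac{E_{\theta}[(V_n - \theta)^2]}{\epsilon^2} = \frac{Var_{\theta}(V_n) + (E_{\theta}(V_n) - \theta)^2}{\epsilon^2} \longrightarrow 0 \)

as \(n \uparrow \infty\), since the first term in the numerator vanishes by the second condition and the second by the first. Hence \(V_n \to \theta\) in probability, which is precisely consistency.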

First things first, let us fulfill the unbiasedness criterion for \(V_n\). From each observation of the sample \(X_1,X_2,\ldots,X_n\) of size \(n\), we get a set of \(n\) unbiased estimators of \(\theta \): \( T(X_1), T(X_2), \ldots, T(X_n)\). So, can we write \(V_n=\frac{1}{n} \sum_{i=1}^n(T(X_i)+a)\), where \(a\) is a constant (kept for generality)? Can you verify that \(V_n\) satisfies the first requirement of being a consistent estimator?
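
To spell out the verification invited above (by linearity of expectation):

\( E_{\theta}(V_n) = \frac{1}{n} \sum_{i=1}^n \left( E_{\theta}(T(X_i)) + a \right) = \frac{1}{n} \sum_{i=1}^n (\theta + a) = \theta + a, \)

so \(V_n\) meets the first requirement exactly when \(a = 0\); the constant \(a\) is carried along only to make this point visible.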

Now we proceed towards the final requirement, that the variance of \(V_n\) converges to \(0\) as \(n \uparrow \infty\). Since we have defined \(V_n\) in terms of \(T\), it is given that \(Var(T(X_i))\) exists for \( i \in \mathbb{N}\), and \(X_1,X_2,\ldots,X_n\) are i.i.d. (a very important observation here), we are led to

\(Var(V_n)= \frac{Var(T(X_1))}{n}\) (why? see the sketch below). So, clearly, \(Var(V_n) \downarrow 0\) as \( n \uparrow \infty\), fulfilling both conditions required for consistency. So, \(V_n= \frac{1}{n}\sum_{i=1}^n T(X_i)\) (taking \(a=0\), as unbiasedness demands) is a consistent estimator for \(\theta \).
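
For the variance step, a minimal sketch: adding a constant does not change variance, and for independent random variables the variance of a sum is the sum of the variances, so

\( Var(V_n) = \frac{1}{n^2} \sum_{i=1}^n Var(T(X_i)) = \frac{n \, Var(T(X_1))}{n^2} = \frac{Var(T(X_1))}{n}, \)

where the last two equalities use that the \(X_i\)'s are identically distributed.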

(b) For this part one may also use the Rao-Blackwell theorem, but I always prefer using as few formulas and theorems as possible, and in this case we can get what is required from the previous part. Since \(S_n\) is given to be the MVUE of \(\theta \), and our \(V_n\) is unbiased for \(\theta \) with variance tending to \(0\), the defining property of the MVUE gives

\(Var(S_n) \le Var(V_n) \) for every \(n\), so as \(n\) gets bigger, \( \lim_{n \to \infty} Var(S_n) \le \lim_{n \to \infty} Var(V_n) = 0 \Rightarrow \lim_{n \to \infty}Var(S_n) \le 0\).

Again, \(Var(S_n) \ge 0\), so \(\lim_{n \to \infty }Var(S_n)= 0\). Hence we conclude.
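
As a sanity check, here is a minimal simulation sketch (mine, not part of the original solution) for one concrete choice of the family: \(X_i \sim \mathrm{Bernoulli}(\theta)\) with \(T(x) = x\), so that \(E_{\theta}(T(X_1)) = \theta\) as the problem assumes. The function name and the value of \(\theta\) are illustrative choices; the code estimates the mean and variance of \(V_n\) empirically and compares the latter with \(Var(T(X_1))/n\).

```python
import numpy as np

rng = np.random.default_rng(0)

# Part (a) estimator with T(x) = x and a = 0: the sample mean of T(X_i).
def v_n(sample):
    return sample.mean()

theta = 0.3                    # true parameter (known only to the simulation)
var_t = theta * (1 - theta)    # Var(T(X_1)) for the Bernoulli family

for n in (10, 100, 1000, 10000):
    # Draw 2000 independent samples of size n and compute V_n on each.
    estimates = np.array([v_n(rng.binomial(1, theta, size=n))
                          for _ in range(2000)])
    print(f"n={n:6d}  mean(V_n)={estimates.mean():.4f}  "
          f"var(V_n)={estimates.var():.6f}  Var(T(X_1))/n={var_t/n:.6f}")
```

The empirical mean stays near \(\theta\) while the empirical variance tracks \(Var(T(X_1))/n\), which is the two-condition argument from part (a) in action.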


Food For Thought

Let's extend this problem a little bit, just to increase the fun!!

Let \(X_1,\ldots,X_n\) be independent but not identically distributed, while each of \(T(X_1),T(X_2),\ldots,T(X_n)\) remains unbiased for \(\theta\), with \(Var(T(X_i)) = {\sigma_i}^2 \), and

\( Cov(T(X_i),T(X_j))=0\) if \( i \neq j \).

Can you show that among all estimators of the form \( \sum a_i T(X_i)\), where the \(a_i\)'s are constants and \(E_{\theta}(\sum a_i T(X_i))=\theta\), the estimator

\(T^* = \frac{\sum \frac{T(X_i)}{{\sigma_i}^2}}{\sum\frac{1}{{\sigma_i}^2}} \) has the minimum variance?

Can you find its variance? Think it over!!
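
A possible way in (one route among several, and deliberately not a full solution): unbiasedness for every \(\theta \in (0,1)\) forces \(\sum a_i = 1\), and the zero-covariance assumption gives \(Var(\sum a_i T(X_i)) = \sum a_i^2 {\sigma_i}^2\). So the question reduces to the constrained minimization

\( \text{minimize } \sum_{i=1}^n a_i^2 {\sigma_i}^2 \quad \text{subject to } \sum_{i=1}^n a_i = 1, \)

which yields weights \(a_i \propto \frac{1}{{\sigma_i}^2}\) via a Lagrange multiplier or the Cauchy-Schwarz inequality, i.e. exactly \(T^*\). The variance itself is left for you to compute.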

