This is a very elegant sample problem from ISI MStat PSB 2010 Problem 10. It is mostly based on properties of the uniform distribution and its behaviour when modified. Try it!
Let \(X\) be a random variable uniformly distributed over \((0, 2\theta)\), \(\theta > 0\), and let \(Y = \max(X, 2\theta - X)\).
(a) Find \(\mu =E(Y)\).
(b) Let \(X_1, X_2, \ldots, X_n\) be a random sample from the above distribution with unknown \(\theta\). Find two distinct unbiased estimators of \(\mu\), as defined in (a), based on the entire sample.
Uniform Distribution
Law of Total Expectation
Unbiased Estimators
Well, this is a fairly straightforward problem, where we just need to be careful about the way \(Y\) is defined.
We need \(E(Y)\), and by the definition of \(Y\) we see that \(Y\) is a function of \(X\), where \(X \sim Unif(0, 2\theta)\).
So, using the Law of Total Expectation,
\(E(Y) = E(X|X > 2\theta - X)P(X > 2\theta - X) + E(2\theta - X|X \le 2\theta - X)P(X \le 2\theta - X)\).
Observe that \(X > 2\theta - X\) if and only if \(X > \theta\), and that \(P(X \le \theta) = \frac{1}{2}\). (Why?)
Also, the conditional pdf of \(X|X > \theta\) is
\(f_{X|X>\theta}(x) = \frac{f_X(x)}{P(X>\theta)} = \frac{1}{\theta}, \quad \theta < x \le 2\theta\), where \(f_X\) is the pdf of \(X\).
The other conditional pdf, that of \(X|X \le \theta\), is likewise \(\frac{1}{\theta}\) on \((0, \theta)\), by symmetry. (Verify!)
So, \(E(Y) = \frac{1}{2}E(X|X \sim Unif(\theta, 2\theta)) + \frac{1}{2}\left(2\theta - E(X|X \sim Unif(0, \theta))\right) = \frac{1}{2}\left(\frac{3\theta}{2} + 2\theta - \frac{\theta}{2}\right) = \frac{3\theta}{2}\).
Hence, \(\mu = \frac{3\theta}{2}\).
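If you want a quick empirical sanity check of this value, a minimal Monte Carlo sketch (with \(\theta = 1\) as an arbitrary illustrative choice) could look like this:

```python
import numpy as np

# A minimal Monte Carlo sanity check of E(Y) = 3*theta/2.
# theta = 1 is an arbitrary illustrative choice.
rng = np.random.default_rng(42)
theta = 1.0

x = rng.uniform(0, 2 * theta, size=1_000_000)  # X ~ Unif(0, 2*theta)
y = np.maximum(x, 2 * theta - x)               # Y = max(X, 2*theta - X)

print(y.mean())       # should be close to 1.5
print(3 * theta / 2)  # theoretical value of mu
```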
Now, for the next part, one trivial unbiased estimator of \(\theta\) is \(T_n = \frac{1}{n}\sum_{i=1}^n X_i\) (based on the given sample), since \(E(X_i) = \theta\). So,
\(\frac{3T_n}{2}=\frac{3}{2n}\sum_{i=1}^n X_i \) is an obvious unbiased estimator of \(\mu\).
For another, we need to step away from the conventional approach and look at the order statistics, since we know that \(X_{(n)}\) is sufficient for \(\theta\). (Don't know why? Look up the Factorization Theorem.)
So, verify that \(E(X_{(n)})=\frac{2n}{n+1}\theta\).
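A sketch of that verification: since the \(X_i\) are i.i.d. \(Unif(0, 2\theta)\), the cdf of \(X_{(n)}\) is \(P(X_{(n)} \le x) = \left(\frac{x}{2\theta}\right)^n\) for \(0 < x < 2\theta\), so its pdf is \(\frac{n x^{n-1}}{(2\theta)^n}\), and hence \(E(X_{(n)}) = \int_0^{2\theta} x \cdot \frac{n x^{n-1}}{(2\theta)^n}\, dx = \frac{n}{n+1} \cdot 2\theta = \frac{2n}{n+1}\theta\).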
Hence, \(\frac{n+1}{2n}X_{(n)}\) is another unbiased estimator of \(\theta\), and so \(\frac{3(n+1)}{4n}X_{(n)}\) is another unbiased estimator of \(\mu\) as defined in (a).
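If you like, here is a minimal simulation sketch (with arbitrary illustrative choices \(\theta = 1\) and \(n = 10\)) checking that both estimators average to \(\mu = \frac{3\theta}{2}\):

```python
import numpy as np

# A minimal sketch comparing the two unbiased estimators of mu = 3*theta/2.
# theta = 1 and n = 10 are arbitrary illustrative choices.
rng = np.random.default_rng(0)
theta, n, reps = 1.0, 10, 200_000

samples = rng.uniform(0, 2 * theta, size=(reps, n))  # each row is X_1, ..., X_n

est1 = 1.5 * samples.mean(axis=1)                    # (3/2) * sample mean
est2 = 3 * (n + 1) / (4 * n) * samples.max(axis=1)   # 3(n+1)/(4n) * X_(n)

print(est1.mean(), est2.mean())  # both should be close to 1.5 = 3*theta/2
print(est1.var(), est2.var())    # the X_(n)-based estimator has smaller variance here
```

Notice that the estimator built on the sufficient statistic \(X_{(n)}\) comes out with the smaller variance, which is exactly what sufficiency-based intuition suggests.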
Hence the solution concludes.
Let us think about an unpopular but very beautiful relationship involving discrete random variables, beyond the Universality of the Uniform. Let \(X\) be a discrete random variable with cdf \(F_X(x)\), and define the random variable \(Y = F_X(X)\).
Can you verify that \(Y\) is stochastically greater than a uniform(0,1) random variable \(U\)? That is,
\(P(Y>y) \ge P(U>y)=1-y\) for all \(y\), \(0<y<1\),
\(P(Y>y) > P(U>y) =1-y \), for some \(y\), \(0<y<1\).
Hint: Draw a typical picture of a discrete cdf and observe the jump points; they may help you jump to the solution! Think it over.
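As a small illustration (not a proof), here is a simulation sketch using a hypothetical Bernoulli(\(p\)) choice of \(X\); the value \(p = 0.3\) and the thresholds below are arbitrary:

```python
import numpy as np

# A minimal sketch of the teaser with X ~ Bernoulli(p), a hypothetical choice;
# p = 0.3 and the thresholds below are arbitrary.
rng = np.random.default_rng(7)
p = 0.3
size = 1_000_000

x = rng.binomial(1, p, size=size)  # X ~ Bernoulli(p)
y = np.where(x == 1, 1.0, 1 - p)   # Y = F_X(X): F_X(0) = 1-p, F_X(1) = 1
u = rng.uniform(size=size)         # U ~ Uniform(0, 1)

for t in (0.2, 0.5, 0.8, 0.95):
    # P(Y > t) should be >= 1 - t for every t in (0, 1),
    # with equality possible only at the jump point t = 1 - p
    print(t, (y > t).mean(), (u > t).mean())
```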