
April 2, 2020

Maximum Likelihood Estimation | ISI MStat 2017 PSB Problem 8

This problem, based on Maximum Likelihood Estimation, gives a detailed solution to ISI MStat 2017 PSB Problem 8, with a touch of simulation and code.

Problem

Let \(\theta>0\) be an unknown parameter, and \(X_{1}, X_{2}, \ldots, X_{n}\) be a random sample from the distribution with density

\(
f(x) = \begin{cases}
\frac{2x}{\theta^{2}} & , 0 \leq x \leq \theta \\
0 & , \text { otherwise }
\end{cases}
\)

Find the maximum likelihood estimator of \(\theta\) and its mean squared error.

Prerequisites

  • Method of finding the MLE of \( \theta\) for U\((0, \theta)\)
  • Order Statistics \( X_{(1)}, X_{(2)}, \ldots, X_{(n)} \)
  • Mean Squared Error

Solution

Do you remember the method of finding the MLE of \( \theta\) for U\((0, \theta)\)? Just proceed along similar lines.

\( L(\theta) = f\left(x_{1}, \ldots, x_{n} \mid \theta\right) \overset{ X_{1}, X_{2}, \ldots, X_{n} \text{ are iid}}{=} f\left(x_{1} \mid \theta\right) \cdots f\left(x_{n} \mid \theta\right) \\ =

\begin{cases}
\frac{ 2^n \prod_{i=1}^{n} x_{i}}{ \theta^{2n}} & , 0 \leq x_{(1)} \leq x_{(2)} \leq \ldots \leq x_{(n)} \leq \theta \\
0 & , \text { otherwise }
\end{cases}
\)

Let's draw the diagram.

[Figure: graph of the likelihood function \( L(\theta) \)]

Thus, you can see that \( L(\theta) \) is zero for \( \theta < x_{(n)} \) and strictly decreasing in \( \theta \) for \( \theta \geq x_{(n)} \), so it is maximized at \( \theta = X_{(n)}\).

Hence, \( \hat{\theta}_{mle} = X_{(n)}\).
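If you would like to see this picture concretely, here is a minimal R sketch (the values \( \theta = 1 \), \( n = 10 \), and the grid are assumptions chosen purely for illustration): it evaluates \( L(\theta) \) for one simulated sample and marks \( X_{(n)} \), where the curve peaks.

set.seed(42)
n = 10                                  # illustrative sample size (an assumption, not from the problem)
true_theta = 1                          # illustrative true value of theta
x = true_theta * sqrt(runif(n))         # inverse transformation method: draws from f(x) = 2x/theta^2
theta_grid = seq(0.01, 2, by = 0.01)
L = sapply(theta_grid, function(t) if (max(x) <= t) 2^n * prod(x) / t^(2 * n) else 0)
plot(theta_grid, L, type = "l", xlab = "theta", ylab = "L(theta)")
abline(v = max(x), lty = 2)             # L(theta) peaks exactly at X_(n)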

MSE

Now, we need to find the distribution of \( X_{(n)}\).

For that, we need to find the distribution function of \(X_i\).

Observe \( F_{X_i}(x) = \begin{cases}
0 & , x < 0 \\
\frac{x^2}{\theta^{2}} & , 0 \leq x \leq \theta \\
1 & , x > \theta
\end{cases} \)
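As a quick numerical sanity check (taking \( \theta = 1 \) and \( x = 0.6 \) purely for illustration), integrating the density from 0 to \( x \) should reproduce \( \frac{x^2}{\theta^2} \):

theta = 1                               # illustrative value
x = 0.6                                 # an arbitrary point in (0, theta)
integrate(function(t) 2 * t / theta^2, lower = 0, upper = x)$value   # numerical integral of the density: 0.36
x^2 / theta^2                                                        # claimed CDF value: 0.36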

\( F_{X_{(n)}}(x) = P(X_{(n)} \leq x) = P(X_1 \leq x, \ldots, X_n \leq x) \overset{\text{iid}}{=} \left[ F_{X_i}(x) \right]^n = \begin{cases}
0 &, x < 0 \\
\frac{x^{2n}}{\theta^{2n}} & , 0 \leq x \leq \theta \\
1 & , x > \theta
\end{cases} \)

\( f_{X_{(n)}}(x) = \begin{cases}
\frac{2n x^{2n-1}}{\theta^{2n}} & , 0 \leq x \leq \theta \\
0 & , \text { otherwise }
\end{cases} \)
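A short simulation sketch can confirm this density (here \( \theta = 1 \), \( n = 5 \), and 10000 replications are assumed values, chosen only for illustration): the histogram of simulated maxima should sit close to the curve \( \frac{2n x^{2n-1}}{\theta^{2n}} \).

set.seed(1)
theta = 1                               # illustrative value
n = 5                                   # illustrative sample size
xn = replicate(10000, max(theta * sqrt(runif(n))))   # simulated values of X_(n)
hist(xn, freq = FALSE, breaks = 40)                  # empirical density of the sample maximum
curve(2 * n * x^(2 * n - 1) / theta^(2 * n), from = 0, to = theta, add = TRUE, lwd = 2)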

MSE(\( X_{(n)} \)) = E\(\left[(X_{(n)} - \theta)^2\right]\)

= \( \int_{0}^{\theta} (x-\theta)^2 f_{X_{(n)}}(x) \, dx \)

= \( \int_{0}^{\theta} (x-\theta)^2 \frac{2n x^{2n-1}}{\theta^{2n}} \, dx \)

= \( \int_{0}^{\theta} (x^2 + {\theta}^2 - 2x\theta) \frac{2n x^{2n-1}}{\theta^{2n}} \, dx \)

= \( \int_{0}^{\theta} \frac{2n x^{2n+1}}{\theta^{2n}} \, dx \) + \( \int_{0}^{\theta} \frac{2n x^{2n-1}}{\theta^{2n-2}} \, dx \) - \( \int_{0}^{\theta} \frac{4n x^{2n}}{\theta^{2n-1}} \, dx \)

= \( {\theta}^2\left(\frac{2n}{2n+2} + 1 - \frac{4n}{2n+1}\right) = \frac{\theta^2}{(2n+1)(n+1)}\)

Observe that \( \lim_{ n \to \infty} { MSE( X_{(n)})} = 0\).
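Before simulating, you can also check the closed form by numerical integration (taking \( \theta = 1 \); the values of \( n \) below are arbitrary choices for illustration):

theta = 1                               # illustrative value
for (n in c(5, 15, 50)) {
  mse_numeric = integrate(function(x) (x - theta)^2 * 2 * n * x^(2 * n - 1) / theta^(2 * n),
                          lower = 0, upper = theta)$value
  mse_formula = theta^2 / ((2 * n + 1) * (n + 1))
  cat(n, mse_numeric, mse_formula, "\n")   # the two values should agree for every n
}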

Let's add a computing dimension to it and verify it by simulation.

Let's take \( \theta = 1, n = 15\). The MSE is then expected to be around \( \frac{1}{496} \approx 0.002 \). You can change \(\theta\) and \(n\) and play around.

v = NULL
n = 15
theta = 1
for (i in 1:1000) {
  u = runif(n)              # n iid Uniform(0, 1) draws
  s = theta*sqrt(u)         # Inverse Transformation Method: F^{-1}(u) = theta*sqrt(u) gives draws from f(x) = 2x/theta^2
  m = max(s)                # the MLE X_(n) for this sample
  v = c(v, m)
}
hist(v, freq = FALSE)       # empirical distribution of the MLE
mse = mean((v - theta)^2)   # simulated MSE; one run gave 0.001959095
mse
[Figure: histogram of the simulated values of \( X_{(n)} \)]
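For comparison, the theoretical value for \( \theta = 1, n = 15 \) is \( \frac{1}{496} \approx 0.00202 \), so the simulated MSE agrees closely; increasing the number of replications beyond 1000 should bring the two even closer.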

You should also check out this link: Triangle Inequality Problems and Solutions

I hope that helps you. Stay tuned.
