
This post discusses the solutions to the problems from **IIT JAM Mathematical Statistics (MS) 2021 Question Paper - Set B.** You can find solutions in video or written form.

**Note:** This post is getting updated. Stay tuned for solutions, videos, and more.

A sample of size $n$ is drawn randomly (without replacement) from an urn containing $5 n^{2}$ balls, of which $2 n^{2}$ are red balls and $3 n^{2}$ are black balls. Let $X_{n}$ denote the number of red balls in the selected sample. If $\ell=\lim _{n \rightarrow \infty} \frac{E\left(X_{n}\right)}{n}$ and $m=\lim _{n \rightarrow \infty} \frac{\mathrm{Var}\left(X_{n}\right)}{n},$ then which of the following statements is/are TRUE?

Options -

- $\frac{\ell}{m}=\frac{5}{3}$
- $\ell m=\frac{14}{125}$
- $\ell-m=\frac{3}{25}$
- $\ell+m=\frac{16}{25}$

**Answer:** $\frac{\ell}{m}=\frac{5}{3}$; $\ell+m=\frac{16}{25}$
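These limits follow from the exact hypergeometric moments, and a quick numerical check confirms them. The sketch below (the function name `scaled_moments` is ours, not from the paper) computes $E(X_n)/n$ and $\mathrm{Var}(X_n)/n$ with exact fractions:

```python
from fractions import Fraction

# X_n is hypergeometric: N = 5n^2 balls, K = 2n^2 red, sample size n, so
# E(X_n) = n*K/N and Var(X_n) = n*(K/N)*(1 - K/N)*(N - n)/(N - 1).
def scaled_moments(n):
    N, K = 5 * n**2, 2 * n**2
    p = Fraction(K, N)                           # = 2/5 for every n
    mean = n * p
    var = n * p * (1 - p) * Fraction(N - n, N - 1)
    return mean / n, var / n                     # E(X_n)/n, Var(X_n)/n

# E(X_n)/n is exactly 2/5; Var(X_n)/n -> (2/5)(3/5) = 6/25 because the
# finite-population correction (N - n)/(N - 1) -> 1. Hence l = 2/5,
# m = 6/25, so l/m = 5/3 and l + m = 16/25.
for n in (10, 100, 10000):
    e, v = scaled_moments(n)
    print(n, float(e), float(v))
```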

Let $X_{1}, X_{2}, \ldots, X_{n}(n \geq 2)$ be a random sample from a distribution with probability density function

$$
f(x ; \theta)= \begin{cases}\frac{3 x^{2}}{\theta} e^{-x^{3} / \theta}, & x>0 \\ 0, & \text {otherwise}\end{cases}
$$

where $\theta \in(0, \infty)$ is unknown.

If $T=\sum_{i =1}^{n} X_{i}^{3}$, then which of the following statements is/are TRUE?

Options -

- $\frac{n-1}{T}$ is the unique uniformly minimum variance unbiased estimator of $\frac{1}{\theta}$
- $\frac{n}{T}$ is the unique uniformly minimum variance unbiased estimator of $\frac{1}{\theta}$
- $(n-1) \sum_{i=1}^{n} \frac{1}{X_{i}^{3}}$ is the unique uniformly minimum variance unbiased estimator of $\frac{1}{\theta}$
- $\frac{n}{T}$ is the MLE of $\frac{1}{\theta}$

**Answer:**

$\frac{n-1}{T}$ is the unique uniformly minimum variance unbiased estimator of $\frac{1}{\theta}$

$\frac{n}{T}$ is the MLE of $\frac{1}{\theta}$
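Behind this answer: if $X$ has the given density, then $Y=X^{3}$ is exponential with mean $\theta$, so $T \sim \mathrm{Gamma}(n, \theta)$ and $E\left[\frac{1}{T}\right]=\frac{1}{(n-1) \theta}$, making $\frac{n-1}{T}$ unbiased. A Monte Carlo sketch (the values $\theta=2$ and $n=5$ are arbitrary illustration choices):

```python
import random

# Y = X^3 is Exponential with mean theta (P(X^3 > t) = exp(-t/theta)),
# so T = sum_i X_i^3 is a sum of n such exponentials and
# E[(n-1)/T] = 1/theta, while E[n/T] = n/((n-1)*theta) is biased upward.
random.seed(0)
theta, n, reps = 2.0, 5, 200_000

est = 0.0
for _ in range(reps):
    t = sum(random.expovariate(1.0 / theta) for _ in range(n))
    est += (n - 1) / t
est /= reps

print(est, 1.0 / theta)   # the (n-1)/T average sits near 1/theta = 0.5
```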

Consider the linear system $A \underline{x}=\underline{b}$, where $A$ is an $m \times n$ matrix, $\underline{x}$ is an $n \times 1$ vector of unknowns, and $\underline{b}$ is an $m \times 1$ vector. Further, suppose there exists an $m \times 1$ vector $\underline{c}$ such that the linear system $A \underline{x}=\underline{c}$ has NO solution. Then, which of the following statements is/are necessarily TRUE?

Options -

- If $m \leq n$ and $\underline{d}$ is the first column of $A$, then the linear system $A \underline{x}=\underline{d}$ has a unique solution
- If $m>n,$ then the linear system $A \underline{x}=\underline{0}$ has a solution other than $\underline{x}=\underline{0}$
- If $m \geq n,$ then $Rank(A)<n$
- $Rank(A)<m$

**Answer:**

$Rank(A)<m$
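The reasoning: inconsistency of $A \underline{x}=\underline{c}$ for some $\underline{c}$ means the column space of $A$ is a proper subspace of $\mathbb{R}^{m}$, so $Rank(A)<m$; none of the other options is forced. A small numerical illustration (the matrix is our own example, not from the paper):

```python
import numpy as np

# m = 3, n = 2: the columns span only the xy-plane of R^3, so any c with a
# nonzero third coordinate is unreachable and Rank(A) = 2 < 3 = m.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
c = np.array([0.0, 0.0, 1.0])

rank_A = np.linalg.matrix_rank(A)
rank_aug = np.linalg.matrix_rank(np.column_stack([A, c]))

# rank([A | c]) > rank(A) certifies that A x = c is inconsistent
# (Rouche-Capelli), yet here A x = 0 still has only the trivial solution,
# which is why the m > n option is not necessarily true.
print(rank_A, rank_aug)
```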

Let $X_{1}, X_{2}, \ldots, X_{n}(n \geq 2)$ be independent and identically distributed random variables with probability density function

$$
f(x)= \begin{cases}\frac{1}{x^{2}}, & x \geq 1 \\ 0, & \text {otherwise}\end{cases}
$$

Then, which of the following random variables has/have finite expectation?

Options -

- $\frac{1}{X_{2}}$
- $\sqrt{X_{1}}$
- $X_{1}$
- $\min \{X_{1}, \ldots, X_{n}\}$

**Answer:** $\frac{1}{X_{2}}$, $\sqrt{X_{1}}$, $\min \{X_{1}, \ldots, X_{n}\}$
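The expectations are direct integrals: $E\left[\frac{1}{X_{2}}\right]=\int_{1}^{\infty} x^{-3}\,dx=\frac{1}{2}$ and $E\left[\sqrt{X_{1}}\right]=\int_{1}^{\infty} x^{-3/2}\,dx=2$ are finite, $E\left[X_{1}\right]=\int_{1}^{\infty} x^{-1}\,dx$ diverges, and the minimum has density $n x^{-(n+1)}$, so $E[\min]=\frac{n}{n-1}$. A simulation sketch (the sample size, seed, and repetition count are arbitrary):

```python
import random

# F(x) = 1 - 1/x for x >= 1, so X = 1/U with U ~ Uniform(0, 1] follows f.
random.seed(1)
reps, n = 400_000, 3

inv_x = sqrt_x = min_x = 0.0
for _ in range(reps):
    xs = [1.0 / (1.0 - random.random()) for _ in range(n)]
    inv_x += 1.0 / xs[0]
    sqrt_x += xs[0] ** 0.5
    min_x += min(xs)

# Theoretical values: E[1/X] = 1/2, E[sqrt(X)] = 2, E[min] = n/(n-1) = 1.5.
# E[X_1] itself is infinite, so a running average of xs[0] would drift
# upward without settling -- that term has no finite expectation.
print(inv_x / reps, sqrt_x / reps, min_x / reps)
```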

Let $X_{1}, X_{2}, \ldots, X_{n}$ be a random sample from $N(\theta, 1),$ where $\theta \in(-\infty, \infty)$ is unknown. Consider the problem of testing $H_{0}: \theta \leq 0$ against $H_{1}: \theta>0 .$ Let $\beta(\theta)$ denote the power function of the likelihood ratio test of size $\alpha(0<\alpha<1)$ for testing $H_{0}$ against $H_{1}$. Then, which of the following statements is/are TRUE?

Options -

1. The critical region of the likelihood ratio test of size $\alpha$ is
$$
\left\{\left(x_{1}, x_{2}, \ldots, x_{n}\right) \in \mathbb{R}^{n}: \sqrt{n}\, \frac{\sum_{i=1}^{n} x_{i}}{n}<\tau_{\alpha}\right\}
$$
where $\tau_{\alpha}$ is a fixed point such that $P\left(Z>\tau_{\alpha}\right)=\alpha$, $Z \sim N(0,1)$

2. $\beta(\theta)>\beta(0),$ for all $\theta>0$

3. The critical region of the likelihood ratio test of size $\alpha$ is
$$
\left\{\left(x_{1}, x_{2}, \ldots, x_{n}\right) \in \mathbb{R}^{n}: \sqrt{n}\, \frac{\sum_{i=1}^{n} x_{i}}{n}>\tau_{\alpha / 2}\right\}
$$
where $\tau_{\alpha / 2}$ is a fixed point such that $P\left(Z>\tau_{\alpha / 2}\right)=\frac{\alpha}{2}$, $Z \sim N(0,1)$

4. $\beta(\theta)<\beta(0),$ for all $\theta>0$

**Answer:** $\beta(\theta)>\beta(0),$ for all $\theta>0$
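The power function here is explicit: the size-$\alpha$ test rejects when $\sqrt{n}\,\bar{x}>\tau_{\alpha}$, giving $\beta(\theta)=1-\Phi\left(\tau_{\alpha}-\sqrt{n}\,\theta\right)$, which is strictly increasing in $\theta$, so $\beta(\theta)>\beta(0)=\alpha$ for all $\theta>0$. A quick numeric check (the choices $n=25$ and $\alpha=0.05$ are illustrative):

```python
from math import erf, sqrt

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

n, tau = 25, 1.6449          # tau_alpha for alpha = 0.05 (upper 5% point)

def beta(theta):
    # Power of the test rejecting when sqrt(n) * xbar > tau_alpha:
    # sqrt(n) * xbar ~ N(sqrt(n) * theta, 1) under theta.
    return 1.0 - Phi(tau - sqrt(n) * theta)

print(beta(0.0))                          # equals alpha = 0.05 at theta = 0
print(beta(0.2) > beta(0.0), beta(0.5) > beta(0.2))   # strictly increasing
```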

Let $X_{1}, X_{2}, \ldots, X_{n}(n \geq 2)$ be a random sample from a distribution with probability density function

$$
f(x ; \theta)= \begin{cases}\frac{1}{2 \theta}, & -\theta \leq x \leq \theta \\ 0, & |x|>\theta\end{cases}
$$

where $\theta \in(0, \infty)$ is unknown. If $R=\min \{X_{1}, X_{2}, \ldots, X_{n}\}$ and $S=\max \{X_{1}, X_{2}, \ldots, X_{n}\},$ then which of the following statements is/are TRUE?

Options -

- $\max \{\left|X_{1}\right|,\left|X_{2}\right|, \ldots,\left|X_{n}\right|\}$ is a complete and sufficient statistic for $\theta$
- $S$ is an MLE of $\theta$
- $(R, S)$ is jointly sufficient for $\theta$
- Distribution of $\frac{R}{S}$ does NOT depend on $\theta$

**Answer:**

$\max \{\left|X_{1}\right|,\left|X_{2}\right|, \ldots,\left|X_{n}\right|\}$ is a complete and sufficient statistic for $\theta$

$(R, S)$ is jointly sufficient for $\theta$

Distribution of $\frac{R}{S}$ does NOT depend on $\theta$
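The last claim follows from scale invariance: $X_{i}=\theta U_{i}$ with $U_{i} \sim \mathrm{Uniform}(-1,1)$, so $\frac{R}{S}=\frac{\min U_{i}}{\max U_{i}}$ has a $\theta$-free law. A simulation sketch comparing the empirical CDF of $R/S$ at one point under two different scales (all parameter values are arbitrary):

```python
import random

# Empirical P(R/S <= t) for Uniform(-theta, theta) samples of size n.
def cdf_at(theta, t=-0.5, n=5, reps=100_000, seed=0):
    rng = random.Random(seed)
    count = 0
    for _ in range(reps):
        xs = [rng.uniform(-theta, theta) for _ in range(n)]
        if min(xs) / max(xs) <= t:
            count += 1
    return count / reps

# With a common seed the draws for theta = 7.5 are exact scalings of those
# for theta = 1 (mirroring X_i = theta * U_i), so the two estimates agree;
# with independent seeds they would still match to Monte Carlo error.
a, b = cdf_at(1.0), cdf_at(7.5)
print(a, b)
```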

Let $X_{1}, X_{2}, \ldots, X_{n}(n \geq 2)$ be a random sample from a distribution with probability density function

$$
f(x ; \theta)= \begin{cases}\theta x^{\theta-1}, & 0 \leq x \leq 1 \\ 0, & \text {otherwise}\end{cases}
$$

where $\theta \in(0, \infty)$ is unknown. Then, which of the following statements is/are TRUE?

Options -

1. There does NOT exist any unbiased estimator of $\frac{1}{\theta}$ which attains the Cramer-Rao lower bound

2. Cramer-Rao lower bound, based on $X_{1}, X_{2}, \ldots, X_{n},$ for the estimand $\theta^{3}$ is $\frac{\theta^{2}}{n}$

3. Cramer-Rao lower bound, based on $X_{1}, X_{2}, \ldots, X_{n},$ for the estimand $\theta^{3}$ is $\frac{9 \theta^{6}}{n}$

4. There exists an unbiased estimator of $\frac{1}{\theta}$ which attains the Cramer-Rao lower bound

**Answer:**

Cramer-Rao lower bound, based on $X_{1}, X_{2}, \ldots, X_{n},$ for the estimand $\theta^{3}$ is $\frac{9 \theta^{6}}{n}$

There exists an unbiased estimator of $\frac{1}{\theta}$ which attains the Cramer-Rao lower bound
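The Cramer-Rao computations behind these answers are short. With $\log f(x ; \theta)=\log \theta+(\theta-1) \log x$,

$$
\frac{\partial}{\partial \theta} \log f(x ; \theta)=\frac{1}{\theta}+\log x, \qquad I_{1}(\theta)=E\left[\left(\frac{1}{\theta}+\log X\right)^{2}\right]=\frac{1}{\theta^{2}},
$$

so the bound for the estimand $g(\theta)=\theta^{3}$ is

$$
\frac{\left[g^{\prime}(\theta)\right]^{2}}{n I_{1}(\theta)}=\frac{9 \theta^{4}}{n / \theta^{2}}=\frac{9 \theta^{6}}{n}.
$$

For $\frac{1}{\theta}$, note that $-\log X_{i}$ is exponential with mean $\frac{1}{\theta}$ and variance $\frac{1}{\theta^{2}}$, so $-\frac{1}{n} \sum_{i=1}^{n} \log X_{i}$ is unbiased with variance $\frac{1}{n \theta^{2}}$, which equals the bound $\frac{\left[-\theta^{-2}\right]^{2}}{n / \theta^{2}}=\frac{1}{n \theta^{2}}$.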

Let $f: \mathbb{R} \rightarrow \mathbb{R}$ be a twice differentiable function. Then, which of the following statements is/are necessarily TRUE?

Options -

- $f^{\prime \prime}$ is continuous
- $f^{\prime \prime}$ is bounded on $(0,1)$
- If $f^{\prime}(0)=f^{\prime}(1),$ then $f^{\prime \prime}(x)=0$ has a solution in $(0,1)$
- $f^{\prime}$ is bounded on $[8,10]$

**Answer:**

If $f^{\prime}(0)=f^{\prime}(1),$ then $f^{\prime \prime}(x)=0$ has a solution in $(0,1)$

$f^{\prime}$ is bounded on $[8,10]$
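A brief justification: since $f$ is twice differentiable, $f^{\prime}$ is differentiable, so $f^{\prime}(0)=f^{\prime}(1)$ lets Rolle's theorem (applied to $f^{\prime}$) produce $c \in(0,1)$ with $f^{\prime \prime}(c)=0$; and $f^{\prime}$, being differentiable, is continuous on the compact interval $[8,10]$, hence bounded there. The other two options fail for the standard counterexample

$$
f^{\prime}(x)= \begin{cases}x^{2} \sin \left(1 / x^{2}\right), & x \neq 0 \\ 0, & x=0\end{cases}
$$

for which $f^{\prime \prime}$ exists everywhere but is neither continuous at $0$ nor bounded on $(0,1)$.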

Let $A$ be a $3 \times 3$ real matrix such that $A \neq I_{3}$ and the sum of the entries in each row of $A$ is $1$. Then, which of the following statements is/are necessarily TRUE?

Options -

- The characteristic polynomial, $p(\lambda),$ of $A+2 A^{2}+A^{3}$ has $(\lambda-4)$ as a factor

- $A-I_{3}$ is an invertible matrix
- $A$ cannot be an orthogonal matrix
- The set $\{\underline{x} \in \mathbb{R}^{3}:\left(A-I_{3}\right) \underline{x}=\underline{0}\}$ has at least two elements $(\underline{x}$ is a column vector)

**Answer:**

The characteristic polynomial, $p(\lambda),$ of $A+2 A^{2}+A^{3}$ has $(\lambda-4)$ as a factor
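The key observation: row sums equal to $1$ means $A \underline{1}=\underline{1}$, so $1$ is an eigenvalue of $A$ and $1+2 \cdot 1+1=4$ is an eigenvalue of $A+2 A^{2}+A^{3}$, i.e. $(\lambda-4)$ divides its characteristic polynomial. A numerical spot check (the matrix entries are our own example):

```python
import numpy as np

# Any A with unit row sums satisfies A @ ones = ones, so B = A + 2A^2 + A^3
# satisfies B @ ones = 4 * ones: 4 is an eigenvalue of B.
A = np.array([[0.2, 0.3, 0.5],
              [0.6, 0.1, 0.3],
              [0.0, 0.4, 0.6]])
assert np.allclose(A.sum(axis=1), 1.0)

B = A + 2 * np.linalg.matrix_power(A, 2) + np.linalg.matrix_power(A, 3)
eigs = np.linalg.eigvals(B)
print(eigs)   # one eigenvalue is 4 (up to floating-point rounding)
```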

Consider the function

$$

f(x, y)=3 x^{2}+4 x y+y^{2}, \quad(x, y) \in \mathbb{R}^{2}

$$

If $S=\{(x, y) \in \mathbb{R}^{2}: x^{2}+y^{2}=1\}$, then which of the following statements is/are TRUE?

Options -

- The maximum value of $f$ on $S$ is $2+\sqrt{5}$
- The maximum value of $f$ on $S$ is $3+\sqrt{5}$
- The minimum value of $f$ on $S$ is $3-\sqrt{5}$
- The minimum value of $f$ on $S$ is $2-\sqrt{5}$

**Answer:**

The maximum value of $f$ on $S$ is $2+\sqrt{5}$

The minimum value of $f$ on $S$ is $2-\sqrt{5}$
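Here $f(x, y)$ is the quadratic form of the symmetric matrix $M=\begin{pmatrix}3 & 2 \\ 2 & 1\end{pmatrix}$, and the extreme values of a quadratic form on the unit circle are the eigenvalues of $M$: $\lambda^{2}-4 \lambda-1=0$ gives $2 \pm \sqrt{5}$. A numerical confirmation:

```python
import numpy as np

# f(x, y) = 3x^2 + 4xy + y^2 as a symmetric quadratic form; its max and min
# on the unit circle x^2 + y^2 = 1 are the largest and smallest eigenvalues.
M = np.array([[3.0, 2.0],
              [2.0, 1.0]])
eigs = np.sort(np.linalg.eigvalsh(M))
print(eigs)   # [2 - sqrt(5), 2 + sqrt(5)], approximately [-0.2361, 4.2361]
```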
