IIT JAM MS 2021 Question Paper | Set B | Problems & Solutions

This post discusses the solutions to the problems from IIT JAM Mathematical Statistics (MS) 2021 Question Paper - Set B. You can find solutions in video or written form.

Note: This post is getting updated. Stay tuned for solutions, videos, and more.

IIT JAM Mathematical Statistics (MS) 2021 Problems & Solutions (Set B)

Problem 1

A sample of size n is drawn randomly (without replacement) from an urn containing 5 n^{2} balls, of which 2 n^{2} are red balls and 3 n^{2} are black balls. Let X_{n} denote the number of red balls in the selected sample. If \ell=\lim _{n \rightarrow \infty} \frac{E\left(X_{n}\right)}{n} and m=\lim _{n \rightarrow \infty} \frac{Var\left(X_{n}\right)}{n}, then which of the following statements is/are TRUE?

Options -

  1. \frac{\ell}{m}=\frac{5}{3}
  2. \ell m=\frac{14}{125}
  3. \ell-m=\frac{3}{25}
  4. \ell+m=\frac{16}{25}


Answer: \frac{\ell}{m}=\frac{5}{3}; \ell+m=\frac{16}{25}
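
A brief sketch of the reasoning: X_{n} follows a hypergeometric distribution, so

    \[E(X_{n})=n \cdot \frac{2 n^{2}}{5 n^{2}}=\frac{2 n}{5}, \qquad Var(X_{n})=n \cdot \frac{2}{5} \cdot \frac{3}{5} \cdot \frac{5 n^{2}-n}{5 n^{2}-1},\]

which gives \ell=\frac{2}{5} and m=\frac{6}{25}, hence \frac{\ell}{m}=\frac{5}{3} and \ell+m=\frac{16}{25}.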

Problem 2

Let X_{1}, X_{2}, \ldots, X_{n}(n \geq 2) be a random sample from a distribution with probability density function

    \[f(x ; \theta)=\begin{cases}\frac{3 x^{2}}{\theta} e^{-x^{3} / \theta}, & x>0 \\ 0, & \text{otherwise}\end{cases}\]

where \theta \in(0, \infty) is unknown.
If T=\sum_{i =1}^{n} X_{i}^{3}, then which of the following statements is/are TRUE?

Options -

  1. \frac{n-1}{T} is the unique uniformly minimum variance unbiased estimator of \frac{1}{\theta}
  2. \frac{n}{T} is the unique uniformly minimum variance unbiased estimator of \frac{1}{\theta}
  3. (n-1) \sum_{i=1}^{n} \frac{1}{X_{i}^{3}} is the unique uniformly minimum variance unbiased estimator of \frac{1}{\theta}
  4. \frac{n}{T} is the MLE of \frac{1}{\theta}


Answer:
\frac{n-1}{T} is the unique uniformly minimum variance unbiased estimator of \frac{1}{\theta}
\frac{n}{T} is the MLE of \frac{1}{\theta}
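
A brief sketch of the reasoning: X_{i}^{3} is exponentially distributed with mean \theta, so T \sim Gamma(n, \theta) is complete and sufficient, and

    \[E\left(\frac{n-1}{T}\right)=\frac{1}{\theta}, \qquad \hat{\theta}_{MLE}=\frac{T}{n}.\]

Hence \frac{n-1}{T} is the unique UMVUE of \frac{1}{\theta} by the Lehmann-Scheffe theorem, and by invariance the MLE of \frac{1}{\theta} is \frac{n}{T}.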

Problem 3

Consider the linear system A \underline{x}=\underline{b}, where A is an m \times n matrix, \underline{x} is an n \times 1 vector of unknowns
and \underline{b} is an m \times 1 vector. Further, suppose there exists an m \times 1 vector \underline{c} such that the linear system A \underline{x}=\underline{c} has NO solution. Then, which of the following statements is/are necessarily TRUE?

Options -

  1. If m \leq n and \underline{d} is the first column of A, then the linear system A \underline{x}=\underline{d} has a unique solution
  2. If m>n, then the linear system A \underline{x}=\underline{0} has a solution other than \underline{x}=\underline{0}
  3. If m \geq n, then Rank(A)<n
  4. Rank(A)<m

Answer:
Rank(A)<m
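
A brief sketch of the reasoning: if A \underline{x}=\underline{c} has no solution, then \underline{c} lies outside the column space of A, so

    \[\text{Col}(A) \subsetneq \mathbb{R}^{m} \quad \Rightarrow \quad Rank(A)<m.\]

The other three statements can fail for suitable choices of A.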

Problem 4

Let X_{1}, X_{2}, \ldots, X_{n}(n \geq 2) be independent and identically distributed random variables with probability density function

    \[f(x)=\begin{cases}\frac{1}{x^{2}}, & x \geq 1 \\ 0, & \text{otherwise}\end{cases}\]

Then, which of the following random variables has/have finite expectation?

Options -

  1. \frac{1}{X_{2}}
  2. \sqrt{X_{1}}
  3. X_{1}
  4. \min \{X_{1}, \ldots, X_{n}\}


Answer: \frac{1}{X_{2}}, \sqrt{X_{1}}, \min \{X_{1}, \ldots, X_{n}\}
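
A brief sketch via direct integration:

    \[E\left(\frac{1}{X_{2}}\right)=\int_{1}^{\infty} \frac{1}{x^{3}} d x=\frac{1}{2}, \qquad E\left(\sqrt{X_{1}}\right)=\int_{1}^{\infty} x^{-3 / 2} d x=2, \qquad E\left(X_{1}\right)=\int_{1}^{\infty} \frac{1}{x} d x=\infty,\]

while \min \{X_{1}, \ldots, X_{n}\} has density n x^{-(n+1)} for x \geq 1, so its expectation \frac{n}{n-1} is finite for n \geq 2.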

Problem 5

Let X_{1}, X_{2}, \ldots, X_{n} be a random sample from N(\theta, 1), where \theta \in(-\infty, \infty) is unknown. Consider the problem of testing H_{0}: \theta \leq 0 against H_{1}: \theta>0 . Let \beta(\theta) denote the power function of the likelihood ratio test of size \alpha(0<\alpha<1) for testing H_{0} against H_{1}. Then, which of the following statements is/are TRUE?

Options -

  1. The critical region of the likelihood ratio test of size \alpha is
     \[\left\{\left(x_{1}, x_{2}, \ldots, x_{n}\right) \in \mathbb{R}^{n}: \sqrt{n}\, \frac{\sum_{i=1}^{n} x_{i}}{n}<\tau_{\alpha}\right\}\]
     where \tau_{\alpha} is a fixed point such that P\left(Z>\tau_{\alpha}\right)=\alpha, Z \sim N(0,1)
  2. \beta(\theta)>\beta(0), for all \theta>0
  3. The critical region of the likelihood ratio test of size \alpha is
     \[\left\{\left(x_{1}, x_{2}, \ldots, x_{n}\right) \in \mathbb{R}^{n}: \sqrt{n}\, \frac{\sum_{i=1}^{n} x_{i}}{n}>\tau_{\alpha / 2}\right\}\]
     where \tau_{\alpha / 2} is a fixed point such that P\left(Z>\tau_{\alpha / 2}\right)=\frac{\alpha}{2}, Z \sim N(0,1)
  4. \beta(\theta)<\beta(0), for all \theta>0


Answer: \beta(\theta)>\beta(0), for all \theta>0
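
A brief sketch of the reasoning: the size \alpha likelihood ratio test rejects for large values of \sqrt{n} \bar{x}, and its power function

    \[\beta(\theta)=P_{\theta}\left(\sqrt{n} \bar{X}>\tau_{\alpha}\right)=1-\Phi\left(\tau_{\alpha}-\sqrt{n} \theta\right),\]

where \bar{X}=\frac{1}{n} \sum_{i=1}^{n} X_{i} and \Phi is the standard normal distribution function, is strictly increasing in \theta, so \beta(\theta)>\beta(0) for every \theta>0.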

Problem 6

Let X_{1}, X_{2}, \ldots, X_{n}(n \geq 2) be a random sample from a distribution with probability density function

    \[f(x ; \theta)=\begin{cases}\frac{1}{2 \theta}, & -\theta \leq x \leq \theta \\ 0, & |x|>\theta\end{cases}\]

where \theta \in(0, \infty) is unknown. If R=\min \{X_{1}, X_{2}, \ldots, X_{n}\} and S=\max \{X_{1}, X_{2}, \ldots, X_{n}\}, then which
of the following statements is/are TRUE?

Options -

  1. \max \{\left|X_{1}\right|,\left|X_{2}\right|, \ldots,\left|X_{n}\right|\} is a complete and sufficient statistic for \theta
  2. S is an \mathrm{MLE} of \theta
  3. (R, S) is jointly sufficient for \theta
  4. Distribution of \frac{R}{S} does NOT depend on \theta


Answer:
\max \{\left|X_{1}\right|,\left|X_{2}\right|, \ldots,\left|X_{n}\right|\} is a complete and sufficient statistic for \theta
(R, S) is jointly sufficient for \theta
Distribution of \frac{R}{S} does NOT depend on \theta
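
A brief sketch of the reasoning: since \left|X_{i}\right| \sim U(0, \theta), the likelihood is

    \[L(\theta)=\frac{1}{(2 \theta)^{n}} \mathbb{1}\left\{\max \{\left|X_{1}\right|, \ldots,\left|X_{n}\right|\} \leq \theta\right\},\]

so \max \{\left|X_{1}\right|, \ldots,\left|X_{n}\right|\} is complete and sufficient and is the MLE of \theta (S need not maximize the likelihood), (R, S) is jointly sufficient by the factorization theorem, and \frac{R}{S}=\frac{R / \theta}{S / \theta} depends only on a U(-1,1) sample, so its distribution is free of \theta.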

Problem 7

Let X_{1}, X_{2}, \ldots, X_{n}(n \geq 2) be a random sample from a distribution with probability density function

    \[f(x ; \theta)=\begin{cases}\theta x^{\theta-1}, & 0 \leq x \leq 1 \\ 0, & \text{otherwise}\end{cases}\]

where \theta \in(0, \infty) is unknown. Then, which of the following statements is/are TRUE?

Options -

  1. There does NOT exist any unbiased estimator of \frac{1}{\theta} which attains the Cramer-Rao lower bound
  2. Cramer-Rao lower bound, based on X_{1}, X_{2}, \ldots, X_{n}, for the estimand \theta^{3} is \frac{\theta^{2}}{n}
  3. Cramer-Rao lower bound, based on X_{1}, X_{2}, \ldots, X_{n}, for the estimand \theta^{3} is \frac{9 \theta^{6}}{n}
  4. There exists an unbiased estimator of \frac{1}{\theta} which attains the Cramer-Rao lower bound


Answer:
Cramer-Rao lower bound, based on X_{1}, X_{2}, \ldots, X_{n}, for the estimand \theta^{3} is \frac{9 \theta^{6}}{n}
There exists an unbiased estimator of \frac{1}{\theta} which attains the Cramer-Rao lower bound
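
A brief sketch of the reasoning: \ln f(x ; \theta)=\ln \theta+(\theta-1) \ln x gives Fisher information I(\theta)=\frac{1}{\theta^{2}}, so

    \[\text{CRLB for } \theta^{3}=\frac{\left(3 \theta^{2}\right)^{2}}{n I(\theta)}=\frac{9 \theta^{6}}{n}, \qquad \text{CRLB for } \frac{1}{\theta}=\frac{1}{n \theta^{2}},\]

and since -\ln X_{i} is exponentially distributed with rate \theta, the unbiased estimator -\frac{1}{n} \sum_{i=1}^{n} \ln X_{i} of \frac{1}{\theta} has variance \frac{1}{n \theta^{2}} and attains this bound.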

Problem 8

Let f: \mathbb{R} \rightarrow \mathbb{R} be a twice differentiable function. Then, which of the following statements is/are necessarily TRUE?

Options -

  1. f^{\prime \prime} is continuous
  2. f^{\prime \prime} is bounded on (0,1)
  3. If f^{\prime}(0)=f^{\prime}(1), then f^{\prime \prime}(x)=0 has a solution in (0,1)
  4. f^{\prime} is bounded on [8,10]


Answer:
If f^{\prime}(0)=f^{\prime}(1), then f^{\prime \prime}(x)=0 has a solution in (0,1)
f^{\prime} is bounded on [8,10]
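
A brief sketch of the reasoning: applying Rolle's theorem to f^{\prime} on [0,1],

    \[f^{\prime}(0)=f^{\prime}(1) \quad \Rightarrow \quad f^{\prime \prime}(c)=0 \text{ for some } c \in(0,1),\]

and f^{\prime}, being differentiable, is continuous on the compact interval [8,10] and hence bounded there; on the other hand, f^{\prime \prime} need not be continuous or bounded on (0,1).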

Problem 9

Let A be a 3 \times 3 real matrix such that A \neq I_{3} and the sum of the entries in each row of A is 1. Then which of the following statements is/are necessarily TRUE?

Options -

  1. The characteristic polynomial, p(\lambda), of A+2 A^{2}+A^{3} has (\lambda-4) as a factor
  2. A-I_{3} is an invertible matrix
  3. A cannot be an orthogonal matrix
  4. The set \{\underline{x} \in \mathbb{R}^{3}:\left(A-I_{3}\right) \underline{x}=\underline{0}\} has at least two elements (\underline{x} is a column vector)


Answer:
The characteristic polynomial, p(\lambda), of A+2 A^{2}+A^{3} has (\lambda-4) as a factor
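
A brief sketch of the reasoning: since every row of A sums to 1, the vector \underline{v}=(1,1,1)^{T} satisfies A \underline{v}=\underline{v}, so

    \[\left(A+2 A^{2}+A^{3}\right) \underline{v}=(1+2+1) \underline{v}=4 \underline{v},\]

that is, 4 is an eigenvalue of A+2 A^{2}+A^{3}, and therefore (\lambda-4) is a factor of its characteristic polynomial.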

Problem 10

Consider the function

    \[f(x, y)=3 x^{2}+4 x y+y^{2}, \quad(x, y) \in \mathbb{R}^{2}\]

If S=\{(x, y) \in \mathbb{R}^{2}: x^{2}+y^{2}=1\}, then which of the following statements is/are TRUE?

Options -

  1. The maximum value of f on S is 2+\sqrt{5}
  2. The maximum value of f on S is 3+\sqrt{5}
  3. The minimum value of f on S is 3-\sqrt{5}
  4. The minimum value of f on S is 2-\sqrt{5}


Answer:
The maximum value of f on S is 2+\sqrt{5}
The minimum value of f on S is 2-\sqrt{5}
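
A brief sketch of the reasoning: f is the quadratic form of the symmetric matrix \begin{pmatrix}3 & 2 \\ 2 & 1\end{pmatrix}, and on the unit circle S its extreme values are the eigenvalues of that matrix:

    \[\det\begin{pmatrix}3-\lambda & 2 \\ 2 & 1-\lambda\end{pmatrix}=\lambda^{2}-4 \lambda-1=0 \quad \Rightarrow \quad \lambda=2 \pm \sqrt{5},\]

so the maximum of f on S is 2+\sqrt{5} and the minimum is 2-\sqrt{5}.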
