When Maximum Likelihood = Method of Moments?

Is Maximum Likelihood = Method of Moments?

Maximum Likelihood Estimation is an algorithm for finding a reasonable estimator. Personally, it woos my mind - simple and yet so beautiful. The Method of Moments is simpler; it doesn't woo me :p. Still, the two have a lot in common, so we set off to explore them, and we leave you with plenty of food for thought along the way. After all, we are all explorers at heart.

We ask "Is MLE = MOM? If not, when?"

We discover a rich relationship between the two - the score function and much more.

Hints, Solution, and More

Enjoy the video

Build your foundations.

Ace your Exams.

Learn. Enjoy. Practice. Repeat.

Some Useful Links:

ISI MStat 2020 PSB Problem 9 | Discussion & Solution

ISI MStat 2020 PSB Problem 9

This post discusses Problem 9 of the ISI MStat 2020 PSB Entrance Exam.

A finite population has \(N\) units, with \(x_{i}\) being the value associated with the \(i^{\text {th }}\) unit, \(i=1,2, \ldots, N\). Let \(\bar{x}_{N}\) be the population mean.

A statistician carries out the following experiment.

Step 1: Draw an SRSWOR of size \(n\), denote the set of sampled units by \(S_{1}\), and denote the sample mean by \(\bar{X}_{n}\).

Step 2: Draw an SRSWR of size \(m\) from \(S_{1}\). The \(x\)-values of the sampled units are denoted by \(\{Y_{1}, \cdots, Y_{m}\}\).

Hints, Solution, and More

Do subscribe to our channel to get instant notification of Live Session, so that you can join us live in the master classes!


Is MLE always a function of a Sufficient Statistic?

Is MLE always a function of a Sufficient Statistic?

MLE is an algorithm for finding a reasonable estimator (personally, it woos my mind - simple and yet so beautiful).

Now, well - life is hard. People have devised ways to check whether an estimator is good or not - why should they care whether I like it?

So, they have developed Small Sample Properties and Large Sample Properties to do quality control on the MLE.

This post tests the flamboyance of the MLE in terms of the idea of "Sufficiency".

We ask, "Is the MLE sufficient? How are MLE and sufficiency related?"

We discover a rich relationship between the two. Again, MLE wins my heart. Does it win yours? Check with the hints and the solution.

Hints, Solution, and More


ISI MStat 2020 PSB Problem 6 Problem & Solution

ISI MStat 2020 PSB Problem 6

This post discusses Problem 6 of the ISI MStat 2020 PSB Entrance Exam.

Suppose individuals are classified into three categories \(C_{1}, C_{2}\) and \(C_{3}\).

Let \(p^{2}\), \((1-p)^{2}\) and \(2 p(1-p)\) be the respective population proportions, where \(p \in(0,1)\). A random sample of \(N\) individuals is selected from the population and the category of each selected individual recorded.

For \(i=1,2,3\), let \(X_{i}\) denote the number of individuals in the sample belonging to category \(C_{i}\). Define \(U=X_{1}+\frac{X_{3}}{2}\).

Hints, Solution, and More


IIT JAM Statistics Entrance Exam books based on Syllabus

The Indian Institutes of Technology host some of the top statistics departments in the country, with great research and placement opportunities. They conduct an entrance exam, the IIT JAM MS Entrance Exam, for aspirants who want to pursue a master's in Statistics. The best study material for cracking this exam is the right set of books, so here is a list of useful books for IIT JAM MS Entrance Exam preparation, organized by syllabus.

A short note on the IIT JAM MS Entrance Exam: The MS programme offers advanced-level training in the theory, methods, and applications of Statistics, along with specialized training in selected areas of Statistics and allied fields. Depending on the area of specialization, students can pursue an academic or research career in Statistics, Mathematics, Economics, Computer Science, and allied fields.

IIT JAM MS Entrance Exams Books according to the syllabus:

As mentioned on the website, the Entrance Exam mainly consists of two topics: Mathematics, and Probability & Statistics.

Let's start with the Mathematics books required for IIT JAM MS Entrance Exam Preparation:

High School Mathematics - The mathematics part is comparatively easy, so be fluent in your 10+2 mathematics syllabus and have sound knowledge of Calculus and Linear Algebra. Also, solve past years' problems.

That said, the High School Mathematics part carries less weight in the examination.

Let's discuss the books for Probability and Statistics Part, breaking it into different subsections according to the syllabus:

  1. Combinatorics and Probability Theory
    1. Book 1: (Chapters 1 - 8)
      1. A First Course in Probability by Sheldon Ross
    2. Book 2: (Chapters 1 - 7)
      1. An Introduction to Probability and Statistics
    3. Book 3: (Chapters 1 - 6) [for quicker preparation]
      1. Mathematical Statistics and Data Analysis by J. A. Rice
  2. Linear Algebra
    1. Book: (Chapters 1 - 5, 7)
      1. Linear Algebra and Its Applications by Gilbert Strang
      2. Lecture Series: Linear Algebra Lecture Series
  3. Calculus and Real Analysis
    1. Book: (Chapters 2, 4, 5, 7)
      1. Understanding Analysis by Stephen Abbott
  4. Statistics
    1. Statistical Inference
      1. Book 1: (Chapters 7, 8, 9)
        1. Statistical Inference by Casella and Berger
      2. Book 2: (Chapters 8, 9)
        1. Mathematical Statistics and Data Analysis by J. A. Rice

5. A book named "Solutions to IIT JAM for Mathematical Statistics" by Amit Mishra and Mohd. Arshad covers previous years' solutions up to 2018. You can consider buying it for your IIT JAM MS preparation.

Other Useful Resources

IIT JAM Statistics Crash Course for 2022

Early bird registration is going on. Classes start from the 1st week of October, 2021.

IIT JAM MS 2021 Question Paper | Set C | Problems & Solutions

This post discusses the solutions to the problems from IIT JAM Mathematical Statistics (MS) 2021 Question Paper - Set C. You can find solutions in video or written form.

Note: This post is getting updated. Stay tuned for solutions, videos, and more.

IIT JAM Mathematical Statistics (MS) 2021 Problems & Solutions (Set C)

Problem 1

Let $f_{0}$ and $f_{1}$ be the probability mass functions given by

Consider the problem of testing the null hypothesis $H_{0}: X \sim f_{0}$ against $H_{1}: X \sim f_{1}$ based on a single
sample $X .$ If $\alpha$ and $\beta$, respectively, denote the size and power of the test with critical region
$\{x \in \mathbb{R}: x>3\},$ then $10(\alpha+\beta)$ is equal to ______


Answer: $13$

Problem 2

Let,

$$
\alpha=\lim _{n \rightarrow \infty} \sum_{m=n^{2}}^{2 n^{2}} \frac{1}{\sqrt{5 n^{4}+n^{3}+m}}
$$

Then, $10 \sqrt{5} \alpha$ is equal to _________


Answer: 10
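
Each of the roughly $n^{2}$ terms is close to $\frac{1}{\sqrt{5 n^{4}}}$, so the sum tends to $\frac{1}{\sqrt{5}}$ and $10 \sqrt{5} \alpha=10$. A quick numerical cross-check in Python (our addition, not part of the official paper):

```python
import math

# Partial sums of the expression inside the limit, for a moderately large n.
def s(n):
    return sum(1 / math.sqrt(5 * n**4 + n**3 + m)
               for m in range(n * n, 2 * n * n + 1))

val = 10 * math.sqrt(5) * s(500)  # should be close to 10
```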

Problem 3

Let $\alpha, \beta$ and $\gamma$ be the eigenvalues of $M=\left[\begin{array}{ccc}0 & 1 & 0 \\ 1 & 3 & 3 \\ -1 & 2 & 2\end{array}\right] .$ If $\gamma=1$ and $\alpha>\beta,$ then the value of
$2 \alpha+3 \beta$ is ______


Answer: $7$

Problem 4

Let $S=\{(x, y) \in \mathbb{R}^{2}: 2 \leq x \leq y \leq 4\}$. Then, the value of the integral

$$
\iint_{S} \frac{1}{4-x} d x d y
$$

is _______


Answer: 2

Problem 5

Let $M=\left(\begin{array}{cc}5 & -6 \\ 3 & -4\end{array}\right)$ be a $2 \times 2$ matrix. If $\alpha=\operatorname{det}\left(M^{4}-6 I_{2}\right),$ then the value of $\alpha^{2}$ is ________


Answer: 2500
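
As a sanity check, $M$ has eigenvalues $2$ and $-1$, so $M^{4}-6 I_{2}$ has eigenvalues $10$ and $-5$, hence $\alpha=-50$. A small Python sketch (our addition) verifying this by direct matrix arithmetic:

```python
def matmul(A, B):
    # 2x2 integer matrix product
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

M = [[5, -6], [3, -4]]
P = M
for _ in range(3):                 # P = M^4
    P = matmul(P, M)
Q = [[P[0][0] - 6, P[0][1]],       # M^4 - 6 I
     [P[1][0], P[1][1] - 6]]
alpha = Q[0][0] * Q[1][1] - Q[0][1] * Q[1][0]
result = alpha**2
```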

Problem 6

Let $X$ be a random variable with moment generating function

$$
M_{X}(t)=\frac{1}{12}+\frac{1}{6} e^{t}+\frac{1}{3} e^{2 t}+\frac{1}{4} e^{-t}+\frac{1}{6} e^{-2 t}, t \in \mathbb{R}
$$

Then, $8 E(X)$ is equal to _______


Answer: 2
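
The coefficient of $e^{k t}$ in the MGF is exactly $P(X=k)$, so $E(X)$ can be read off directly. A quick exact check in Python (our addition):

```python
from fractions import Fraction as F

# Read the point masses off the MGF: the coefficient of e^{kt} is P(X = k).
pmf = {0: F(1, 12), 1: F(1, 6), 2: F(1, 3), -1: F(1, 4), -2: F(1, 6)}
assert sum(pmf.values()) == 1      # sanity: probabilities sum to one
EX = sum(k * p for k, p in pmf.items())
result = 8 * EX
```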

Problem 7

Let $5,10,4,15,6$ be an observed random sample of size 5 from a distribution with probability density function

$$
f(x ; \theta)=\begin{cases}
e^{-(x-\theta)}, x \geq \theta \\
0, \text { otherwise }
\end{cases}.
$$

$\theta \in(-\infty, 3]$ is unknown. Then, the maximum likelihood estimate of $\theta$ based on the observed sample is equal to ________


Answer: 3

Problem 8

Let $X$ be a random variable having the probability density function

$$
f(x)=\frac{1}{8 \sqrt{2 \pi}}\left(2 e^{-\frac{x^{2}}{2}}+3 e^{-\frac{x^{2}}{8}}\right), \quad-\infty<x<\infty .
$$

Then, $4 E\left(X^{4}\right)$ is equal to _____


Answer: 147
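
The density is a mixture: weight $\frac{1}{4}$ on $N(0,1)$ and $\frac{3}{4}$ on $N(0,4)$, so $E\left(X^{4}\right)=\frac{1}{4} \cdot 3+\frac{3}{4} \cdot 3 \cdot 16=\frac{147}{4}$. A numerical integration sketch (our addition) confirming this:

```python
import math

def f(x):
    return (2 * math.exp(-x * x / 2) + 3 * math.exp(-x * x / 8)) / (8 * math.sqrt(2 * math.pi))

# Composite Simpson's rule; the integrand x^4 f(x) is negligible beyond |x| = 40.
def simpson(g, a, b, n=20000):
    h = (b - a) / n
    total = g(a) + g(b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * g(a + i * h)
    return total * h / 3

fourth_moment = simpson(lambda x: x**4 * f(x), -40.0, 40.0)
result = round(4 * fourth_moment)
```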

Problem 9

Let $\beta$ denote the length of the curve $y=\ln (\sec x)$ from $x=0$ to $x=\frac{\pi}{4}$. Then, the value of $3 \sqrt{2}\left(e^{\beta}-1\right)$ is equal to _____


Answer: $6$
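
Since $y^{\prime}=\tan x$, the arc-length integrand is $\sqrt{1+\tan ^{2} x}=\sec x$, giving $\beta=\ln (1+\sqrt{2})$ and $e^{\beta}-1=\sqrt{2}$. A quick numerical check (our addition):

```python
import math

# Arc length of y = ln(sec x) on [0, pi/4]: integrate sec x numerically.
def simpson(g, a, b, n=10000):
    h = (b - a) / n
    total = g(a) + g(b)
    for i in range(1, n):
        total += (4 if i % 2 else 2) * g(a + i * h)
    return total * h / 3

beta = simpson(lambda x: 1 / math.cos(x), 0.0, math.pi / 4)
val = 3 * math.sqrt(2) * (math.exp(beta) - 1)  # should be close to 6
```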

Problem 10

Let $A=\{(x, y, z) \in \mathbb{R}^{3}: 0 \leq x \leq y \leq z \leq 1\}$. Let $\alpha$ be the value of the integral

$$
\iiint_{A} x y z d x d y d z
$$

Then, $384 \alpha$ is equal to _______


Answer: $8$

Problem 11

Let,

$$
a_{n}=\sum_{k=2}^{n}\left(\begin{array}{l}
n \\
k
\end{array}\right) \frac{2^{k}(n-2)^{n-k}}{n^{n}}, \quad n=2,3, \ldots
$$

Then, $e^{2} \lim _{n \rightarrow \infty}\left(1-a_{n}\right)$ is equal to ____

Answer: 3
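
By the binomial theorem, $1-a_{n}=\left(1-\frac{2}{n}\right)^{n}+2\left(1-\frac{2}{n}\right)^{n-1} \rightarrow 3 e^{-2}$. A numerical sketch (our addition) evaluating $a_{n}$ exactly for a fairly large $n$:

```python
import math
from fractions import Fraction
from math import comb

def a(n):
    # Exact rational value of a_n
    num = sum(comb(n, k) * 2**k * (n - 2)**(n - k) for k in range(2, n + 1))
    return Fraction(num, n**n)

val = math.exp(2) * float(1 - a(500))  # should approach 3
```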

Problem 12

Let $E_{1}, E_{2}, E_{3}$ and $E_{4}$ be four independent events such that $P\left(E_{1}\right)=\frac{1}{2}, P\left(E_{2}\right)=\frac{1}{3}, P\left(E_{3}\right)=\frac{1}{4}$ and $P\left(E_{4}\right)=\frac{1}{5} .$ Let $p$ be the probability that at most two events among $E_{1}, E_{2}, E_{3}$ and $E_{4}$ occur. Then, $240 p$ is equal to ____

Answer: 218
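
Since the events are independent, $p$ can be computed by enumerating all $2^{4}$ occurrence patterns. An exact Python check (our addition):

```python
from fractions import Fraction as F
from itertools import product

probs = [F(1, 2), F(1, 3), F(1, 4), F(1, 5)]
p = F(0)
for pattern in product([0, 1], repeat=4):   # all 2^4 occurrence patterns
    if sum(pattern) <= 2:                   # at most two of the events occur
        term = F(1)
        for occurs, q in zip(pattern, probs):
            term *= q if occurs else 1 - q
        p += term
result = 240 * p
```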

Problem 13

The number of real roots of the polynomial

$$
f(x)=x^{11}-13 x+5
$$

is ____


Answer: $3$

Problem 14

Let $S \subseteq \mathbb{R}^{2}$ be the region bounded by the parallelogram with vertices at the points $(1,0)$, $(3,2)$,
$(3,5)$ and $(1,3)$. Then, the value of the integral $\iint_{S}(x+2 y) d x d y$ is equal to ___


Answer: 42
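
Both slanted edges of the parallelogram have slope $1$, so $S=\{(x, y): 1 \leq x \leq 3, x-1 \leq y \leq x+2\}$. A midpoint-rule sketch (our addition) confirming the value:

```python
# Midpoint-rule approximation of the double integral of x + 2y over
# {(x, y) : 1 <= x <= 3, x - 1 <= y <= x + 2}.
n = 200
hx = 2.0 / n
total = 0.0
for i in range(n):
    x = 1 + (i + 0.5) * hx
    lo, hi = x - 1, x + 2
    hy = (hi - lo) / n
    for j in range(n):
        y = lo + (j + 0.5) * hy
        total += (x + 2 * y) * hx * hy
result = round(total)
```

The integrand is linear, so the midpoint rule is exact here up to floating-point roundoff.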

Problem 15

Let $\alpha=\lim _{n \rightarrow \infty}\left(1+n \sin \frac{3}{n^{2}}\right)^{2 n}$. Then, $\ln \alpha$ is equal to ____


Answer: 6
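
Since $n \sin \frac{3}{n^{2}} \approx \frac{3}{n}$, we get $\ln \alpha=\lim 2 n \cdot \frac{3}{n}=6$. A one-line numerical check (our addition):

```python
import math

# ln of the expression inside the limit, for a large n:
# ln alpha = 2n * ln(1 + n * sin(3 / n^2)), and n * sin(3/n^2) ~ 3/n.
n = 10**6
val = 2 * n * math.log1p(n * math.sin(3 / n**2))  # should approach 6
```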

Problem 16

Let $A=\{(x, y) \in \mathbb{R}^{2}: x^{2}-\frac{1}{2 \sqrt{\pi}}<y<x^{2}+\frac{1}{2 \sqrt{\pi}}\}$ and let the joint probability density function
of $(X, Y)$ be

$$
f(x, y)=\begin{cases}
e^{-(x-1)^{2}}, & (x, y) \in A \\
0, & \text { otherwise }
\end{cases}.
$$

Then, the covariance between the random variables $X$ and $Y$ is equal to ____

Answer: 1

Problem 17

Let $\phi:(-1,1) \rightarrow \mathbb{R}$ be defined by

$$
\phi(x)=\int_{x^{7}}^{x^{4}} \frac{1}{1+t^{3}} d t
$$

If $\alpha=\lim _{x \rightarrow 0} \frac{\phi(x)}{e^{2 x^{4}}-1},$ then $42 \alpha$ is equal to ____


Answer: 21

Problem 18

Let $S=\{(x, y) \in \mathbb{R}^{2}: 0 \leq x \leq \pi, \min \{\sin x, \cos x\} \leq y \leq \max \{\sin x, \cos x\}\}$.
If $\alpha$ is the area of $S$, then the value of $2 \sqrt{2} \alpha$ is equal to ____

Answer: 8

Problem 19

Let the random vector $(X, Y)$ have the joint probability mass function

$f(x, y)=\begin{cases}{10 \choose x}{5 \choose y}(\frac{1}{4})^{x-y+5}(\frac{3}{4})^{y-x+10}, x=0,1, \ldots, 10 ; y=0,1, \ldots, 5 \\ 0, \text { otherwise }\end{cases}$.

Let $Z=Y-X+10 .$ If $\alpha=E(Z)$ and $\beta=Var(Z),$ then $8 \alpha+48 \beta$ is equal to ____

Answer: 225
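
The pmf factors into a $Bin\left(10, \frac{1}{4}\right)$ term in $x$ and a $Bin\left(5, \frac{3}{4}\right)$ term in $y$, so $X$ and $Y$ are independent. An exact enumeration in Python (our addition):

```python
from fractions import Fraction as F
from math import comb

def pmf(x, y):
    return comb(10, x) * comb(5, y) * F(1, 4)**(x - y + 5) * F(3, 4)**(y - x + 10)

support = [(x, y) for x in range(11) for y in range(6)]
assert sum(pmf(x, y) for x, y in support) == 1   # sanity: valid joint pmf
EZ = sum(pmf(x, y) * (y - x + 10) for x, y in support)
EZ2 = sum(pmf(x, y) * (y - x + 10)**2 for x, y in support)
result = 8 * EZ + 48 * (EZ2 - EZ**2)
```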

Problem 20

Let $X_{1}$ and $X_{2}$ be independent $N(0,1)$ random variables. Define

$$
sgn(u)=\begin{cases}
-1, \text { if } u<0 \\ 0, \text { if } u=0 \\ 1, \text { if } u>0
\end{cases}.
$$

Let $Y_{1}=X_{1} sgn\left(X_{2}\right)$ and $Y_{2}=X_{2} sgn\left(X_{1}\right)$. If the correlation coefficient between $Y_{1}$ and $Y_{2}$ is $\alpha$,
then $\pi \alpha$ is equal to ____


Answer: 2
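
Note that $Y_{1} Y_{2}=\left|X_{1} X_{2}\right|$, whose mean is $\frac{2}{\pi}$, while $Y_{1}$ and $Y_{2}$ are each standard normal. A Monte Carlo sanity check (our addition):

```python
import math
import random

# Monte Carlo estimate of corr(Y1, Y2); pi * corr should be close to 2.
random.seed(0)
n = 200000
s1 = s2 = q1 = q2 = cross = 0.0
sgn = lambda u: (u > 0) - (u < 0)
for _ in range(n):
    x1, x2 = random.gauss(0, 1), random.gauss(0, 1)
    y1, y2 = x1 * sgn(x2), x2 * sgn(x1)
    s1 += y1; s2 += y2
    q1 += y1 * y1; q2 += y2 * y2
    cross += y1 * y2
m1, m2 = s1 / n, s2 / n
rho = (cross / n - m1 * m2) / math.sqrt((q1 / n - m1**2) * (q2 / n - m2**2))
```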


IIT JAM MS 2021 Question Paper | Set A | Problems & Solutions

This post discusses the solutions to the problems from IIT JAM Mathematical Statistics (MS) 2021 Question Paper - Set A. You can find solutions in video or written form.

Note: This post is getting updated. Stay tuned for solutions, videos, and more.

IIT JAM Mathematical Statistics (MS) 2021 Problems & Solutions (Set A)

Problem 1

The value of the limit

$$
\lim_{n \rightarrow \infty} \sum_{k=0}^{n}\left(\begin{array}{c}
2 n \\
k
\end{array}\right) \frac{1}{4^{n}}
$$

is equal to

Options-

  1. $\frac{1}{4}$
  2. $\frac{1}{2}$
  3. $1$
  4. $0$


Answer: $\frac{1}{2}$
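
Since $\sum_{k=0}^{2 n}\left(\begin{array}{c}2 n \\ k\end{array}\right)=4^{n}$ and the middle term satisfies $\left(\begin{array}{c}2 n \\ n\end{array}\right) / 4^{n} \rightarrow 0$, the partial sum equals $\frac{1}{2}+\frac{1}{2}\left(\begin{array}{c}2 n \\ n\end{array}\right) / 4^{n} \rightarrow \frac{1}{2}$. An exact check for growing $n$ (our addition):

```python
from fractions import Fraction
from math import comb

# Exact partial sums: sum_{k<=n} C(2n, k) / 4^n = 1/2 + C(2n, n) / (2 * 4^n).
def partial(n):
    return float(Fraction(sum(comb(2 * n, k) for k in range(n + 1)), 4**n))

vals = [partial(n) for n in (10, 100, 2000)]
```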

Problem 2

If the series $\sum_{n=1}^{\infty} a_{n}$ converges absolutely, then which of the following series diverges?

Options-

  1. $\sum_{n=1}^{\infty}\left|a_{2 n}\right|$
  2. $\sum_{n=1}^{\infty}\left(a_{n}\right)^{3}$
  3. $\sum_{n=2}^{\infty}\left(\frac{1}{(\ln n)^{2}}+a_{n}\right)$
  4. $\sum_{n=1}^{\infty} \frac{a_{n}+a_{n+1}}{2}$


Answer: $\sum_{n=2}^{\infty}\left(\frac{1}{(\ln n)^{2}}+a_{n}\right)$

Problem 3

Let $X$ be a $U(0,1)$ random variable and let $Y=X^{2}$. If $\rho$ is the correlation coefficient between the random variables $X$ and $Y$, then $48 \rho^{2}$ is equal to

Options-

  1. $48$
  2. $30$
  3. $45$
  4. $35$


Answer: $45$
Solution:

Problem 4

Let $\{X_{n}\}_{n \geq 1}$ be a sequence of independent and identically distributed random variables with probability density function

$$
f(x)=\begin{cases}
1, & \text { if } 0<x<1 \\
0, & \text { otherwise }
\end{cases}.
$$

Then, the value of the limit

$$
\lim _{n \rightarrow \infty} P\left(\frac{1}{n} \sum_{i=1}^{n} \ln X_{i} \leq-1+\frac{1}{\sqrt{n}}\right)
$$

is equal to

Options -

  1. $0$;
  2. $\Phi(2)$;
  3. $\Phi(1)$;
  4. $\frac{1}{2}$.


Answer: $\Phi(1)$

Problem 5

Let $f: \mathbb{R} \rightarrow \mathbb{R}$ be a function defined by

$$
f(x)=x^{7}+5 x^{3}+11 x+15, x \in \mathbb{R}
$$

Then, which of the following statements is TRUE?

Options -

  1. $f$ is onto but NOT one-one
  2. $f$ is one-one but NOT onto
  3. $f$ is both one-one and onto
  4. $f$ is neither one-one nor onto


Answer: $f$ is both one-one and onto

Problem 6

There are three urns, labeled Urn $1$, Urn $2$ and Urn $3$. Urn $1$ contains $2$ white balls and $2$ black balls, Urn $2$ contains $1$ white ball and $3$ black balls, and Urn $3$ contains $3$ white balls and $1$ black ball. Consider two coins with probabilities of obtaining a head in a single trial of $0.2$ and $0.3$. The two coins are tossed independently once, and an urn is selected according to the following scheme:
Urn $1$ is selected if $2$ heads are obtained; Urn $3$ is selected if $2$ tails are obtained; otherwise Urn $2$ is
selected. A ball is then drawn at random from the selected urn. Then,
$P($ Urn 1 is selected $\mid$ the ball drawn is white $)$ is equal to

Options -

  1. $\frac{12}{109}$
  2. $\frac{1}{18}$
  3. $\frac{6}{109}$
  4. $\frac{1}{9}$


Answer: $\frac{6}{109}$
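
A quick exact Bayes computation (our addition, mirroring the selection scheme in the problem):

```python
from fractions import Fraction as F

p1, p2 = F(1, 5), F(3, 10)           # P(head) = 0.2 and 0.3
select = {1: p1 * p2,                # two heads -> Urn 1
          3: (1 - p1) * (1 - p2)}    # two tails -> Urn 3
select[2] = 1 - select[1] - select[3]
p_white = {1: F(1, 2), 2: F(1, 4), 3: F(3, 4)}
p_white_total = sum(select[u] * p_white[u] for u in (1, 2, 3))
posterior_urn1 = select[1] * p_white[1] / p_white_total
```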

Problem 7

Let $X$ be a random variable with probability density function

$$
f(x)=\frac{1}{2} e^{-|x|}, \quad-\infty<x<\infty
$$

Then, which of the following statements is FALSE?

Options -

  1. $E\left(|X| \sin \left(\frac{X}{|X|}\right)\right)=0$
  2. $E(X|X|)=0$
  3. $E\left(|X| \sin ^{2}\left(\frac{X}{|X|}\right)\right)=0$
  4. $E\left(X|X|^{2}\right)=0$


Answer: $E\left(|X| \sin ^{2}\left(\frac{X}{|X|}\right)\right)=0$

Problem 8

The value of the limit

$$
\lim _{n \rightarrow \infty}\left(\left(1+\frac{1}{n}\right)\left(1+\frac{2}{n}\right) \cdots\left(1+\frac{n}{n}\right)\right)^{\frac{1}{n}}
$$

is equal to

Options-

  1. $\frac{3}{e}$
  2. $\frac{4}{e}$
  3. $e$
  4. $\frac{1}{e}$


Answer: $\frac{4}{e}$
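
Taking logs turns the product into a Riemann sum for $\int_{0}^{1} \ln (1+x) d x=2 \ln 2-1$, so the limit is $e^{2 \ln 2-1}=\frac{4}{e}$. A numerical check (our addition):

```python
import math

# n-th term of the sequence, computed via logs for numerical stability.
def g(n):
    return math.exp(sum(math.log1p(k / n) for k in range(1, n + 1)) / n)

val = g(100000)  # should be close to 4/e
```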

Problem 9

Let $M$ be a $3 \times 3$ real matrix. Let $\left(\begin{array}{l}1 \\ 2 \\ 3\end{array}\right),\left(\begin{array}{l}1 \\ 1 \\ 1\end{array}\right)$ and $\left(\begin{array}{c}0 \\ -1 \\ \alpha\end{array}\right)$ be the eigenvectors of $M$ corresponding to three distinct eigenvalues of $M$, where $\alpha$ is a real number. Then, which of the following is NOT a possible value of $\alpha$?

Options-

  1. $1$
  2. $-2$
  3. $2$
  4. $0$


Answer: $-2$

Problem 10

The value of the limit

$$
\lim _{x \rightarrow 0} \frac{e^{-3 x}-e^{x}+4 x}{5(1-\cos x)}
$$

is equal to

Options -

  1. $\frac{2}{5}$
  2. 0
  3. $\frac{8}{5}$
  4. 1


Answer: $\frac{8}{5}$

Problem 11

Consider a sequence of independent Bernoulli trials with probability of success in each trial as $\frac{1}{3}$. The probability that three successes occur before four failures is equal to

Options -

  1. $\frac{179}{841}$
  2. $\frac{179}{243}$
  3. $\frac{233}{729}$
  4. $\frac{179}{1215}$


Answer: $\frac{179}{1215}$

Problem 12

Let,

$$
S=\sum_{k=1}^{\infty}(-1)^{k-1} \frac{1}{k}\left(\frac{1}{4}\right)^{k} \text { and } T=\sum_{k=1}^{\infty} \frac{1}{k}\left(\frac{1}{5}\right)^{k}
$$

Then, which of the following statements is TRUE?

Options -

  1. $5 S-4 T=0$
  2. $S-T=0$
  3. $16 S-25 T=0$
  4. $4 S-5 T=0$

Answer: $S-T=0$

Problem 13

Let $a_{1}=5$ and define recursively

$$
a_{n+1}=3^{\frac{1}{4}}\left(a_{n}\right)^{\frac{3}{4}}, \quad n \geq 1
$$

Then, which of the following statements is TRUE?

Options-

  1. $\{a_{n}\}$ is monotone decreasing, and $\lim _{n \rightarrow \infty} a_{n}=3$
  2. $\{a_{n}\}$ is decreasing, and $\lim _{n \rightarrow \infty} a_{n}=0$
  3. $\{a_{n}\}$ is non-monotone, and $\lim _{n \rightarrow \infty} a_{n}=3$
  4. $\{a_{n}\}$ is monotone increasing, and $\lim _{n \rightarrow \infty} a_{n}=3$


Answer: $\{a_{n}\}$ is monotone decreasing, and $\lim _{n \rightarrow \infty} a_{n}=3$

Problem 14

Let $E_{1}, E_{2}$ and $E_{3}$ be three events such that $P\left(E_{1}\right)=\frac{4}{5}, P\left(E_{2}\right)=\frac{1}{2}$ and $P\left(E_{3}\right)=\frac{9}{10}$.
Then, which of the following statements is FALSE?

  1. $P\left(E_{1} \cup E_{2} \cup E_{3}\right) \geq \frac{9}{10}$
  2. $P\left(E_{1} \cup E_{2}\right) \geq \frac{4}{5}$
  3. $P\left(E_{2} \cap E_{3}\right) \leq \frac{1}{2}$
  4. $P\left(E_{1} \cap E_{2} \cap E_{3}\right) \leq \frac{1}{6}$


Answer: $P\left(E_{1} \cap E_{2} \cap E_{3}\right) \leq \frac{1}{6}$

Problem 15

Let $E_{1}, E_{2}, E_{3}$ and $E_{4}$ be four events such that
$$
P\left(E_{i} \mid E_{4}\right)=\frac{2}{3}, i=1,2,3 ; P\left(E_{i} \cap E_{j}^{c} \mid E_{4}\right)=\frac{1}{6}, i, j=1,2,3 ; i \neq j \text { and } P\left(E_{1} \cap E_{2} \cap E_{3}^{c} \mid E_{4}\right)=\frac{1}{6}
$$
Then, $P\left(E_{1} \cup E_{2} \cup E_{3} \mid E_{4}\right)$ is equal to

  1. $\frac{1}{2}$
  2. $\frac{5}{6}$
  3. $\frac{2}{3}$
  4. $\frac{7}{12}$


Answer: $\frac{5}{6}$

Problem 16

Let $X$ be a random variable having the probability density function

$$
f(x)=\begin{cases}
e^{-x}, & x>0 \\
0, & x \leq 0
\end{cases}.
$$

Define $Y=[X]$, where $[X]$ denotes the largest integer not exceeding $X$. Then, $E\left(Y^{2}\right)$ is equal to

Options -

  1. $\frac{e+1}{(e-1)^{2}}$
  2. $\frac{(e+1)^{2}}{(e-1)^{2}}$
  3. $\frac{e(e+1)^{2}}{e-1}$
  4. $\frac{e(e+1)}{e-1}$


Answer: $\frac{e+1}{(e-1)^{2}}$

Problem 17

Let $X$ be a continuous random variable with distribution function

$$
F(x)=\begin{cases}
0, & \text { if } x<0 \\
a x^{2}, & \text { if } 0 \leq x<2 \\
1, & \text { if } x \geq 2
\end{cases}.
$$

for some real constant $a$. Then, $E(X)$ is equal to

Options -

  1. $1$
  2. $\frac{4}{3}$
  3. $\frac{1}{4}$
  4. $0$


Answer: $ \frac{4}{3}$

Problem 18

Let $X_{1}, X_{2}, \ldots, X_{n}(n \geq 2)$ be a random sample from $U(\theta-5, \theta+5),$ where $\theta \in(0, \infty)$ is unknown. Let $T=\max \{X_{1}, X_{2}, \ldots, X_{n}\}$ and $U=\min \{X_{1}, X_{2}, \ldots, X_{n}\} .$ Then, which of the following statements is TRUE?

Options -

  1. $U+8$ is an MLE of $\theta$
  2. $\frac{T+U}{2}$ is the unique $\mathrm{MLE}$ of $\theta$
  3. MLE of $\frac{1}{\theta}$ does NOT exist
  4. $\frac{2}{T+U}$ is an $\mathrm{MLE}$ of $\frac{1}{\theta}$


Answer: $\frac{2}{T+U}$ is an $\mathrm{MLE}$ of $\frac{1}{\theta}$

Problem 19

Consider the problem of testing $H_{0}: X \sim f_{0}$ against $H_{1}: X \sim f_{1}$ based on a sample of size 1 , where

$f_{0}(x)=\begin{cases}1, 0 \leq x \leq 1 \\ 0, \text { otherwise }\end{cases}.$ and $f_{1}(x)=\begin{cases}2-2 x , 0 \leq x \leq 1 \\ 0, \text { otherwise }\end{cases}$.

Then, the probability of Type II error of the most powerful test of size $\alpha=0.1$ is equal to

Options -

  1. 0.1
  2. 1
  3. 0.91
  4. 0.81


Answer: 0.81
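
The likelihood ratio $\frac{f_{1}(x)}{f_{0}(x)}=2-2 x$ is decreasing in $x$, so the most powerful test rejects $H_{0}$ for small $x$; size $0.1$ under $U(0,1)$ gives the cut-off $c=0.1$. An exact check (our addition):

```python
from fractions import Fraction as F

# NP test: reject H0 when x < c; size under f0 = U(0,1) is P(X < c) = c.
c = F(1, 10)                 # size alpha = 0.1
power = 2 * c - c**2         # P(X < c) under f1: integral of (2 - 2x) on [0, c]
type2 = 1 - power            # probability of Type II error
```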

Problem 20

Let $X$ and $Y$ be random variables having chi-square distributions with 6 and 3 degrees of freedom, respectively. Then, which of the following statements is TRUE?

Options -

  1. $P(X<6)>P(Y<6)$
  2. $P(X>0.7)>P(Y>0.7)$
  3. $P(X>3)<P(Y>3)$
  4. $P(X>0.7)<P(Y>0.7)$


Answer: $P(X>0.7)>P(Y>0.7)$

Problem 21

Let $f: \mathbb{R}^{2} \rightarrow \mathbb{R}$ be a function defined by

$f(x, y)=\begin{cases}\frac{y^{3}}{x^{2}+y^{2}}, & (x, y) \neq(0,0) \\ 0, & (x, y)=(0,0)\end{cases}$.

Let $f_{x}(x, y)$ and $f_{y}(x, y)$ denote the first order partial derivatives of $f(x, y)$ with respect to $x$ and $y$,
respectively, at the point $(x, y)$. Then, which of the following statements is FALSE?

Options -

  1. $f$ is NOT differentiable at (0,0)
  2. $f_{y}(0,0)$ exists and $f_{y}(x, y)$ is continuous at (0,0)
  3. $f_{y}(x, y)$ exists and is bounded at every $(x, y) \in \mathbb{R}^{2}$
  4. $f_{x}(x, y)$ exists and is bounded at every $(x, y) \in \mathbb{R}^{2}$


Answer: $f_{y}(0,0)$ exists and $f_{y}(x, y)$ is continuous at (0,0)

Problem 22

Let $(X, Y)$ be a random vector with joint moment generating function

$$
M\left(t_{1}, t_{2}\right)=\frac{1}{\left(1-\left(t_{1}+t_{2}\right)\right)\left(1-t_{2}\right)}, \quad-\infty<t_{1}<\infty,-\infty<t_{2}<\min \{1,1-t_{1}\}
$$

Let $Z=X+Y$. Then, $Var(Z)$ is equal to

Options -

  1. $3$
  2. $4$
  3. $5$
  4. $6$


Answer: 5
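
Since $Z=X+Y$ has MGF $M(t, t)=\frac{1}{(1-2 t)(1-t)}$, $Var(Z)$ is the second derivative of the cumulant generating function $\ln M(t, t)$ at $t=0$, namely $4+1=5$. A finite-difference check (our addition):

```python
import math

# K(t) = ln M_Z(t) with M_Z(t) = 1 / ((1 - 2t)(1 - t)); Var(Z) = K''(0).
K = lambda t: -math.log(1 - 2 * t) - math.log(1 - t)
h = 1e-4
var_z = (K(h) - 2 * K(0) + K(-h)) / h**2  # central second difference
```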

Problem 23

Let $X_{1}, X_{2}, \ldots, X_{n}$ be a random sample from an exponential distribution with probability density function

$$
f(x ; \theta)=\begin{cases}
\theta e^{-\theta x}, x>0 \\
0, \text { otherwise }
\end{cases}
$$

where $\theta \in(0, \infty)$ is unknown. Let $\alpha \in(0,1)$ be fixed and let $\beta$ be the power of the most powerful test of size $\alpha$ for testing $H_{0}: \theta=1$ against $H_{1}: \theta=2$.
Consider the critical region

$R=\{\left(x_{1}, x_{2}, \ldots, x_{n}\right) \in \mathbb{R}^{n}: \sum_{i=1}^{n} x_{i}>\frac{1}{2} \chi_{2 n}^{2}(1-\alpha)\}$

where for any $\gamma \in(0,1), \chi_{2 n}^{2}(\gamma)$ is a fixed point such that $P\left(\chi_{2 n}^{2}>\chi_{2 n}^{2}(\gamma)\right)=\gamma .$ Then, the
critical region $R$ corresponds to the

Options-

  1. most powerful test of size $\beta$ for testing $H_{0}^{*}: \theta=2$ against $H_{1}^{*}: \theta=1$
  2. most powerful test of size $\alpha$ for testing $H_{0}: \theta=1$ against $H_{1}: \theta=2$
  3. most powerful test of size $1-\beta$ for testing $H_{0}^{*}: \theta=2$ against $H_{1}^{*}: \theta=1$
  4. most powerful test of size $1-\alpha$ for testing $H_{0}^{*}: \theta=2$ against $H_{1}^{*}: \theta=1$


Answer: most powerful test of size $1-\beta$ for testing $H_{0}^{*}: \theta=2$ against $H_{1}^{*}: \theta=1$

Problem 24

Consider three coins having probabilities of obtaining a head in a single trial as $\frac{1}{4}, \frac{1}{2}$ and $\frac{3}{4}$, respectively. A player selects one of these three coins at random (each coin is equally likely to be selected). If the player tosses the selected coin five times independently, then the probability of obtaining two tails in five tosses is equal to

Options -

  1. $\frac{64}{384}$
  2. $\frac{125}{384}$
  3. $\frac{255}{384}$
  4. $\frac{85}{384}$


Answer: $\frac{85}{384}$

Problem 25

For $a \in \mathbb{R}$, consider the system of linear equations

$\begin{array}{ll}a x+a y & =a+2 \\ x+a y+(a-1) z & =a-4 \\ a x+a y+(a-2) z & =-8\end{array}$

in the unknowns $x, y$ and $z$. Then, which of the following statements is TRUE?

Options -

  1. The given system has a unique solution for $a=-2$
  2. The given system has a unique solution for $a=1$
  3. The given system has infinitely many solutions for $a=-2$
  4. The given system has infinitely many solutions for $a=2$


Answer: The given system has a unique solution for $a=-2$

Problem 26

Let $X$ and $Y$ be independent $N(0,1)$ random variables and $Z=\frac{|X|}{|Y|} .$ Then, which of the
following expectations is finite?

Options -

  1. $E(Z)$
  2. $E\left(\frac{1}{Z \sqrt{Z}}\right)$
  3. $E(Z \sqrt{Z})$
  4. $E\left(\frac{1}{\sqrt{Z}}\right)$

Answer: $E\left(\frac{1}{\sqrt{Z}}\right)$

Problem 27

Let $\{X_{n}\}_{n>1}$ be a sequence of independent and identically distributed $N(0,1)$ random variables.
Then,

$$
\lim _{n \rightarrow \infty} P\left(\frac{\sum_{i=1}^{n} X_{i}^{4}-3 n}{\sqrt{32 n}} \leq \sqrt{6}\right)
$$

is equal to

Options -

  1. 0
  2. $\Phi(\sqrt{2})$
  3. $\frac{1}{2}$
  4. $\Phi(1)$


Answer: $\Phi(\sqrt{2})$

Problem 28

Let $X$ be a continuous random variable having the moment generating function

$$
M(t)=\frac{e^{t}-1}{t}, \quad t \neq 0
$$

Let $\alpha=P\left(48 X^{2}-40 X+3>0\right)$ and $\beta=P\left((\ln X)^{2}+2 \ln X-3>0\right)$.
Then, the value of $\alpha-2 \ln \beta$ is equal to

Options-

  1. $\frac{10}{3}$
  2. $\frac{13}{3}$
  3. $\frac{19}{3}$
  4. $\frac{17}{3}$


Answer: $\frac{19}{3}$
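
The given MGF is that of $U(0,1)$, and both quadratics factor nicely: $48 x^{2}-40 x+3=48\left(x-\frac{1}{12}\right)\left(x-\frac{3}{4}\right)$ and $(\ln x)^{2}+2 \ln x-3=(\ln x+3)(\ln x-1)$. A short check (our addition):

```python
import math
from fractions import Fraction as F

# alpha = P(X < 1/12) + P(X > 3/4) for X ~ U(0,1)
alpha = F(1, 12) + (1 - F(3, 4))
# With X in (0,1), (ln X + 3)(ln X - 1) > 0 forces ln X < -3, so beta = e^{-3}
beta = math.exp(-3.0)
val = float(alpha) - 2 * math.log(beta)  # should be 19/3
```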

Problem 29

Let $X_{1}, X_{2}, \ldots, X_{n}(n \geq 3)$ be a random sample from Poisson $(\theta),$ where $\theta \in(0, \infty)$ is unknown and
let

$$
T=\sum_{i=1}^{n} X_{i}
$$

Then, the uniformly minimum variance unbiased estimator of $e^{-2 \theta} \theta^{3}$

Options -

  1. is $\frac{T}{n}\left(\frac{T}{n}-1\right)\left(\frac{T}{n}-2\right)\left(1-\frac{2}{n}\right)^{T-3}$
  2. is $\frac{T(T-1)(T-2)(n-2)^{T-3}}{n^{T}}$
  3. does NOT exist
  4. is $e^{-\frac{2 T}{n}}\left(\frac{T}{n}\right)^{3}$


Answer: $\frac{T(T-1)(T-2)(n-2)^{T-3}}{n^{T}}$

Problem 30

Let $\{a_{n}\}_{n \geq 1}$ be a sequence of real numbers such that $a_{n} \geq 1$, for all $n \geq 1$. Then, which of the following conditions imply the divergence of $\{a_{n}\}_{n \geq 1} ?$

Options -

  1. $\sum_{n=1}^{\infty} b_{n}$ converges, where $b_{1}=a_{1}$ and $b_{n}=a_{n+1}-a_{n},$ for all $n>1$
  2. $\{\sqrt{a_{n}}\}_{n \geq 1}$ converges
  3. $\lim _{n \rightarrow \infty} \frac{a_{2 n+1}}{a_{2 n}}=\frac{1}{2}$
  4. $\{a_{n}\}_{n \geq 1}$ is non-increasing


Answer: $\lim _{n \rightarrow \infty} \frac{a_{2 n+1}}{a_{2 n}}=\frac{1}{2}$


IIT JAM MS 2021 Question Paper | Set B | Problems & Solutions

This post discusses the solutions to the problems from IIT JAM Mathematical Statistics (MS) 2021 Question Paper - Set B. You can find solutions in video or written form.

Note: This post is getting updated. Stay tuned for solutions, videos, and more.

IIT JAM Mathematical Statistics (MS) 2021 Problems & Solutions (Set B)

Problem 1

A sample of size $n$ is drawn randomly (without replacement) from an urn containing $5 n^{2}$ balls, of which $2 n^{2}$ are red balls and $3 n^{2}$ are black balls. Let $X_{n}$ denote the number of red balls in the selected sample. If $\ell=\lim _{n \rightarrow \infty} \frac{E\left(X_{n}\right)}{n}$ and $m=\lim _{n \rightarrow \infty} \frac{Var\left(X_{n}\right)}{n},$ then which of the following statements is/are TRUE?

Options -

  1. $\frac{\ell}{m}=\frac{5}{3}$
  2. $\ell m=\frac{14}{125}$
  3. $\ell-m=\frac{3}{25}$
  4. $\ell+m=\frac{16}{25}$


Answer: $\frac{\ell}{m}=\frac{5}{3}$; $\ell+m=\frac{16}{25}$

Problem 2

Let $X_{1}, X_{2}, \ldots, X_{n}(n \geq 2)$ be a random sample from a distribution with probability density function

$$
f(x ; \theta)=\begin{cases}
\frac{3 x^{2}}{\theta} e^{-x^{3} / \theta}, x>0 \\
0, \text { otherwise }
\end{cases}.
$$

where $\theta \in(0, \infty)$ is unknown.
If $T=\sum_{i =1}^{n} X_{i}^{3}$, then which of the following statements is/are TRUE?

Options -

  1. $\frac{n-1}{T}$ is the unique uniformly minimum variance unbiased estimator of $\frac{1}{\theta}$
  2. $\frac{n}{T}$ is the unique uniformly minimum variance unbiased estimator of $\frac{1}{\theta}$
  3. $(n-1) \sum_{i=1}^{n} \frac{1}{x_{i}^{3}}$ is the unique uniformly minimum variance unbiased estimator of $\frac{1}{\theta}$
  4. $\frac{n}{T}$ is the MLE of $\frac{1}{\theta}$


Answer:
$\frac{n-1}{T}$ is the unique uniformly minimum variance unbiased estimator of $\frac{1}{\theta}$
$\frac{n}{T}$ is the MLE of $\frac{1}{\theta}$

Problem 3

Consider the linear system $A \underline{x}=\underline{b}$, where $A$ is an $m \times n$ matrix, $\underline{x}$ is an $n \times 1$ vector of unknowns
and $\underline{b}$ is an $m \times 1$ vector. Further, suppose there exists an $m \times 1$ vector $\underline{c}$ such that the linear system $A \underline{x}=\underline{c}$ has no solution. Then, which of the following statements is/are necessarily TRUE?

Options -

  1. If $m \leq n$ and $\underline{d}$ is the first column of $A$, then the linear system $A \underline{x}=\underline{d}$ has a unique solution
  2. If $m>n,$ then the linear system $A \underline{x}=\underline{0}$ has a solution other than $\underline{x}=\underline{0}$
  3. If $m \geq n,$ then $Rank(A)<n$
  4. $Rank(A)<m$


Answer:
$Rank(A)<m$

Problem 4

Let $X_{1}, X_{2}, \ldots, X_{n}(n \geq 2)$ be independent and identically distributed random variables with probability density function

$$
f(x)=\begin{cases}
\frac{1}{x^{2}}, x \geq 1 \\
0, \text { otherwise }
\end{cases}.
$$

Then, which of the following random variables has/have finite expectation?

Options -

  1. $\frac{1}{X_{2}}$
  2. $\sqrt{X_{1}}$
  3. $X_{1}$
  4. $\min \{X_{1}, \ldots, X_{n}\}$


Answer: $\frac{1}{X_{2}}$, $\sqrt{X_{1}}$, $\min \{X_{1}, \ldots, X_{n}\}$

Problem 5

Let $X_{1}, X_{2}, \ldots, X_{n}$ be a random sample from $N(\theta, 1),$ where $\theta \in(-\infty, \infty)$ is unknown. Consider the problem of testing $H_{0}: \theta \leq 0$ against $H_{1}: \theta>0 .$ Let $\beta(\theta)$ denote the power function of the likelihood ratio test of size $\alpha(0<\alpha<1)$ for testing $H_{0}$ against $H_{1}$. Then, which of the following statements is/are TRUE?

Options -

  1. The critical region of the likelihood ratio test of size $\alpha$ is
    $$
    \{\left(x_{1}, x_{2}, \ldots, x_{n}\right) \in \mathbb{R}^{n}: \sqrt{n} \frac{\sum_{i=1}^{n} x_{i}}{n}<\tau_{\alpha}\}
    $$
    where $\tau_{\alpha}$ is a fixed point such that $P\left(Z>\tau_{\alpha}\right)=\alpha, Z \sim N(0,1)$
  2. $\beta(\theta)>\beta(0),$ for all $\theta>0$
  3. The critical region of the likelihood ratio test of size $\alpha$ is
    $$
    \{\left(x_{1}, x_{2}, \ldots, x_{n}\right) \in \mathbb{R}^{n}: \sqrt{n} \frac{\sum_{i=1}^{n} x_{i}}{n}>\tau_{\alpha / 2}\}
    $$
    where $\tau_{\alpha / 2}$ is a fixed point such that $P\left(Z>\tau_{\alpha / 2}\right)=\frac{\alpha}{2}, Z \sim N(0,1)$
  4. $\beta(\theta)<\beta(0),$ for all $\theta>0$


Answer: $\beta(\theta)>\beta(0),$ for all $\theta>0$

Problem 6

Let $X_{1}, X_{2}, \ldots, X_{n}(n \geq 2)$ be a random sample from a distribution with probability density function

$$
f(x ; \theta)=\begin{cases}
\frac{1}{2 \theta}, -\theta \leq x \leq \theta \\
0, |x|>\theta
\end{cases}.
$$

where $\theta \in(0, \infty)$ is unknown. If $R=\min \{X_{1}, X_{2}, \ldots, X_{n}\}$ and $S=\max \{X_{1}, X_{2}, \ldots, X_{n}\},$ then which
of the following statements is/are TRUE?

Options -

  1. $\max \{\left|X_{1}\right|,\left|X_{2}\right|, \ldots,\left|X_{n}\right|\}$ is a complete and sufficient statistic for $\theta$
  2. $S$ is an $\mathrm{MLE}$ of $\theta$
  3. $(R, S)$ is jointly sufficient for $\theta$
  4. Distribution of $\frac{R}{S}$ does NOT depend on $\theta$


Answer:
$\max \{\left|X_{1}\right|,\left|X_{2}\right|, \ldots,\left|X_{n}\right|\}$ is a complete and sufficient statistic for $\theta$
$(R, S)$ is jointly sufficient for $\theta$
Distribution of $\frac{R}{S}$ does NOT depend on $\theta$

Problem 7

Let $X_{1}, X_{2}, \ldots, X_{n}(n \geq 2)$ be a random sample from a distribution with probability density function

$$
f(x ; \theta)=\begin{cases}
\theta x^{\theta-1}, 0 \leq x \leq 1 \\
0, \text { otherwise }
\end{cases}.
$$

where $\theta \in(0, \infty)$ is unknown. Then, which of the following statements is/are TRUE?

Options -

1. There does NOT exist any unbiased estimator of $\frac{1}{\theta}$ which attains the Cramer-Rao lower bound
2. Cramer-Rao lower bound, based on $X_{1}, X_{2}, \ldots, X_{n},$ for the estimand $\theta^{3}$ is $\frac{\theta^{2}}{n}$
3. Cramer-Rao lower bound, based on $X_{1}, X_{2}, \ldots, X_{n},$ for the estimand $\theta^{3}$ is $\frac{9 \theta^{6}}{n}$
4. There exists an unbiased estimator of $\frac{1}{\theta}$ which attains the Cramer-Rao lower bound


Answer:
Cramer-Rao lower bound, based on $X_{1}, X_{2}, \ldots, X_{n},$ for the estimand $\theta^{3}$ is $\frac{9 \theta^{6}}{n}$
There exists an unbiased estimator of $\frac{1}{\theta}$ which attains the Cramer-Rao lower bound
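
A quick way to see why the last statement holds: if $f(x ; \theta)=\theta x^{\theta-1}$ on $(0,1)$, then $-\ln X \sim$ Exponential with rate $\theta$, so $T=-\frac{1}{n} \sum_{i=1}^{n} \ln X_{i}$ is unbiased for $\frac{1}{\theta}$ with variance $\frac{1}{n \theta^{2}}$, which is exactly the Cramer-Rao lower bound for $\frac{1}{\theta}$. A simulation sketch (the values of $\theta$, $n$, and the seed are illustrative assumptions):

```python
import numpy as np

# Sketch: if X has density theta * x^(theta - 1) on (0, 1), then
# -ln X ~ Exponential(rate = theta). Hence T = -(1/n) sum(ln X_i)
# is unbiased for 1/theta with Var(T) = 1/(n * theta^2), which
# equals the Cramer-Rao lower bound for estimating 1/theta.
rng = np.random.default_rng(1)
theta, n, reps = 2.0, 10, 200_000  # illustrative choices

# inverse-CDF sampling: F(x) = x^theta, so X = U^(1/theta)
x = rng.uniform(size=(reps, n)) ** (1.0 / theta)
t = -np.log(x).mean(axis=1)

crlb = 1.0 / (n * theta**2)
print(t.mean())       # close to 1/theta = 0.5
print(t.var(), crlb)  # the two nearly agree
```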

Problem 8

Let $f: \mathbb{R} \rightarrow \mathbb{R}$ be a twice differentiable function. Then, which of the following statements is/are necessarily TRUE?

Options-

  1. $f^{\prime \prime}$ is continuous
  2. $f^{\prime \prime}$ is bounded on (0,1)
  3. If $f^{\prime}(0)=f^{\prime}(1),$ then $f^{\prime \prime}(x)=0$ has a solution in (0,1)
  4. $f^{\prime}$ is bounded on [8,10]


Answer:
If $f^{\prime}(0)=f^{\prime}(1),$ then $f^{\prime \prime}(x)=0$ has a solution in (0,1)
$f^{\prime}$ is bounded on [8,10]

Problem 9

Let $A$ be a $3 \times 3$ real matrix such that $A \neq I_{3}$ and the sum of the entries in each row of $A$ is $1$. Then which of the following statements is/are necessarily TRUE?

Options -

  1. The characteristic polynomial, $p(\lambda),$ of $A+2 A^{2}+A^{3}$ has $(\lambda-4)$ as a factor
  2. $A-I_{3}$ is an invertible matrix
  3. $A$ cannot be an orthogonal matrix
  4. The set $\{\underline{x} \in \mathbb{R}^{3}:\left(A-I_{3}\right) \underline{x}=\underline{0}\}$ has at least two elements $(\underline{x}$ is a column vector)


Answer:
The characteristic polynomial, $p(\lambda),$ of $A+2 A^{2}+A^{3}$ has $(\lambda-4)$ as a factor
The set $\{\underline{x} \in \mathbb{R}^{3}:\left(A-I_{3}\right) \underline{x}=\underline{0}\}$ has at least two elements (since the row sums are $1$, $\left(A-I_{3}\right) \underline{1}=\underline{0}$, so the set contains both $\underline{0}$ and $\underline{1}$)
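
The reason $(\lambda-4)$ is always a factor: since every row of $A$ sums to $1$, the all-ones vector $v$ satisfies $A v=v$, hence $\left(A+2 A^{2}+A^{3}\right) v=4 v$. A numerical sketch with one arbitrary such matrix (the entries below are made up; only the row sums matter):

```python
import numpy as np

# Each row of A sums to 1, so the all-ones vector v satisfies
# A v = v; hence (A + 2A^2 + A^3) v = (1 + 2 + 1) v = 4 v, i.e. 4 is
# always an eigenvalue and (lambda - 4) divides the characteristic
# polynomial. The entries are arbitrary, only the row sums matter.
A = np.array([[0.2, 0.3, 0.5],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])
B = A + 2 * (A @ A) + A @ A @ A
eigs = np.linalg.eigvals(B)
print(np.isclose(eigs, 4.0).any())  # True: 4 is an eigenvalue of B
```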

Problem 10

Consider the function

$$
f(x, y)=3 x^{2}+4 x y+y^{2}, \quad(x, y) \in \mathbb{R}^{2}
$$

If $S=\{(x, y) \in \mathbb{R}^{2}: x^{2}+y^{2}=1\}$, then which of the following statements is/are TRUE?

Options -

  1. The maximum value of $f$ on $S$ is $2+\sqrt{5}$
  2. The maximum value of $f$ on $S$ is $3+\sqrt{5}$
  3. The minimum value of $f$ on $S$ is $3-\sqrt{5}$
  4. The minimum value of $f$ on $S$ is $2-\sqrt{5}$


Answer:
The maximum value of $f$ on $S$ is $2+\sqrt{5}$
The minimum value of $f$ on $S$ is $2-\sqrt{5}$
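
The answer follows from writing $f$ as a quadratic form: $f(x, y)=(x, y) M(x, y)^{T}$ with $M=\begin{pmatrix}3 & 2 \\ 2 & 1\end{pmatrix}$, and the extrema of a quadratic form on the unit circle are the eigenvalues of $M$, here $2 \pm \sqrt{5}$. A quick numerical check:

```python
import numpy as np

# f(x, y) = 3x^2 + 4xy + y^2 = [x y] M [x y]^T for the symmetric
# matrix M below; on the unit circle the max/min of the quadratic
# form are the largest/smallest eigenvalues of M.
M = np.array([[3.0, 2.0],
              [2.0, 1.0]])
lam = np.linalg.eigvalsh(M)  # eigenvalues in ascending order
print(lam)  # approximately [2 - sqrt(5), 2 + sqrt(5)]
```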

Some Useful Links:

IIT JAM Stat Mock Test Toppers

IIT JAM Stat Mock Test Toppers

We are really happy with our students' performance, so we have started naming the Toppers of the IIT JAM Stat Mock Tests. The leader board below ranks them by their scores in these tests.

So, here goes the list:

Mock Test toppers, ranked by score:

IIT JAM Mock Test 1 (Full)
  1. Somyadipta Ghosh - 88.5%
  2. Mainack Paul - 83.7%
  3. Abhradiptaa Ghosh - 78.7%
  4. Prabirkumar Das - 71.2%
  5. Debepsita Mukherjee - 68%

IIT JAM Mock Test 2 (Full)
  1. Somyadipta Ghosh - 74.2%
  2. Mainack Paul - 68.2%
  3. Prabirkumar Das - 58.6%
  4. Saikat Kar - 57.6%
  5. Debepsita Mukherjee - 49.7%

IIT JAM Mathematics Mock Test 1
  1. Bidisha Ghosh - 51.4%
  2. Mainack Paul - 51%
  3. Somyadipta Ghosh - 50.3%

IIT JAM Mathematics Mock Test 2
  1. Abhradiptaa Ghosh - 57.8%
  2. Debepsita Mukherjee - 54.7%
  3. Srija Mukherjee - 52.5%

IIT JAM Statistics Mock Test 1
  1. Somyadipta Ghosh - 68%
  2. Mainack Paul - 64%
  3. Debepsita Mukherjee - 56%
  4. Srija Mukherjee - 52%
  5. Abhradiptaa Ghosh - 52%

IIT JAM Statistics Mock Test 2
  1. Somyadipta Ghosh - 56.7%
  2. Mainack Paul - 56.7%

IIT JAM Probability Mock Test 1
  1. Mainack Paul - 80%
  2. Anis Pakrashi - 76.7%
  3. Somyadipta Ghosh - 76.7%
  4. Prabirkumar Das - 73.3%

IIT JAM Probability Mock Test 2
  1. Abhradiptaa Ghosh - 80%
  2. Mainack Paul - 76%
  3. Srija Mukherjee - 76%
  4. Anis Pakrashi - 68%
  5. Prabirkumar Das - 68%

These Mock Tests are part of our Cheenta Statistics Bronze Learning Path. You can learn more about it here.

Some Useful Links:

Testing of Hypothesis | ISI MStat 2016 PSB Problem 9

This is a problem from the ISI MStat Entrance Examination, 2016 involving the basic idea of Type 1 error of Testing of Hypothesis but focussing on the fundamental relationship of Exponential Distribution and the Geometric Distribution.

The Problem:

Suppose \(X_{1}, X_{2}, \ldots, X_{n}\) is a random sample from an exponential distribution with mean \(\lambda\).

Assume that the observed data is available on \(\left[X_{1}\right], \ldots,\left[X_{n}\right]\), instead of \(X_{1}, \ldots, X_{n},\) where \([x]\) denotes the largest integer less than or equal to \(x\).

Consider a test for \(H_{0}: \lambda=1\) vs \(H_{1}: \lambda>1\) which rejects \(H_{0}\) when \(\sum_{i=1}^{n}\left[X_{i}\right]>c_{n} .\)

Given \(\alpha \in(0,1),\) obtain values of \(c_{n}\) such that the size of the test converges to \(\alpha\) as \(n \rightarrow \infty\).

Prerequisites:

(a) Testing of Hypothesis

(b) Type 1 Error

(c) Exponential Distribution

(d) Relationship of Exponential Distribution and Geometric Distribution

(e) Central Limit Theorem

Solution:

Claim: If \(X\) follows an Exponential distribution with mean \(\lambda\) and \(a>0\), then \(Y=\left[\frac{X}{a}\right]\) follows a Geometric distribution.

Proof:

\(Y\) is clearly discrete, taking values in the set of non-negative integers due to the flooring. For any integer \(n \geq 0\), we have
\(
P(Y=n)=P(X \in[a n, a(n+1)))=\int_{a n}^{a(n+1)} \frac{1}{\lambda} \mathrm{e}^{-x / \lambda} d x=\mathrm{e}^{-a n / \lambda}\left(1-\mathrm{e}^{-a / \lambda}\right)=(1-p)^{n} p,
\)
where \(p=1-e^{-a / \lambda} \in(0,1),\) as \(\lambda>0\) and \(a>0\).
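
The geometric claim can be checked empirically, taking \(\lambda\) to be the mean of the exponential as in the problem statement (the particular values of \(\lambda\), \(a\), the seed, and the tolerances below are illustrative):

```python
import numpy as np

# Empirical check of the claim: if X ~ Exponential with mean lam,
# then Y = floor(X / a) is Geometric(p) on {0, 1, 2, ...} with
# p = 1 - exp(-a / lam). The values of lam, a and the seed are
# arbitrary illustrative choices.
rng = np.random.default_rng(2)
lam, a = 1.5, 1.0
x = rng.exponential(scale=lam, size=500_000)  # scale = mean
y = np.floor(x / a)

p = 1 - np.exp(-a / lam)
print((y == 0).mean(), p)            # empirical vs theoretical P(Y=0)
print((y == 1).mean(), (1 - p) * p)  # empirical vs theoretical P(Y=1)
```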

Testing of Hypothesis

\(H_{0}: \lambda=1\) vs \(H_{1}: \lambda>1\)

We reject \(H_{0}\) when \(\sum_{i=1}^{n}\left[X_{i}\right]>c_{n} .\)

Here, the size of the test, i.e. the Type 1 error (for a simple null hypothesis), is \( \alpha_n = P(S_n > c_{n} | \lambda=1)\).

We want to select \(c_n\) such that \(\alpha_n \to \alpha\).

\(S_n = \sum_{i=1}^{n}\left[X_{i}\right]\) is a sum of \(n\) i.i.d. Geometric(\(p\)) random variables, so \(S_n\) ~ NBinom(\(n,p\)), where \( p = 1-e^{-1} \) under \(H_0\) (take \(a = 1\) in the claim above).

Each \(\left[X_{i}\right]\) has mean \(\frac{1-p}{p}\) and variance \(\frac{1-p}{p^{2}}\). Hence, \(\frac{\sqrt{n}\left(\frac{S_n}{n} - \frac{1-p}{p}\right)}{\sqrt{\frac{1-p}{p^2}}} \rightarrow Z \sim N(0,1)\) by the Central Limit Theorem.

It follows that \( \alpha_n = P(S_n > c_{n} | \lambda=1) \rightarrow P\left(Z > \frac{\sqrt{n}\left(\frac{c_n}{n} - \frac{1-p}{p}\right)}{\sqrt{\frac{1-p}{p^2}}}\right) = \alpha\).

Thus, we need \( \frac{\sqrt{n}\left(\frac{c_n}{n} - \frac{1-p}{p}\right)}{\sqrt{\frac{1-p}{p^2}}} = Z_{\alpha} \), the upper \(\alpha\)-point of \(N(0,1)\).

Solving for \(c_n\) gives \( c_n = \frac{n(1-p)}{p} + Z_{\alpha} \sqrt{\frac{n(1-p)}{p^{2}}} \), where \( p = 1-e^{-1} \).
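
One can check by simulation that a critical value of the form \( c_n = \frac{n(1-p)}{p} + Z_{\alpha} \sqrt{\frac{n(1-p)}{p^{2}}} \) gives a test whose size is close to \(\alpha\). A sketch with \(\alpha = 0.05\) (the hard-coded \(Z_{0.05} \approx 1.6449\), the value of \(n\), and the number of replications are illustrative assumptions):

```python
import numpy as np

# Sketch: under H0 (mean 1, a = 1) each [X_i] is Geometric(p) on
# {0, 1, ...} with p = 1 - exp(-1), mean (1-p)/p, variance (1-p)/p^2.
# The CLT suggests c_n = n(1-p)/p + z_alpha * sqrt(n(1-p)/p^2).
# We estimate the size of the resulting test by simulation.
rng = np.random.default_rng(3)
alpha, z_alpha = 0.05, 1.6449  # z_alpha: upper 5% point of N(0, 1)
n, reps = 400, 20_000          # illustrative choices

p = 1 - np.exp(-1)
c_n = n * (1 - p) / p + z_alpha * np.sqrt(n * (1 - p) / p ** 2)

x = rng.exponential(scale=1.0, size=(reps, n))  # samples under H0
s = np.floor(x).sum(axis=1)
size = (s > c_n).mean()
print(size)  # close to alpha = 0.05
```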

Food for Thought

If X ~ Exponential(\(\lambda\)), then what is the distribution of {X}, the fractional part of X? This question is crucial in getting back the Exponential Distribution from the Geometric Distribution.

In other words, the food for thought asks how we can recover the Exponential Distribution from the Geometric Distribution.

Stay Tuned. Stay Blessed! See you in the next post.