Maximum Likelihood Estimation is an algorithm for finding a reasonable estimator. Personally, it woos my mind: so simple and yet so beautiful. The Method of Moments is simpler, and it doesn't woo me :p. Still, the two have a lot of similarities, so we set off to explore them, and we close with plenty of food for thought. After all, we are all explorers at heart.
We ask "Is MLE = MOM? If not, when?"
We discover a rich relationship between the two, the score function, and much more.
Hints, Solution, and More
Find examples where the Maximum Likelihood and Method of Moments estimates are the same.
Find examples where the Maximum Likelihood and Method of Moments estimates are not the same.
Prove that Maximum Likelihood Estimation is equivalent to solving \(\sum_{i=1}^{n} \frac{\partial}{\partial \theta} \log f\left(X_{i} \mid \theta\right)=0\).
Prove that Method of Moments Estimation is equivalent to solving \(\frac{1}{n} \sum_{i=1}^{n} X_{i}^{k}-\mu_{k}(\theta)=0\).
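The hints about matching and non-matching estimates can be checked numerically. Here is a minimal Python sketch (the distributions, parameter values, and sample sizes are our own choices for illustration): for the Exponential rate, the MLE and the first-moment MOM estimate are literally the same function of the data, while for \(U(0,\theta)\) they differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Exponential with rate lambda = 2: both the MLE and the first-moment
# MOM estimate of the rate are 1 / (sample mean), so they coincide.
x = rng.exponential(scale=1 / 2.0, size=10_000)
mle_exp = 1 / x.mean()
mom_exp = 1 / x.mean()

# Uniform(0, theta) with theta = 3: the MLE is max(X_i) while the
# MOM estimate is 2 * (sample mean) -- generally different numbers.
y = rng.uniform(0, 3.0, size=10_000)
mle_unif = y.max()
mom_unif = 2 * y.mean()

print(mle_exp, mom_exp)    # identical by construction
print(mle_unif, mom_unif)  # both near 3, but not equal to each other
```

The Uniform case is the classic counterexample: the MLE sits at the sample maximum, the MOM estimate at twice the sample mean.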
Let's explore the connection in the video.
Don't forget the food for thought.
Enjoy the video
Build your foundations.
Ace your Exams.
Learn. Enjoy. Practice. Repeat.
Above all, prove that \(E(h(X, \theta))=0\) for both Maximum Likelihood Estimation and Method of Moments Estimation.
In addition, what is the intuition behind the score function? And what is the intuition behind the variance of the score function?
Do you think the Method of Moments and Maximum Likelihood estimates are equal for the exponential family?
Next, we explore a one-parameter family. Can you find the pdf of the distributions for which the two estimates will be the same?
If the estimates are the same, can you comment on the sufficient statistic?
ISI MStat 2020 PSB Problem 9 | Discussion & Solution
ISI MStat 2020 PSB Problem 9
This post discusses Problem 9 of the ISI MStat 2020 PSB Entrance Exam.
A finite population has \(N\) units, with \(x_{i}\) being the value associated with the \(i^{\text {th }}\) unit, \(i=1,2, \ldots, N\). Let \(\bar{x}_{N}\) be the population mean.
A statistician carries out the following experiment.
Step 1: Draw an SRSWOR of size \(n\), call it \(S_{1}\), and denote the sample mean by \(\bar{X}_{n}\).
Step 2: Draw an SRSWR of size \(m\) from \(S_{1}\). The \(x\)-values of the sampled units are denoted by \(\{Y_{1}, \cdots, Y_{m}\}\).
Hints, Solution, and More
\(\tilde{X}\) follows SRSWOR on the population with mean \(\mu\).
\(E_{\tilde{X}}\left(\bar{X}_{n}\right)=\mu\)
\(\tilde{Y} \mid \tilde{X}\) follows SRSWR on \(\tilde{X}\) with mean \(\bar{X}_{n}\)
This post discusses Problem 6 of the ISI MStat 2020 PSB Entrance Exam.
Suppose individuals are classified into three categories \(C_{1}\), \(C_{2}\) and \(C_{3}\).
Let \(p^{2}\), \((1-p)^{2}\) and \(2p(1-p)\) be the respective population proportions, where \(p \in(0,1)\). A random sample of \(N\) individuals is selected from the population and the category of each selected individual is recorded.
For \(i=1,2,3\), let \(X_{i}\) denote the number of individuals in the sample belonging to category \(C_{i}\). Define \(U=X_{1}+\frac{X_{3}}{2}\).
Is \(U\) sufficient for \(p\) ? Justify your answer.
Show that the mean squared error of \(\frac{U}{N}\) is \(\frac{p(1-p)}{2 N}\).
Hints, Solution, and More
Prove that the joint distribution of \((X_1,X_2,X_3)\) follows Multinomial Distribution.
Write the Likelihood of the data.
Use Neyman Factorization to prove the sufficiency of \(U\).
Show that \(\frac{U}{N}\) is unbiased.
Show that \(2U\) follows a Binomial Distribution.
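The unbiasedness and MSE claims in the hints can be sanity-checked by simulation. Below is a rough Python sketch (our own illustration; the values of \(p\), \(N\), and the replication count are arbitrary), taking \(U=X_{1}+\frac{X_{3}}{2}\) so that \(2U=2X_{1}+X_{3}\) counts "successes" in a \(Bin(2N, p)\) sense, as the hints suggest.

```python
import numpy as np

rng = np.random.default_rng(1)
p, N, reps = 0.3, 50, 200_000

# Category probabilities: p^2, (1-p)^2, 2p(1-p)
probs = [p**2, (1 - p)**2, 2 * p * (1 - p)]
counts = rng.multinomial(N, probs, size=reps)  # rows are (X1, X2, X3)

U = counts[:, 0] + counts[:, 2] / 2            # U = X1 + X3/2
mse_hat = np.mean((U / N - p) ** 2)            # simulated MSE of U/N
mse_theory = p * (1 - p) / (2 * N)             # claimed p(1-p)/(2N)
print(mse_hat, mse_theory)                     # should be close
```

The theoretical value follows from \(2U \sim Bin(2N, p)\): \(\operatorname{Var}(U/N)=\frac{\operatorname{Var}(2U)}{4N^{2}}=\frac{p(1-p)}{2N}\), and unbiasedness makes the MSE equal to this variance.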
Do subscribe to our channel to get instant notifications of live sessions, so that you can join us live in the master classes!
Prove that \(\frac{U}{N}\) is the UMVUE of \(p\).
Find the minimal sufficient and complete statistic of \(p\).
For other food for thought, refer to the YouTube video for the full solution.
IIT JAM Statistics Entrance Exam books based on Syllabus
The Indian Institute of Technology has one of the top statistics departments in the country, with great research and placement opportunities. It conducts an entrance exam for aspirants who want to pursue a master's in Statistics, called the IIT JAM MS Entrance Exam. To crack this exam, the best study material is the right set of books. So, here is a list of useful books for IIT JAM MS Entrance Exam preparation, based on the syllabus.
A short note on the IIT JAM MS Entrance Exam: The MS programme offers advanced-level training in the theory, methods, and applications of Statistics, along with specialized training in selected areas of Statistics and allied fields. Depending on the area of specialization, students can pursue an academic/research career in Statistics, Mathematics, Economics, Computer Science, and allied fields.
IIT JAM MS Entrance Exams Books according to the syllabus:
As mentioned on the website, the Entrance Exam mainly consists of 2 topics:
Mathematics (40%)
Probability and Statistics (60%)
Let's start with the Mathematics books required for IIT JAM MS Entrance Exam Preparation:
High School Mathematics - The mathematics part is super easy. Just be fluent in your 10+2 mathematics syllabus and have sound knowledge of Calculus and Linear Algebra. Also, solve past years' problems.
That said, the high school mathematics part carries less weight in the examination.
Let's discuss the books for Probability and Statistics Part, breaking it into different subsections according to the syllabus:
A book named "Solutions to IIT JAM for Mathematical Statistics" by Amit Mishra and Mohd. Arshad covers previous years' solutions up to 2018. You can consider buying that book for your IIT JAM MS preparation.
Other Useful Resources
IIT JAM Statistics Crash Course for 2022
Early bird Registration is Going On. Classes start from 1st week of October, 2021.
IIT JAM MS 2021 Question Paper | Set C | Problems & Solutions
This post discusses the solutions to the problems from IIT JAM Mathematical Statistics (MS) 2021 Question Paper - Set C. You can find solutions in video or written form.
Note: This post is getting updated. Stay tuned for solutions, videos, and more.
IIT JAM Mathematical Statistics (MS) 2021 Problems & Solutions (Set C)
Problem 1
Let $f_{0}$ and $f_{1}$ be the probability mass functions given by
Consider the problem of testing the null hypothesis $H_{0}: X \sim f_{0}$ against $H_{1}: X \sim f_{1}$ based on a single sample $X$. If $\alpha$ and $\beta$, respectively, denote the size and power of the test with critical region $\{x \in \mathbb{R}: x>3\},$ then $10(\alpha+\beta)$ is equal to ______________________
Let $\alpha, \beta$ and $\gamma$ be the eigenvalues of $M=\left[\begin{array}{ccc}0 & 1 & 0 \\ 1 & 3 & 3 \\ -1 & 2 & 2\end{array}\right]$. If $\gamma=1$ and $\alpha>\beta,$ then the value of $2 \alpha+3 \beta$ is ___________________________________
Answer: $7$
Problem 4
Let $S=\{(x, y) \in \mathbb{R}^{2}: 2 \leq x \leq y \leq 4\}$. Then, the value of the integral
$$ \iint_{S} \frac{1}{4-x} d x d y $$
is _______
Answer: 2
Problem 5
Let $M=\left(\begin{array}{cc}5 & -6 \\ 3 & -4\end{array}\right)$ be a $2 \times 2$ matrix. If $\alpha=\det \left(M^{4}-6 I_{2}\right),$ then the value of $\alpha^{2}$ is ________
Answer: 2500
Problem 6
Let $X$ be a random variable with moment generating function
Let $\beta$ denote the length of the curve $y=\ln (\sec x)$ from $x=0$ to $x=\frac{\pi}{4}$. Then, the value of $3 \sqrt{2}\left(e^{\beta}-1\right)$ is equal to _____
Answer: $6$
Problem 10
Let $A=\{(x, y, z) \in \mathbb{R}^{3}: 0 \leq x \leq y \leq z \leq 1\}$. Let $\alpha$ be the value of the integral
$$ \iiint_{A} x y z d x d y d z $$
Then, $384 \alpha$ is equal to _______
Answer: $8$
Problem 11
Let,
$$ a_{n}=\sum_{k=2}^{n}\left(\begin{array}{l} n \\ k \end{array}\right) \frac{2^{k}(n-2)^{n-k}}{n^{n}}, \quad n=2,3, \ldots $$
Then, $e^{2} \lim _{n \rightarrow \infty}\left(1-a_{n}\right)$ is equal to ____
Answer: 3
Problem 12
Let $E_{1}, E_{2}, E_{3}$ and $E_{4}$ be four independent events such that $P\left(E_{1}\right)=\frac{1}{2}, P\left(E_{2}\right)=\frac{1}{3}, P\left(E_{3}\right)=\frac{1}{4}$ and $P\left(E_{4}\right)=\frac{1}{5} .$ Let $p$ be the probability that at most two events among $E_{1}, E_{2}, E_{3}$ and $E_{4}$ occur. Then, $240 p$ is equal to ____
Answer: 218
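The count 218 can be verified by brute-force enumeration over which of the four independent events occur. Here is a quick Python check using exact rational arithmetic:

```python
from itertools import product
from fractions import Fraction

# P(E1), ..., P(E4) for the four independent events
ps = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 4), Fraction(1, 5)]

p_at_most_two = Fraction(0)
for outcome in product([0, 1], repeat=4):   # 1 = the event occurs
    if sum(outcome) <= 2:
        prob = Fraction(1)
        for occurred, p in zip(outcome, ps):
            prob *= p if occurred else 1 - p
        p_at_most_two += prob

print(240 * p_at_most_two)  # 218
```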
Problem 13
The number of real roots of the polynomial
$$ f(x)=x^{11}-13 x+5 $$
is ____
Answer:$3$
Problem 14
Let $S \subseteq \mathbb{R}^{2}$ be the region bounded by the parallelogram with vertices at the points $(1,0)$, $(3,2)$, $(3,5)$ and $(1,3)$. Then, the value of the integral $\iint_{S}(x+2 y) d x d y$ is equal to ___
Answer: 42
Problem 15
Let $\alpha=\lim _{n \rightarrow \infty}\left(1+n \sin \frac{3}{n^{2}}\right)^{2 n}$. Then, $\ln \alpha$ is equal to ____
Answer: 6
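A quick numerical sanity check for Problem 15: since $n \sin \frac{3}{n^{2}} \approx \frac{3}{n}$, we have $\ln \alpha=\lim_{n \rightarrow \infty} 2 n \ln \left(1+n \sin \frac{3}{n^{2}}\right)=6$. Evaluating the log-expression at a large $n$ in Python (our own check, not part of the paper):

```python
import math

# Evaluate 2n * log(1 + n*sin(3/n^2)) at a large n; since
# n*sin(3/n^2) ~ 3/n, this should approach ln(alpha) = 6.
n = 10**6
log_alpha = 2 * n * math.log1p(n * math.sin(3 / n**2))
print(log_alpha)  # approximately 6
```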
Problem 16
Let $A=\{(x, y) \in \mathbb{R}^{2}: x^{2}-\frac{1}{2 \sqrt{\pi}}<y<x^{2}+\frac{1}{2 \sqrt{\pi}}\}$ and let the joint probability density function of $(X, Y)$ be
Then, the covariance between the random variables $X$ and $Y$ is equal to ____
Answer: 1
Problem 17
Let $\phi:(-1,1) \rightarrow \mathbb{R}$ be defined by
$$ \phi(x)=\int_{x^{7}}^{x^{4}} \frac{1}{1+t^{3}} d t $$
If $\alpha=\lim _{x \rightarrow 0} \frac{\phi(x)}{e^{2 x^{4}-1}},$ then $42 \alpha$ is equal to ____
Answer: 21
Problem 18
Let $S=\{(x, y) \in \mathbb{R}^{2}: 0 \leq x \leq \pi, \min \{\sin x, \cos x\} \leq y \leq \max \{\sin x, \cos x\}\}$. If $\alpha$ is the area of $S$, then the value of $2 \sqrt{2} \alpha$ is equal to ____
Answer: 8
Problem 19
Let the random vector $(X, Y)$ have the joint probability mass function
Let $Z=Y-X+10 .$ If $\alpha=E(Z)$ and $\beta=Var(Z),$ then $8 \alpha+48 \beta$ is equal to ____
Answer: 225
Problem 20
Let $X_{1}$ and $X_{2}$ be independent $N(0,1)$ random variables. Define
$$ \operatorname{sgn}(u)=\begin{cases} -1, & \text { if } u<0 \\ 0, & \text { if } u=0 \\ 1, & \text { if } u>0 \end{cases} $$
Let $Y_{1}=X_{1} \operatorname{sgn}\left(X_{2}\right)$ and $Y_{2}=X_{2} \operatorname{sgn}\left(X_{1}\right)$. If the correlation coefficient between $Y_{1}$ and $Y_{2}$ is $\alpha$, then $\pi \alpha$ is equal to ____
IIT JAM MS 2021 Question Paper | Set A | Problems & Solutions
This post discusses the solutions to the problems from IIT JAM Mathematical Statistics (MS) 2021 Question Paper - Set A. You can find solutions in video or written form.
Note: This post is getting updated. Stay tuned for solutions, videos, and more.
IIT JAM Mathematical Statistics (MS) 2021 Problems & Solutions (Set A)
Problem 1
The value of the limit
$$ \lim_{n \rightarrow \infty} \sum_{k=0}^{n}\left(\begin{array}{c} 2 n \\ k \end{array}\right) \frac{1}{4^{n}} $$
is equal to
Options-
$\frac{1}{4}$
$\frac{1}{2}$
$1$
$0$
Answer: $\frac{1}{2}$
Problem 2
If the series $\sum_{n=1}^{\infty} a_{n}$ converges absolutely, then which of the following series diverges?
Let $X$ be a $U(0,1)$ random variable and let $Y=X^{2}$. If $\rho$ is the correlation coefficient between the random variables $X$ and $Y$, then $48 \rho^{2}$ is equal to
Options-
$48$
$30$
$45$
$35$
Answer: $45$
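Since every moment of $U(0,1)$ is $E(X^{k})=\frac{1}{k+1}$, the correlation between $X$ and $Y=X^{2}$ can be computed exactly. A short Python check with rational arithmetic (our own sketch):

```python
from fractions import Fraction

def moment(k):
    # E[X^k] = 1/(k+1) for X ~ U(0,1)
    return Fraction(1, k + 1)

cov = moment(3) - moment(1) * moment(2)  # Cov(X, X^2) = E[X^3] - E[X]E[X^2]
var_x = moment(2) - moment(1) ** 2
var_y = moment(4) - moment(2) ** 2       # Y = X^2
rho_sq = cov**2 / (var_x * var_y)
print(48 * rho_sq)  # 45
```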
Problem 4
Let $\{X_{n}\}_{n \geq 1}$ be a sequence of independent and identically distributed random variables with probability density function
$$ f(x)=\begin{cases} 1, & \text { if } 0<x<1 \\ 0, & \text { otherwise } \end{cases} $$
Then, the value of the limit
$$ \lim _{n \rightarrow \infty} P\left(-\frac{1}{n} \sum_{i=1}^{n} \ln X_{i} \leq 1+\frac{1}{\sqrt{n}}\right) $$
is equal to
Options -
$0$;
$\Phi(2)$;
$\Phi(1)$;
$\frac{1}{2}$.
Answer: $\Phi(1)$
Problem 5
Let $f: \mathbb{R} \rightarrow \mathbb{R}$ be a function defined by
$$ f(x)=x^{7}+5 x^{3}+11 x+15, x \in \mathbb{R} $$
Then, which of the following statements is TRUE?
Options -
$f$ is onto but NOT one-one
$f$ is one-one but NOT onto
$f$ is both one-one and onto
$f$ is neither one-one nor onto
Answer: $f$ is both one-one and onto
Problem 6
There are three urns, labeled Urn $1$, Urn $2$ and Urn $3$. Urn $1$ contains $2$ white balls and $2$ black balls, Urn $2$ contains $1$ white ball and $3$ black balls, and Urn $3$ contains $3$ white balls and $1$ black ball. Consider two coins with probabilities of obtaining a head in a single trial equal to $0.2$ and $0.3$. The two coins are tossed independently once, and an urn is selected according to the following scheme: Urn $1$ is selected if $2$ heads are obtained; Urn $3$ is selected if $2$ tails are obtained; otherwise Urn $2$ is selected. A ball is then drawn at random from the selected urn. Then $P($ Urn 1 is selected $\mid$ the ball drawn is white $)$ is equal to
Options -
$\frac{12}{109}$
$\frac{1}{18}$
$\frac{6}{109}$
$\frac{1}{9}$
Answer: $\frac{6}{109}$
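This is a direct Bayes computation: $P(\text{Urn } 1 \mid \text{white})=\frac{P(\text{Urn } 1)\,P(\text{white} \mid \text{Urn } 1)}{P(\text{white})}$. A short Python check with exact fractions:

```python
from fractions import Fraction

# Coin head probabilities
p1, p2 = Fraction(2, 10), Fraction(3, 10)

# Urn selection: Urn 1 on HH, Urn 3 on TT, Urn 2 otherwise
p_urn1 = p1 * p2
p_urn3 = (1 - p1) * (1 - p2)
p_urn2 = 1 - p_urn1 - p_urn3

# P(white | urn): Urn 1 has 2W/2B, Urn 2 has 1W/3B, Urn 3 has 3W/1B
w = {1: Fraction(2, 4), 2: Fraction(1, 4), 3: Fraction(3, 4)}

p_white = p_urn1 * w[1] + p_urn2 * w[2] + p_urn3 * w[3]
posterior = p_urn1 * w[1] / p_white
print(posterior)  # 6/109
```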
Problem 7
Let $X$ be a random variable with probability density function
Let $M$ be a $3 \times 3$ real matrix. Let $\left(\begin{array}{l}1 \\ 2 \\ 3\end{array}\right),\left(\begin{array}{l}1 \\ 1 \\ 1\end{array}\right)$ and $\left(\begin{array}{c}0 \\ -1 \\ \alpha\end{array}\right)$ be the eigenvectors of $M$ corresponding to three distinct eigenvalues of $M$, where $\alpha$ is a real number. Then, which of the following is NOT a possible value of $\alpha$?
Options-
$1$
$-2$
$2$
$0$
Answer: $-2$
Problem 10
The value of the limit
$$ \lim _{x \rightarrow 0} \frac{e^{-3 x}-e^{x}+4 x}{5(1-\cos x)} $$ is equal to
Options -
$\frac{2}{5}$
0
$\frac{8}{5}$
1
Answer: $\frac{8}{5}$
Problem 11
Consider a sequence of independent Bernoulli trials with probability of success in each trial as $\frac{1}{3}$. The probability that three successes occur before four failures is equal to
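One way to attack Problem 11: three successes occur before four failures exactly when the first $3+4-1=6$ trials contain at least three successes. A quick Python check of this count with exact rational arithmetic (our own sketch; the options for this problem are not reproduced above):

```python
from fractions import Fraction
from math import comb

p = Fraction(1, 3)  # probability of success in each Bernoulli trial

# Three successes occur before four failures iff the first 6 trials
# contain at least 3 successes.
prob = sum(comb(6, k) * p**k * (1 - p)**(6 - k) for k in range(3, 7))
print(prob)  # 233/729
```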
$$ S=\sum_{k=1}^{\infty}(-1)^{k-1} \frac{1}{k}\left(\frac{1}{4}\right)^{k} \text { and } T=\sum_{k=1}^{\infty} \frac{1}{k}\left(\frac{1}{5}\right)^{k} $$
Then, which of the following statements is TRUE?
Options -
$5 S-4 T=0$
$S-T=0$
$16 S-25 T=0$
$4 S-5 T=0$
Answer: $S-T=0$
IIT JAM 2021 - Problem 13
Let $a_{1}=5$ and define recursively
$$ a_{n+1}=3^{\frac{1}{4}}\left(a_{n}\right)^{\frac{3}{4}}, \quad n \geq 1 $$
Then, which of the following statements is TRUE?
Options-
$\{a_{n}\}$ is monotone decreasing, and $\lim _{n \rightarrow \infty} a_{n}=3$
$\{a_{n}\}$ is decreasing, and $\lim _{n \rightarrow \infty} a_{n}=0$
$\{a_{n}\}$ is non-monotone, and $\lim _{n \rightarrow \infty} a_{n}=3$
$\{a_{n}\}$ is monotone increasing, and $\lim _{n \rightarrow \infty} a_{n}=3$
Answer: $\{a_{n}\}$ is monotone decreasing, and $\lim _{n \rightarrow \infty} a_{n}=3$
Problem 14
Let $E_{1}, E_{2}$ and $E_{3}$ be three events such that $P\left(E_{1}\right)=\frac{4}{5}, P\left(E_{2}\right)=\frac{1}{2}$ and $P\left(E_{3}\right)=\frac{9}{10}$. Then, which of the following statements is FALSE?
Let $E_{1}, E_{2}, E_{3}$ and $E_{4}$ be four events such that $$ P\left(E_{i} \mid E_{4}\right)=\frac{2}{3}, i=1,2,3 ; \quad P\left(E_{i} \cap E_{j}^{c} \mid E_{4}\right)=\frac{1}{6}, i, j=1,2,3 ; i \neq j \text { and } P\left(E_{1} \cap E_{2} \cap E_{3}^{c} \mid E_{4}\right)=\frac{1}{6} $$ Then, $P\left(E_{1} \cup E_{2} \cup E_{3} \mid E_{4}\right)$ is equal to
Options -
$\frac{1}{2}$
$\frac{5}{6}$
$\frac{2}{3}$
$\frac{7}{12}$
Answer: $\frac{5}{6}$
Problem 16
Let $X$ be a random variable having the probability density function
Define $Y=[X]$, where $[X]$ denotes the largest integer not exceeding $X$. Then, $E\left(Y^{2}\right)$ is equal to
Options -
$\frac{e+1}{(e-1)^{2}}$
$\frac{(e+1)^{2}}{(e-1)^{2}}$
$\frac{e(e+1)^{2}}{e-1}$
$\frac{e(e+1)}{e-1}$
Answer: $\frac{e+1}{(e-1)^{2}}$
Problem 17
Let $X$ be a continuous random variable with distribution function
$$ F(x)=\begin{cases} 0, & \text { if } x<0 \\ a x^{2}, & \text { if } 0 \leq x<2 \\ 1, & \text { if } x \geq 2 \end{cases}. $$
for some real constant $a$. Then, $E(X)$ is equal to
Options -
$1$
$\frac{4}{3}$
$\frac{1}{4}$
0
Answer: $ \frac{4}{3}$
Problem 18
Let $X_{1}, X_{2}, \ldots, X_{n}(n \geq 2)$ be a random sample from $U(\theta-5, \theta+5),$ where $\theta \in(0, \infty)$ is unknown. Let $T=\max \{X_{1}, X_{2}, \ldots, X_{n}\}$ and $U=\min \{X_{1}, X_{2}, \ldots, X_{n}\} .$ Then, which of the following statements is TRUE?
Options -
$U+8$ is an MLE of $\theta$
$\frac{T+U}{2}$ is the unique $\mathrm{MLE}$ of $\theta$
MLE of $\frac{1}{\theta}$ does NOT exist
$\frac{2}{T+U}$ is an $\mathrm{MLE}$ of $\frac{1}{\theta}$
Answer: $\frac{2}{T+U}$ is an $\mathrm{MLE}$ of $\frac{1}{\theta}$
Problem 19
Consider the problem of testing $H_{0}: X \sim f_{0}$ against $H_{1}: X \sim f_{1}$ based on a sample of size 1 , where
$f_{0}(x)=\begin{cases}1, & 0 \leq x \leq 1 \\ 0, & \text { otherwise }\end{cases}$ and $f_{1}(x)=\begin{cases}2-2 x, & 0 \leq x \leq 1 \\ 0, & \text { otherwise }\end{cases}$.
Then, the probability of Type II error of the most powerful test of size $\alpha=0.1$ is equal to
Options -
0.1
1
0.91
0.81
Answer: 0.81
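To see why: by the Neyman-Pearson lemma, the most powerful test rejects where the likelihood ratio $f_{1}(x) / f_{0}(x)=2-2 x$ is large, i.e., on an interval $[0, c]$; the size under the uniform $f_{0}$ is $c$, so $c=0.1$. A small Python check with exact fractions:

```python
from fractions import Fraction

# NP test rejects on [0, c]; size under f0 = U(0,1) is c, so c = 0.1.
c = Fraction(1, 10)
# Power = integral of (2 - 2x) over [0, c] = 2c - c^2
power = 2 * c - c**2
type2 = 1 - power
print(type2)  # 81/100
```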
Problem 20
Let $X$ and $Y$ be random variables having chi-square distributions with 6 and 3 degrees of freedom, respectively. Then, which of the following statements is TRUE?
Options -
$P(X<6)>P(Y<6)$
$P(X>0.7)>P(Y>0.7)$
$P(X>3)<P(Y>3)$
$P(X>0.7)<P(Y>0.7)$
Answer: $P(X>0.7)>P(Y>0.7)$
Problem 21
Let $f: \mathbb{R}^{2} \rightarrow \mathbb{R}$ be a function defined by
Let $f_{x}(x, y)$ and $f_{y}(x, y)$ denote the first order partial derivatives of $f(x, y)$ with respect to $x$ and $y$, respectively, at the point $(x, y)$. Then, which of the following statements is FALSE?
Options -
$f$ is NOT differentiable at (0,0)
$f_{y}(0,0)$ exists and $f_{y}(x, y)$ is continuous at (0,0)
$f_{y}(x, y)$ exists and is bounded at every $(x, y) \in \mathbb{R}^{2}$
$f_{x}(x, y)$ exists and is bounded at every $(x, y) \in \mathbb{R}^{2}$
Answer: $f_{y}(0,0)$ exists and $f_{y}(x, y)$ is continuous at (0,0)
Problem 22
Let $(X, Y)$ be a random vector with joint moment generating function
where $\theta \in(0, \infty)$ is unknown. Let $\alpha \in(0,1)$ be fixed and let $\beta$ be the power of the most powerful test of size $\alpha$ for testing $H_{0}: \theta=1$ against $H_{1}: \theta=2$. Consider the critical region
where for any $\gamma \in(0,1), \chi_{2 n}^{2}(\gamma)$ is a fixed point such that $P\left(\chi_{2 n}^{2}>\chi_{2 n}^{2}(\gamma)\right)=\gamma$. Then, the critical region $R$ corresponds to the
Options-
most powerful test of size $\beta$ for testing $H_{0}^{*}: \theta=2$ against $H_{1}^{*}: \theta=1$
most powerful test of size $\alpha$ for testing $H_{0}: \theta=1$ against $H_{1}: \theta=2$
most powerful test of size $1-\beta$ for testing $H_{0}^{*}: \theta=2$ against $H_{1}^{*}: \theta=1$
most powerful test of size $1-\alpha$ for testing $H_{0}^{*}: \theta=2$ against $H_{1}^{*}: \theta=1$
Answer: most powerful test of size $\alpha$ for testing $H_{0}^{*}: \theta=2$ against $H_{1}^{*}: \theta=1$ [No Option]
Problem 24
Consider three coins having probabilities of obtaining head in a single trial as $\frac{1}{4}, \frac{1}{2}$ and $\frac{3}{4}$, respectively, A player selects one of these three coins at random (each coin is equally likely to be selected). If the player tosses the selected coin five times independently, then the probability of obtaining two tails in five tosses is equal to
Options -
$\frac{64}{384}$
$\frac{125}{384}$
$\frac{255}{384}$
$\frac{85}{384}$
Answer: $\frac{85}{384}$
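This is a mixture computation: condition on the chosen coin and average the Binomial probability of exactly two tails. A quick Python check with exact fractions:

```python
from fractions import Fraction
from math import comb

heads = [Fraction(1, 4), Fraction(1, 2), Fraction(3, 4)]

# Each coin is equally likely; the tails probability is 1 - h, and we
# want exactly 2 tails (hence 3 heads) in 5 independent tosses.
p_two_tails = sum(Fraction(1, 3) * comb(5, 2) * (1 - h) ** 2 * h**3
                  for h in heads)
print(p_two_tails)  # 85/384
```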
Problem 25
For $a \in \mathbb{R}$, consider the system of linear equations
$\begin{array}{ll}a x+a y & =a+2 \\ x+a y+(a-1) z & =a-4 \\ a x+a y+(a-2) z & =-8\end{array}$
in the unknowns $x, y$ and $z$. Then, which of the following statements is TRUE?
Options -
The given system has a unique solution for $a=-2$
The given system has a unique solution for $a=1$
The given system has infinitely many solutions for $a=-2$
The given system has infinitely many solutions for $a=2$
Answer: The given system has a unique solution for $a=-2$
Problem 26
Let $X$ and $Y$ be independent $N(0,1)$ random variables and $Z=\frac{|X|}{|Y|} .$ Then, which of the following expectations is finite?
Options -
$E(Z)$
$E\left(\frac{1}{Z \sqrt{Z}}\right)$
$E(Z \sqrt{Z})$
$E\left(\frac{1}{\sqrt{Z}}\right)$
Answer: $E\left(\frac{1}{\sqrt{Z}}\right)$
Problem 27
Let $\{X_{n}\}_{n>1}$ be a sequence of independent and identically distributed $N(0,1)$ random variables. Then,
$$ \lim _{n \rightarrow \infty} P\left(\frac{\sum_{i=1}^{n} X_{i}^{4}-3 n}{\sqrt{32 n}} \leq \sqrt{6}\right) $$ is equal to
Options -
0
$\Phi(\sqrt{2})$
$\frac{1}{2}$
$\Phi(1)$
Answer: $\Phi(\sqrt{2})$
Problem 28
Let $X$ be a continuous random variable having the moment generating function
$$ M(t)=\frac{e^{t}-1}{t}, \quad t \neq 0 $$
Let $\alpha=P\left(48 X^{2}-40 X+3>0\right)$ and $\beta=P\left((\ln X)^{2}+2 \ln X-3>0\right)$. Then, the value of $\alpha-2 \ln \beta$ is equal to
Options-
$\frac{10}{3}$
$\frac{13}{3}$
$\frac{19}{3}$
$\frac{17}{3}$
Answer: $\frac{19}{3}$
Problem 29
Let $X_{1}, X_{2}, \ldots, X_{n}(n \geq 3)$ be a random sample from Poisson $(\theta),$ where $\theta \in(0, \infty)$ is unknown and let
$$ T=\sum_{i=1}^{n} X_{i} $$
Then, the uniformly minimum variance unbiased estimator of $e^{-2 \theta} \theta^{3}$
Options -
is $\quad \frac{T}{n}\left(\frac{T}{n}-1\right)\left(\frac{T}{n}-2\right)\left(1-\frac{2}{n}\right)^{T-3}$
is $\frac{T(T-1)(T-2)(n-2)^{T-3}}{n^{T}}$
does NOT exist
is $e^{-\frac{2 T}{n}}\left(\frac{T}{n}\right)^{3}$
Answer: $\frac{T(T-1)(T-2)(n-2)^{T-3}}{n^{T}}$
Problem 30
Let $\{a_{n}\}_{n \geq 1}$ be a sequence of real numbers such that $a_{n} \geq 1$, for all $n \geq 1$. Then, which of the following conditions imply the divergence of $\{a_{n}\}_{n \geq 1} ?$
Options -
$\sum_{n=1}^{\infty} b_{n}$ converges, where $b_{1}=a_{1}$ and $b_{n}=a_{n+1}-a_{n},$ for all $n>1$
IIT JAM MS 2021 Question Paper | Set B | Problems & Solutions
This post discusses the solutions to the problems from IIT JAM Mathematical Statistics (MS) 2021 Question Paper - Set B. You can find solutions in video or written form.
Note: This post is getting updated. Stay tuned for solutions, videos, and more.
IIT JAM Mathematical Statistics (MS) 2021 Problems & Solutions (Set B)
Problem 1
A sample of size $n$ is drawn randomly (without replacement) from an urn containing $5 n^{2}$ balls, of which $2 n^{2}$ are red balls and $3 n^{2}$ are black balls. Let $X_{n}$ denote the number of red balls in the selected sample. If $\ell=\lim _{n \rightarrow \infty} \frac{E\left(X_{n}\right)}{n}$ and $m=\lim _{n \rightarrow \infty} \frac{\operatorname{Var}\left(X_{n}\right)}{n},$ then which of the following statements is/are TRUE?
where $\theta \in(0, \infty)$ is unknown. If $T=\sum_{i =1}^{n} X_{i}^{3}$, then which of the following statements is/are TRUE?
Options -
$\frac{n-1}{T}$ is the unique uniformly minimum variance unbiased estimator of $\frac{1}{\theta}$
$\frac{n}{T}$ is the unique uniformly minimum variance unbiased estimator of $\frac{1}{\theta}$
$(n-1) \sum_{i=1}^{n} \frac{1}{x_{i}^{3}}$ is the unique uniformly minimum variance unbiased estimator of $\frac{1}{\theta}$
$\frac{n}{T}$ is the MLE of $\frac{1}{\theta}$
Answer: $\frac{n-1}{T}$ is the unique uniformly minimum variance unbiased estimator of $\frac{1}{\theta}$; $\frac{n}{T}$ is the MLE of $\frac{1}{\theta}$
Problem 3
Consider the linear system $A \underline{x}=\underline{b}$, where $A$ is an $m \times n$ matrix, $\underline{x}$ is an $n \times 1$ vector of unknowns and $\underline{b}$ is an $m \times 1$ vector. Further, suppose there exists an $m \times 1$ vector $\underline{c}$ such that the linear system $A \underline{x}=\underline{c}$ has NO solution. Then, which of the following statements is/are necessarily TRUE?
Options -
If $m \leq n$ and $\underline{d}$ is the first column of $A$, then the linear system $A \underline{x}=\underline{d}$ has a unique solution
If $m>n,$ then the linear system $A \underline{x}=\underline{0}$ has a solution other than $\underline{x}=\underline{0}$
If $m \geq n,$ then $Rank(A)<n$
$Rank(A)<m$
Answer: $Rank(A)<m$
Problem 4
Let $X_{1}, X_{2}, \ldots, X_{n}(n \geq 2)$ be independent and identically distributed random variables with probability density function
Let $X_{1}, X_{2}, \ldots, X_{n}$ be a random sample from $N(\theta, 1),$ where $\theta \in(-\infty, \infty)$ is unknown. Consider the problem of testing $H_{0}: \theta \leq 0$ against $H_{1}: \theta>0$. Let $\beta(\theta)$ denote the power function of the likelihood ratio test of size $\alpha(0<\alpha<1)$ for testing $H_{0}$ against $H_{1}$. Then, which of the following statements is/are TRUE?
Options -
1.The critical region of the likelihood test of size $\alpha$ is $$ \{\left(x_{1}, x_{2}, \ldots, x_{n}\right) \in \mathbb{R}^{n}: \sqrt{n} \frac{\sum_{i=1}^{n} x_{i}}{n}<\tau_{\alpha}\} $$ where $\tau_{\alpha}$ is a fixed point such that $P\left(Z>\tau_{\alpha}\right)=\alpha, Z \sim N(0,1)$
$\beta(\theta)>\beta(0),$ for all $\theta>0$
The critical region of the likelihood test of size $\alpha$ is
$$ \{\left(x_{1}, x_{2}, \ldots, x_{n}\right) \in \mathbb{R}^{n}: \sqrt{n} \frac{\sum_{i=1}^{n} x_{i}}{n}>\tau_{\alpha / 2}\} $$
where $\tau_{\alpha / 2}$ is a fixed point such that $P\left(Z>\tau_{\alpha / 2}\right)=\frac{\alpha}{2}, Z \sim N(0,1)$
$\beta(\theta)<\beta(0),$ for all $\theta>0$
Answer: $\beta(\theta)>\beta(0),$ for all $\theta>0$
Problem 6
Let $X_{1}, X_{2}, \ldots, X_{n}(n \geq 2)$ be a random sample from a distribution with probability density function
where $\theta \in(0, \infty)$ is unknown. If $R=\min \{X_{1}, X_{2}, \ldots, X_{n}\}$ and $S=\max \{X_{1}, X_{2}, \ldots, X_{n}\},$ then which of the following statements is/are TRUE?
Options -
$\max \{\left|X_{1}\right|,\left|X_{2}\right|, \ldots,\left|X_{n}\right|\}$ is a complete and sufficient statistic for $\theta$
$S$ is an $\mathrm{MLE}$ of $\theta$
$(R, S)$ is jointly sufficient for $\theta$
Distribution of $\frac{R}{S}$ does NOT depend on $\theta$
Answer: $\max \{\left|X_{1}\right|,\left|X_{2}\right|, \ldots,\left|X_{n}\right|\}$ is a complete and sufficient statistic for $\theta$; $(R, S)$ is jointly sufficient for $\theta$; the distribution of $\frac{R}{S}$ does NOT depend on $\theta$
Problem 7
Let $X_{1}, X_{2}, \ldots, X_{n}(n \geq 2)$ be a random sample from a distribution with probability density function
where $\theta \in(0, \infty)$ is unknown. Then, which of the following statements is/are TRUE?
Options -
There does NOT exist any unbiased estimator of $\frac{1}{\theta}$ which attains the Cramer-Rao lower bound
The Cramer-Rao lower bound, based on $X_{1}, X_{2}, \ldots, X_{n},$ for the estimand $\theta^{3}$ is $\frac{\theta^{2}}{n}$
The Cramer-Rao lower bound, based on $X_{1}, X_{2}, \ldots, X_{n},$ for the estimand $\theta^{3}$ is $\frac{9 \theta^{6}}{n}$
There exists an unbiased estimator of $\frac{1}{\theta}$ which attains the Cramer-Rao lower bound
Answer: The Cramer-Rao lower bound, based on $X_{1}, X_{2}, \ldots, X_{n},$ for the estimand $\theta^{3}$ is $\frac{9 \theta^{6}}{n}$; there exists an unbiased estimator of $\frac{1}{\theta}$ which attains the Cramer-Rao lower bound
Problem 8
Let $f: \mathbb{R} \rightarrow \mathbb{R}$ be a twice differentiable function. Then, which of the following statements is/are necessarily TRUE?
Options-
$f^{\prime \prime}$ is continuous
$f^{\prime \prime}$ is bounded on (0,1)
If $f^{\prime}(0)=f^{\prime}(1),$ then $f^{\prime \prime}(x)=0$ has a solution in (0,1)
$f^{\prime}$ is bounded on [8,10]
Answer: If $f^{\prime}(0)=f^{\prime}(1),$ then $f^{\prime \prime}(x)=0$ has a solution in $(0,1)$; $f^{\prime}$ is bounded on $[8,10]$
Problem 9
Let $A$ be a $3 \times 3$ real matrix such that $A \neq I_{3}$ and the sum of the entries in each row of $A$ is $1$. Then which of the following statements is/are necessarily TRUE?
Options -
The characteristic polynomial, $p(\lambda),$ of $A+2 A^{2}+A^{3}$ has $(\lambda-4)$ as a factor
$A-I_{3}$ is an invertible matrix
$A$ cannot be an orthogonal matrix
The set $\{\underline{x} \in \mathbb{R}^{3}:\left(A-I_{3}\right) \underline{x}=\underline{0}\}$ has at least two elements $(\underline{x}$ is a column vector)
Answer: The characteristic polynomial, $p(\lambda),$ of $A+2 A^{2}+A^{3}$ has $(\lambda-4)$ as a factor
We are really happy with the performance of our students, and so we have started naming the toppers of the IIT JAM Stat Mock Tests. The toppers on this leaderboard are ranked according to their performance in the IIT JAM Stat Mock Tests.
Testing of Hypothesis | ISI MStat 2016 PSB Problem 9
This is a problem from the ISI MStat Entrance Examination, 2016, involving the basic idea of the Type 1 error in Testing of Hypothesis, but focusing on the fundamental relationship between the Exponential Distribution and the Geometric Distribution.
The Problem:
Suppose \(X_{1}, X_{2}, \ldots, X_{n}\) is a random sample from an exponential distribution with mean \(\lambda\).
Assume that the observed data is available on \(\left[X_{1}\right], \ldots,\left[X_{n}\right]\), instead of \(X_{1}, \ldots, X_{n},\) where \([x]\) denotes the largest integer less than or equal to \(x\).
Consider a test for \(H_{0}: \lambda=1\) vs \(H_{1}: \lambda>1\) which rejects \(H_{0}\) when \(\sum_{i=1}^{n}\left[X_{i}\right]>c_{n} .\)
Given \(\alpha \in(0,1),\) obtain values of \(c_{n}\) such that the size of the test converges to \(\alpha\) as \(n \rightarrow \infty\).
Prerequisites:
(a) Testing of Hypothesis
(b)Type 1 Error
(c) Exponential Distribution
(d) Relationship of Exponential Distribution and Geometric Distribution
(e) Central Limit Theorem
Solution:
Proof:
\(Y\) is clearly discrete, taking values in the set of non-negative integers, due to the flooring. Then, for any integer \(n \geq 0\) we have \( P(Y=n)=P(X \in[a n, a(n+1))) =\int_{a n}^{a(n+1)} \lambda \mathrm{e}^{-\lambda x} d x=(1-p)^{n} p, \) where \(p=1-e^{-\lambda a} \in(0,1),\) as \(\lambda>0\) and \(a>0\).
Testing of Hypothesis
\(H_{0}: \lambda=1\) vs \(H_{1}: \lambda>1\)
We reject \(H_{0}\) when \(\sum_{i=1}^{n}\left[X_{i}\right]>c_{n} .\)
Here, the size of the test, i.e., the Type 1 error (for a simple hypothesis), is \( \alpha_n = P(S_n > c_{n} \mid \lambda=1)\), where \(S_n = \sum_{i=1}^{n}\left[X_{i}\right]\).
We want to select \(c_n\) such that \(\alpha_n \to \alpha\).
\(S_n\) ~ NBinom(\(n,p\)), where \( p = 1-e^{-1} \) under \(H_0\).
Now, \(\frac{\sqrt{n}(\frac{S_n}{n} - \frac{1-p}{p})}{\sqrt{\frac{1-p}{p^2}}} \rightarrow Z \sim N(0,1)\) by the Central Limit Theorem, since each \([X_i]\) is Geometric on \(\{0,1,2,\ldots\}\) with mean \(\frac{1-p}{p}\) and variance \(\frac{1-p}{p^2}\).
We can solve this for \(c_n\), where \( p = 1-e^{-1} \).
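Putting the pieces together, here is a small Python sketch (our own illustration) of the resulting cutoff. Each \([X_i]\) is Geometric on \(\{0,1,2,\ldots\}\) with success probability \(p\), so under \(H_0\) the sum has mean \(n\frac{1-p}{p}\) and variance \(n\frac{1-p}{p^{2}}\), and the normal approximation gives \(c_n \approx n\frac{1-p}{p} + z_{\alpha}\sqrt{n\frac{1-p}{p^{2}}}\), where \(z_{\alpha}\) is the upper \(\alpha\)-point of \(N(0,1)\):

```python
import math
from statistics import NormalDist

def c_n(n, alpha, lam=1.0):
    """Approximate cutoff so that P(S_n > c_n | lambda) -> alpha,
    where S_n is a sum of n iid Geometric(p) variables on {0, 1, 2, ...}
    with p = 1 - exp(-lambda)."""
    p = 1 - math.exp(-lam)
    mu = (1 - p) / p                 # mean of each floor(X_i)
    var = (1 - p) / p**2             # variance of each floor(X_i)
    z = NormalDist().inv_cdf(1 - alpha)
    return n * mu + z * math.sqrt(n * var)

print(c_n(100, 0.05))  # cutoff for n = 100, alpha = 0.05
```

For example, with \(n=100\) and \(\alpha=0.05\) the cutoff comes out near 74.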
Food for Thought
If \(X \sim\) Exponential(\(\lambda\)), then what is the distribution of \(\{X\}\), the fractional part of \(X\)? This question is crucial in getting back the Exponential Distribution from the Geometric Distribution.
In other words, the food for thought asks you how to recover the Exponential Distribution from the Geometric Distribution.
Stay Tuned. Stay Blessed! See you in the next post.