
## IIT JAM Stat MS 2021 Problem Solving Crash Course

The IIT JAM Statistics Entrance Exam 2021 is just around the corner. Cheenta Statistics Department has announced a Crash Course for aspirants of the IIT JAM Stat Masters Entrance 2021. This course includes:

• Two live classes per week – Tuesday and Wednesday
• Quality group discussion
• Recorded sessions (in case you miss a class)
• 12–15+ hours of classes
• Past year IIT JAM Stat problems
• Full review of techniques
• Class notes

This course is available till the IIT JAM Statistics Entrance Exam 2021 and costs Rs. 2021.

## ISI MStat Entrance 2020 Problems and Solutions

This post contains Indian Statistical Institute, ISI MStat Entrance 2020 problems and solutions. Try solving them yourself before looking at the solutions.

## Subjective Paper – ISI MStat Entrance 2020 Problems and Solutions

• Let $f(x)=x^{2}-2 x+2$. Let $L_{1}$ and $L_{2}$ be the tangents to its graph at $x=0$ and $x=2$ respectively. Find the area of the region enclosed by the graph of $f$ and the two lines $L_{1}$ and $L_{2}$.

Solution
• Find the number of $3 \times 3$ matrices $A$ such that the entries of $A$ belong to the set $\mathbb{Z}$ of all integers, and such that the trace of $A^{t} A$ is 6. ($A^{t}$ denotes the transpose of the matrix $A$.)

Solution
• Consider $n$ independent and identically distributed positive random variables $X_{1}, X_{2}, \ldots, X_{n}$. Suppose $S$ is a fixed subset of $\{1,2, \ldots, n\}$ consisting of $k$ distinct elements, where $1 \leq k < n$.
(a) Compute $\mathbb{E}\left[\frac{\sum_{i \in S} X_{i}}{\sum_{i=1}^{n} X_{i}}\right]$

(b) Assume that the $X_{i}$'s have mean $\mu$ and variance $\sigma^{2}$, $0<\sigma^{2}<\infty$. If $j \notin S$, show that the correlation between $\left(\sum_{i \in S} X_{i}\right) X_{j}$ and $\sum_{i \in S} X_{i}$ lies between $-\frac{1}{\sqrt{k+1}}$ and $\frac{1}{\sqrt{k+1}}$.

Solution
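Part (a) lends itself to a quick Monte Carlo sanity check. The sketch below is illustrative code, not the official solution: it assumes the symmetry answer $\frac{k}{n}$ as the value to compare against, and the choice of Exponential(1) for the $X_i$ and the function name are arbitrary.

```python
import random

# Monte Carlo sanity check for part (a). By symmetry each E[X_i / sum] should
# be equal, suggesting E[sum_{i in S} X_i / sum_{i=1}^n X_i] = k/n; we test
# that guess empirically with iid Exponential(1) variables.
def estimate_ratio(n, k, trials=200_000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        xs = [rng.expovariate(1.0) for _ in range(n)]  # iid positive RVs
        total += sum(xs[:k]) / sum(xs)  # take S = {1, ..., k} w.l.o.g.
    return total / trials

print(estimate_ratio(n=5, k=2))  # close to k/n = 0.4
```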
• Let $X_{1}, X_{2}, \ldots, X_{n}$ be independent and identically distributed random variables. Let $S_{n}=X_{1}+\cdots+X_{n}$. For each of the following statements, determine whether they are true or false. Give reasons in each case.

(a) If $S_{n} \sim \mathrm{Exp}$ with mean $n$, then each $X_{i} \sim \mathrm{Exp}$ with mean 1.

(b) If $S_{n} \sim \mathrm{Bin}(n k, p)$, then each $X_{i} \sim \mathrm{Bin}(k, p)$.

Solution
• Let $U_{1}, U_{2}, \ldots, U_{n}$ be independent and identically distributed random variables, each having a uniform distribution on $(0,1)$. Let $X=\min \{U_{1}, U_{2}, \ldots, U_{n}\}$ and $Y=\max \{U_{1}, U_{2}, \ldots, U_{n}\}$.

Evaluate $\mathbb{E}[X \mid Y=y]$ and $\mathbb{E}[Y \mid X=x]$.

Solution
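A Monte Carlo sketch for $\mathbb{E}[X \mid Y=y]$: it relies on the standard order-statistics fact that, conditional on the maximum being $y$, the remaining $n-1$ observations are iid Uniform$(0, y)$. The comparison value $\frac{y}{n}$ is this sketch's own conjecture, not quoted from the official solution.

```python
import random

# Conditional on max(U_1, ..., U_n) = y, the other n-1 values are iid
# Uniform(0, y); the minimum of the whole sample is then the minimum of
# those n-1 values. We average that minimum to estimate E[X | Y = y].
def estimate_cond_min(n, y, trials=200_000, seed=1):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        others = [rng.uniform(0.0, y) for _ in range(n - 1)]
        total += min(others)
    return total / trials

print(estimate_cond_min(n=5, y=0.8))  # compare with y/n = 0.16
```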
• Suppose individuals are classified into three categories $C_{1}, C_{2}$ and $C_{3}$. Let $p^{2}, (1-p)^{2}$ and $2 p(1-p)$ be the respective population proportions, where $p \in(0,1)$. A random sample of $N$ individuals is selected from the population, and the category of each selected individual is recorded.

For $i=1,2,3$, let $X_{i}$ denote the number of individuals in the sample belonging to category $C_{i}$. Define $U=X_{1}+\frac{X_{3}}{2}$.

(a) Is $U$ sufficient for $p$? Justify your answer.

(b) Show that the mean squared error of $\frac{U}{N}$ is $\frac{p(1-p)}{2 N}$.

Solution
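Part (b)'s claim can be checked by simulation. The sketch below is illustrative (with the multinomial draw written out by hand): it estimates the mean squared error of $\frac{U}{N}$ and compares it with $\frac{p(1-p)}{2N}$.

```python
import random

# Estimate MSE(U/N), where U = X1 + X3/2 and (X1, X2, X3) counts a sample of
# N individuals falling in categories with probabilities p^2, (1-p)^2, 2p(1-p).
def mse_of_U_over_N(p, N, trials=40_000, seed=2):
    rng = random.Random(seed)
    p1, p2 = p * p, (1 - p) ** 2
    total = 0.0
    for _ in range(trials):
        x1 = x3 = 0
        for _ in range(N):  # classify each individual independently
            u = rng.random()
            if u < p1:
                x1 += 1          # category C1
            elif u >= p1 + p2:
                x3 += 1          # category C3
        total += ((x1 + x3 / 2) / N - p) ** 2
    return total / trials

p, N = 0.3, 50
print(mse_of_U_over_N(p, N), p * (1 - p) / (2 * N))  # the two agree closely
```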
• Consider the following model: $y_{i}=\beta x_{i}+\varepsilon_{i} x_{i}, \quad i=1,2, \ldots, n$, where $y_{i}, i=1,2, \ldots, n$ are observed; $x_{i}, i=1,2, \ldots, n$ are known positive constants and $\beta$ is an unknown parameter. The errors $\varepsilon_{1}, \varepsilon_{2}, \ldots, \varepsilon_{n}$ are independent and identically distributed random variables having the probability density function $f(u)=\frac{1}{2 \lambda} \exp \left(-\frac{|u|}{\lambda}\right), \quad-\infty<u<\infty$ and $\lambda$ is an unknown parameter.

(a) Find the least squares estimator of $\beta$.

(b) Find the maximum likelihood estimator of $\beta$.

Solution
• Assume that $X_{1}, \ldots, X_{n}$ is a random sample from $N(\mu, 1)$, with $\mu \in \mathbb{R}$. We want to test $H_{0}: \mu=0$ against $H_{1}: \mu=1$. For a fixed integer $m \in \{1, \ldots, n\}$, the following statistics are defined:

\begin{aligned}
T_{1} &= \frac{\left(X_{1}+\ldots+X_{m}\right)}{m} \\
T_{2} &= \frac{\left(X_{2}+\ldots+X_{m+1}\right)}{m} \\
\vdots &=\vdots \\
T_{n-m+1} &= \frac{\left(X_{n-m+1}+\ldots+X_{n}\right)}{m}
\end{aligned}

Fix $\alpha \in(0,1)$. Consider the test

Reject $H_{0}$ if $\max \{T_{i}: 1 \leq i \leq n-m+1\}>c_{m, \alpha}$

Find a choice of $c_{m, \alpha} \in \mathbb{R}$ in terms of the standard normal distribution function $\Phi$ that ensures that the size of the test is at most $\alpha$.

Solution
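One natural candidate for $c_{m,\alpha}$ comes from the union bound: under $H_0$ each $T_i \sim N(0, 1/m)$, so taking $c_{m, \alpha}=\frac{1}{\sqrt{m}} \Phi^{-1}\left(1-\frac{\alpha}{n-m+1}\right)$ makes the size at most $\alpha$ by Bonferroni. This is a hedged sketch of that choice, not necessarily the intended answer; the simulation below checks the size empirically.

```python
import random
from statistics import NormalDist

# Union-bound (Bonferroni) choice: under H0 each T_i ~ N(0, 1/m), so with
# c = inv_Phi(1 - alpha/(n - m + 1)) / sqrt(m) the size is at most alpha.
def rejection_rate(n, m, alpha, trials=20_000, seed=3):
    rng = random.Random(seed)
    c = NormalDist().inv_cdf(1 - alpha / (n - m + 1)) / m ** 0.5
    rejections = 0
    for _ in range(trials):
        xs = [rng.gauss(0.0, 1.0) for _ in range(n)]       # sample under H0
        ts = [sum(xs[i:i + m]) / m for i in range(n - m + 1)]
        rejections += max(ts) > c
    return rejections / trials

print(rejection_rate(n=20, m=5, alpha=0.05))  # stays at or below about 0.05
```

The union bound is conservative here because overlapping windows make the $T_i$ positively correlated, so the empirical size typically comes out below $\alpha$.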
• A finite population has $N$ units, with $x_{i}$ being the value associated with the $i$th unit, $i=1,2, \ldots, N$. Let $\bar{x}_{N}$ be the population mean. A statistician carries out the following experiment.

Step 1: Draw an SRSWOR of size $n$ $(1 < n < N)$, call the selected sample $S_{1}$, and denote the sample mean by $\bar{X}_{n}$.

Step 2: Draw an SRSWR of size $m$ from $S_{1}$. The $x$-values of the sampled units are denoted by $\{Y_{1}, \ldots, Y_{m}\}$.

An estimator of the population mean is defined as,

$\widehat{T}_{m}=\frac{1}{m} \sum_{i=1}^{m} Y_{i}$

(a) Show that $\widehat{T}_{m}$ is an unbiased estimator of the population mean.

(b) Which of the following has lower variance: $\widehat{T}_{m}$ or $\bar{X}_{n}$?

Solution

## Objective Paper

 1. C 2. D 3. A 4. B 5. A 6. B 7. C 8. A 9. C 10. A 11. C 12. D 13. C 14. B 15. B 16. C 17. D 18. B 19. B 20. C 21. C 22. D 23. A 24. B 25. D 26. B 27. D 28. D 29. B 30. C

Watch videos related to the ISI MStat Problems here.

## How to Roll a Die by Tossing a Coin? Cheenta Statistics Department

How can you roll a die by tossing a coin? Can you use your probability knowledge? Use your conditioning skills.

Suppose you have gone on a picnic with your friends. You have planned to play the physical version of the Snakes and Ladders game, only to find that you have lost your die.

Shit just got real!

Now, you have an unbiased coin in your wallet or purse. And you know probability.

### Aapna Time Aayega

starts playing in the background. :p

## Can you simulate the die from the coin?

Of course, you know chances better than others. :3

Take a coin.

Toss it 3 times. Record the outcomes.

HHH = Number 1

HHT = Number 2

HTH = Number 3

HTT = Number 4

THH = Number 5

THT = Number 6

TTH = Reject it, don't count the toss, and toss again

TTT = Reject it, don't count the toss, and toss again

Voilà, done!

What is the probability of HHH in this experiment?

Let $X$ denote the outcome of three tosses in the restricted experiment shown above.

How is this experiment different from the actual experiment?

This experiment conditions on the event A = {HHH, HHT, HTH, HTT, THH, THT}.

$P(\text{Number } 1) = P(X = HHH \mid X \in A) = \frac{P(X = HHH)}{P(X \in A)} = \frac{1/8}{6/8} = \frac{1}{6}$
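The whole procedure is a rejection sampler, and it fits in a few lines of Python (a sketch; the function names are illustrative):

```python
import random

# Rejection sampling: toss the coin three times; if the pattern is TTH or TTT,
# throw the round away and start over. Each accepted pattern is equally likely,
# so every face comes up with probability exactly 1/6.
ACCEPTED = ("HHH", "HHT", "HTH", "HTT", "THH", "THT")

def roll_die(rng):
    while True:
        tosses = "".join(rng.choice("HT") for _ in range(3))
        if tosses in ACCEPTED:            # reject TTH and TTT, toss again
            return ACCEPTED.index(tosses) + 1

rng = random.Random(4)
rolls = [roll_die(rng) for _ in range(60_000)]
for face in range(1, 7):
    print(face, rolls.count(face) / len(rolls))  # each close to 1/6
```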

Beautiful right?

Can you generalize this idea?

## Food for thought

• Give an algorithm to simulate any conditional probability.
• Give an algorithm to simulate any event with probability $\frac{m}{2^k}$, where $m \leq 2^k$.
• Give an algorithm to simulate any event with probability $\frac{m}{n}$, where $m \leq n \leq 2^k$ using conditional probability.
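For the last bullet, here is one possible sketch (an illustration under stated assumptions, not the only algorithm): use $k$ tosses to build a uniform integer in $\{0, \ldots, 2^k-1\}$, condition on the value being less than $n$, and declare the event to occur when the value is less than $m$.

```python
import random

# Simulate an event of probability m/n (m <= n <= 2^k) with a fair coin:
# k tosses give a uniform integer in {0, ..., 2^k - 1}; rejecting values >= n
# leaves it uniform on {0, ..., n - 1}, so P(value < m | accepted) = m/n.
def event_m_over_n(m, n, k, rng):
    while True:
        value = sum(rng.choice((0, 1)) << i for i in range(k))
        if value < n:            # accept: value is uniform on {0, ..., n-1}
            return value < m     # event occurs with conditional probability m/n

rng = random.Random(5)
hits = sum(event_m_over_n(m=3, n=5, k=3, rng=rng) for _ in range(100_000))
print(hits / 100_000)  # close to 3/5 = 0.6
```

This is exactly the dice-from-coin trick again: the coin case above is $m$ out of $n = 6$ patterns inside $2^3 = 8$ equally likely outcomes.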

## Watch the Video here:

Books for ISI MStat Entrance Exam

How to Prepare for ISI MStat Entrance Exam

ISI MStat and IIT JAM Stat Problems and Solutions

Cheenta Statistics Program for ISI MStat and IIT JAM Stat

Simple Linear Regression – Playlist on YouTube


## ISI MStat PSB 2015 Problem 2 | Vector Space & its Dimension

This is a beautiful problem from ISI MStat PSB 2015 Problem 2. We provide a detailed solution with the prerequisites mentioned explicitly.

## Problem- ISI MStat PSB 2015 Problem 2

For any $n \times n$ matrix $A=\left(\left(a_{i j}\right)\right),$ consider the following three properties:

1. $a_{i j}$ is real valued for all $i, j$ and $A$ is upper triangular.
2. $\sum_{j=1}^{n} a_{i j}=0,$ for all $1 \leq i \leq n$
3. $\sum_{i=1}^{n} a_{i j}=0,$ for all $1 \leq j \leq n$
Define the following set of matrices:
$c_n = \{ A : A$ is $n \times n$ and satisfies (1), (2) and (3) above $\}$

(a) Show that $c_n$ is a vector space for any $n \geq 1$ .

(b) Find the dimension of $c_n$ when $n = 2$ and $n = 3$.

## Prerequisites

• Upper triangular matrix
• Subspace of a vector space
• Dimension of a vector space

## Solution

(a) We need to show that $c_n$ is a vector space for any $n \geq 1$.

It suffices to show that $c_n$ is a subspace of the vector space of $n \times n$ real matrices with the usual matrix addition and scalar multiplication.

First, $c_n$ is non-empty: putting $a_{i j} =0$ for all $i, j$, the matrix $A= \left(\left(a_{i j}\right)\right)$ satisfies all the properties (1), (2) and (3).

So, $\begin{pmatrix} 0 & 0 &\dots & 0 \\ 0 & 0 &\dots & 0 \\ \vdots & \vdots & & \vdots \\ 0 & 0 &\dots & 0 \end{pmatrix} \in c_n$

We shall show that (i) for all $A, B \in c_n$, $A + B \in c_n$, and

(ii) for all $A \in c_n$ and all $p_1 \in \mathbb{R}$, $p_1 A \in c_n$ (closure must hold for every scalar; $p_1 = 0$ simply gives the zero matrix, which we have already seen lies in $c_n$).

For (i), take any $A=((a_{i j})), B=((b_{i j})) \in c_n$.

Let $D = A + B$; if $D=((d_{i j}))$ then $d_{ij}= a_{i j} + b_{i j}$.

Now we check whether $D$ satisfies the three properties (1), (2) and (3).

Below the diagonal, $d_{ij} = a_{i j} + b_{i j} = 0$, since $a_{i j}=0$ and $b_{i j}=0$ there.

Hence, as $A$ and $B$ are upper triangular matrices, $D$ is also upper triangular, so it satisfies property (1).

Again, $\sum_{j=1}^{n} a_{i j}=0$ for all $1 \leq i \leq n$ and $\sum_{j=1}^{n} b_{i j}=0$ for all $1 \leq i \leq n$,

so $\sum_{j=1}^{n} d_{i j}=0$ for all $1 \leq i \leq n$, as $d_{ij}=a_{i j} + b_{i j}$.

Hence $D$ satisfies property (2).

Similarly, $\sum_{i=1}^{n} a_{i j}=0$ for all $1 \leq j \leq n$ and $\sum_{i=1}^{n} b_{i j}=0$ for all $1 \leq j \leq n$, so $\sum_{i=1}^{n} d_{i j}=0$ for all $1 \leq j \leq n$.

Hence $D$ satisfies property (3).

For (ii), take any $A=((a_{i j})) \in c_n$ and any $p_1 \in \mathbb{R}$.

Let $K=p_1 A$; if $K=((k_{i j}))$ then $k_{ij}= p_1 a_{i j}$.

Then $k_{ij} =0$ whenever $a_{i j}=0$.

Hence, as $A$ is an upper triangular matrix, $K$ is also upper triangular, so it satisfies property (1).

Again, $\sum_{j=1}^{n} a_{i j}=0$ for all $1 \leq i \leq n$, so $\sum_{j=1}^{n} k_{i j}= p_1 \sum_{j=1}^{n} a_{i j}=0$ for all $1 \leq i \leq n$.

Hence $K$ satisfies property (2).

Similarly, $\sum_{i=1}^{n} a_{i j}=0$ for all $1 \leq j \leq n$ gives $\sum_{i=1}^{n} k_{i j}=0$ for all $1 \leq j \leq n$.

Hence $K$ satisfies property (3).

So $c_n$ contains the zero matrix and is closed under vector addition and scalar multiplication.

Therefore $c_n$ is a subspace of the vector space of $n \times n$ real matrices with the usual matrix addition and scalar multiplication, and hence a vector space. This completes part (a).

(b) For $n=2$:

If $A=((a_{i j})) \in c_2$, then $A= \begin{pmatrix} a_{11} & a_{12} \\ 0 & a_{22} \end{pmatrix}$ by property (1); $a_{11}+a_{12}=0$ and $a_{22}=0$ (I) by property (2); and $a_{11}=0$ and $a_{12}+a_{22}=0$ (II) by property (3).

Now solving (I) and (II) we get $A= \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}$

Giving $c_2 = \{ \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix} \}$, hence $\dim(c_2)=0$.

For $n=3$:

If $A=((a_{i j})) \in c_3$, then $A= \begin{pmatrix} a_{11} & a_{12} & a_{13} \\ 0 & a_{22}& a_{23} \\ 0 & 0& a_{33} \end{pmatrix}$ by property (1); $a_{11}+a_{12}+ a_{13}=0$, $a_{22}+a_{23}=0$, $a_{33}=0$ (I) by property (2); and $a_{11}=0$, $a_{12}+a_{22}=0$, $a_{13}+a_{23}+a_{33}=0$ (II) by property (3).

Now solving (I) and (II): $a_{11}=0$ and $a_{33}=0$; setting $a_{12}=t$, the remaining equations give $a_{13}=-t$, $a_{22}=-t$ and $a_{23}=t$. Then

$A= t \begin{pmatrix} 0 & 1 & -1 \\ 0 & -1 & 1 \\ 0 & 0& 0\end{pmatrix}$, $t \in \mathbb{R}$

Giving $c_3 = \{ t \begin{pmatrix} 0 & 1 & -1 \\ 0 & -1 & 1 \\ 0 & 0& 0\end{pmatrix} : t \in \mathbb{R} \}$.

Hence $\dim(c_3)=1$.
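As a cross-check (a small script, not part of the exam solution), one can verify directly that the matrix $\begin{pmatrix} 0 & 1 & -1 \\ 0 & -1 & 1 \\ 0 & 0 & 0 \end{pmatrix}$ spanning $c_3$ satisfies all three properties:

```python
# Direct check of properties (1)-(3) for the matrix that spans c_3.
A = [[0, 1, -1],
     [0, -1, 1],
     [0, 0, 0]]

# (1) upper triangular: every entry below the diagonal is zero
assert all(A[i][j] == 0 for i in range(3) for j in range(i))
# (2) every row sums to zero
assert all(sum(row) == 0 for row in A)
# (3) every column sums to zero
assert all(sum(A[i][j] for i in range(3)) == 0 for j in range(3))
print("properties (1), (2) and (3) all hold")
```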


## ISI MStat 2016 Problem 1 | Area bounded by the curves | PSB Sample

This is a beautiful problem from ISI MStat 2016 Problem 1, PSB Sample, based on area bounded by the curves. We provide a detailed solution with the prerequisites mentioned explicitly.

## Problem- ISI MStat 2016 Problem 1

In the diagram below, $L(x)$ is a straight line that intersects the graph of a polynomial $P(x)$ of degree 2 at the points $A=(-1,0)$ and $B=(5,12) .$ The area of the shaded region is 36 square units. Obtain the expression for $P(x)$.

## Prerequisites

• Area bounded by the curve
• Polynomials of degree 2
• Area of a triangle

## Solution

Let $P(x)=ax^2 +bx + c$, since $P(x)$ is given to be a polynomial of degree 2.

Now from the figure we can see that $L(x)$ intersects $P(x)$ at the points $A=(-1,0)$ and $B=(5,12)$.

Hence we have $P(-1)=0$ and $P(5)=12$, which give

$a-b+c=0$ (1) and $25a+5b +c =12$ (2)

From Fig. 1, the shaded region is the region enclosed between the graph of $P(x)$ and the line $L(x)$ over $[-1,5]$. The region below $L(x)$ and above the $x$-axis between $x=-1$ and $x=5$ is a triangle with base $5+1=6$ and height $12$, so

Area of the shaded region $= \int^{5}_{-1} P(x)\,dx - \frac{1}{2} \times (5+1) \times 12 = \int^{5}_{-1} P(x)\,dx - 36$

Again it is given that area of the shaded region is 36 square units.

So, $\int^{5}_{-1} P(x)\,dx - 36 = 36 \Rightarrow \int^{5}_{-1} P(x)\,dx = 2 \times 36 = 72$

$\int^{5}_{-1} (ax^2+bx+c)\,dx = 2 \times 36$. Carrying out the integration, $\frac{126a}{3} + \frac{24b}{2} + 6c = 72$, i.e. $42a + 12b + 6c = 72$, which simplifies to

$7a + 2b +c =12$ (3)

Now we have three equations in three unknowns:

$a-b+c=0$

$25a+5b +c =12$

$7a + 2b +c =12$

Solving these three equations by elimination and substitution, we get

$a=-1, \quad b=6, \quad c=7$

Therefore, the expression for $P(x)$ is $P(x)= -x^2+6x+7$.
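A few lines of Python (an optional cross-check, not part of the solution) confirm that this $P(x)$ satisfies all three conditions derived above:

```python
# Verify P(-1) = 0, P(5) = 12, and that the integral of P over [-1, 5] is 72.
def P(x):
    return -x * x + 6 * x + 7

def antiderivative(x):           # antiderivative of P: -x^3/3 + 3x^2 + 7x
    return -x ** 3 / 3 + 3 * x ** 2 + 7 * x

assert P(-1) == 0                # passes through A = (-1, 0)
assert P(5) == 12                # passes through B = (5, 12)
assert abs((antiderivative(5) - antiderivative(-1)) - 72) < 1e-9  # 2 * 36
print("P(x) = -x^2 + 6x + 7 satisfies all three conditions")
```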


## Kernel of a linear transformation | ISI MStat 2016 Problem 4 | PSB Sample

This is a beautiful problem from ISI MStat 2016 Problem 4 PSB (sample) based on Vector space. It uses several concepts to solve it. We provide a detailed solution with prerequisites mentioned explicitly.

## Problem– ISI MStat 2016 Problem 4

For each $c \in \mathbb{R},$ define a function $T_{c}: \mathbb{R}^{4} \rightarrow \mathbb{R}^{4}$ by
$T_{c}\left(x_{1}, x_{2}, x_{3}, x_{4}\right):=\left((1+c) x_{1}, x_{2}+c x_{3}, x_{3}+c x_{2},(1+c) x_{4}\right)$
For every $c \in \mathbb{R},$ find the dimension of the null space of $T_{c}$.

## Prerequisites

• kernel or Null space of a linear transformation
• Dimension
• Spanning & Linearly Independent vectors of a vector space

## Solution

Here we have to find the kernel, or null space, of $T_{c}$, i.e. $\{ \vec{x} : T_{c}(\vec{x}) = \vec{0} \}$.

$T_{c}$ is defined as $T_{c}\left(x_{1}, x_{2}, x_{3}, x_{4}\right):=\left((1+c) x_{1}, x_{2}+c x_{3}, x_{3}+c x_{2},(1+c) x_{4}\right)$

So, $T_{c} ( \vec{x} )=\vec{0} \Rightarrow ((1+c) x_{1}, x_{2}+c x_{3}, x_{3}+c x_{2},(1+c) x_{4}) = (0,0,0,0)$ , which gives

(i) $(1+c)x_{1}=0 \Rightarrow x_{1} =0$ if $c \ne -1$

(ii) $(1+c)x_{4}=0 \Rightarrow x_{4} =0$ if $c \ne -1$

(iii) $x_{2}+c x_{3} =0 \Rightarrow x_{2}=-c x_{3}$

(iv) $x_{3}+c x_{2} =0 \Rightarrow x_{3}=-c x_{2}$

(iii) & (iv)$\Rightarrow x_{2}=-c x_{3}=c^2 x_{2} \Rightarrow x_{2} (1-c^2) =0 \Rightarrow x_{2}=0$ if $c \ne \pm 1$

And if $x_{2}=0$, then (iv) gives $x_{3}=-c x_{2}=0$.

Now, for the different values of $c$, using (i), (ii), (iii) and (iv) we find the null space $N(T_{c})$ as follows:

$N(T_{c}) = \begin{cases} \{(x_{1}, x_{2}, x_{2}, x_{4}) : x_{1}, x_{2}, x_{4} \in \mathbb{R}\} & c=-1 \\ \{(0,0,0,0)\} & c=0 \\ \{(0, x_{2}, -x_{2}, 0) : x_{2} \in \mathbb{R}\} & c=1 \\ \{(0,0,0,0)\} & c \ne -1, 0, 1 \end{cases}$

Therefore, for different values of $c$ we get different dimensions of $N(T_{c})$, as follows.

If $c=-1$ then $N(T_{c}) = \{(x_{1}, x_{2}, x_{2}, x_{4})\} = \{x_{1}(1,0,0,0) + x_{2}(0,1,1,0) + x_{4}(0,0,0,1)\}$. Hence the vectors $(1,0,0,0)$, $(0,1,1,0)$, $(0,0,0,1)$ span $N(T_{c})$ and are linearly independent. Thus in this case the dimension of the null space $N(T_{c})$ is 3.

If $c=0$ then $N(T_{c}) = \{(0,0,0,0)\}$. Thus in this case the dimension of the null space $N(T_{c})$ is 0.

If $c=1$ then $N(T_{c}) = \{(0, x_{2}, -x_{2}, 0)\} = \{x_{2}(0,1,-1,0)\}$. Hence the vector $(0,1,-1,0)$ spans $N(T_{c})$ and is linearly independent. Thus in this case the dimension of the null space $N(T_{c})$ is 1.

Finally, if $c \ne -1, 0, 1$ then $N(T_{c}) = \{(0,0,0,0)\}$. Thus in this case the dimension of the null space $N(T_{c})$ is 0.
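The case analysis can be cross-checked numerically (a sketch using NumPy's rank computation, not part of the original solution) via the matrix of $T_c$ in the standard basis and the rank-nullity theorem:

```python
import numpy as np

# dim null(T_c) = 4 - rank(M_c), where M_c is the matrix of T_c in the
# standard basis of R^4 (rank-nullity theorem).
def nullity(c):
    M = np.array([[1 + c, 0, 0, 0],
                  [0, 1, c, 0],
                  [0, c, 1, 0],
                  [0, 0, 0, 1 + c]], dtype=float)
    return 4 - np.linalg.matrix_rank(M)

for c in (-1.0, 0.0, 1.0, 2.0):
    print(c, nullity(c))  # dimensions 3, 0, 1, 0 respectively
```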