
ISI MStat 2016 Problem 5 | Order Statistics | PSB Sample

This is a beautiful problem based on order statistics, from the ISI MStat 2016 PSB sample paper (Problem 5). We provide a detailed solution with the prerequisites mentioned explicitly.

Problem: ISI MStat 2016 Problem 5

Let n \geq 2, and X_{1}, X_{2}, \ldots, X_{n} be independent and identically distributed Poisson (\lambda) random variables for some \lambda>0 . Let X_{(1)} \leq X_{(2)} \leq \cdots \leq X_{(n)} denote the corresponding order statistics.
(a) Show that \mathrm{P}\left(X_{(2)}=0\right) \geq 1-n\left(1-e^{-\lambda}\right)^{n-1}
(b) Evaluate the limit of \mathrm{P}\left(X_{(2)}>0\right) as the sample size n \rightarrow \infty .

Prerequisites

Order statistics: CDF and PMF of the k-th order statistic
Poisson distribution: PMF and CDF
Binomial theorem
Squeeze (sandwich) theorem for limits

Solution

(a) Given: n \geq 2, and X_{1}, X_{2}, \ldots, X_{n} are independent and identically distributed Poisson(\lambda) random variables for some \lambda > 0. Let X_{(1)} \leq X_{(2)} \leq \cdots \leq X_{(n)} denote the corresponding order statistics.

Let F(j) be the common CDF of X_{1}, X_{2}, \ldots, X_{n}, i.e. the CDF of Poisson(\lambda).

Then the PMF of the k-th order statistic X_{(k)} is

P(X_{(k)} = j) = F_{k}(j) - F_{k}(j-0), where F_{k}(j) = P(X_{(k)} \le j) is the CDF of the k-th order statistic. Since X_{(k)} \le j if and only if at least k of the X_{i} are \le j, and the number of X_{i} that are \le j is Binomial(n, F(j)), we have

F_{k}(j) = \sum_{i=k}^{n} {n \choose i} (F(j))^{i} (1-F(j))^{n-i}

So, P(X_{(k)} = j) = \sum_{i=k}^{n} {n \choose i} \left[ (F(j))^{i} (1-F(j))^{n-i} - (F(j-0))^{i} (1-F(j-0))^{n-i} \right]
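To make this formula concrete, here is a minimal Python sketch (not part of the original solution; the function names poisson_cdf, order_stat_cdf and order_stat_pmf are illustrative) that evaluates F_{k}(j) and P(X_{(k)} = j) for i.i.d. Poisson(\lambda) samples using exactly the binomial sum above.

```python
from math import comb, exp

def poisson_cdf(j, lam):
    """CDF of Poisson(lam) at integer j, i.e. P(X <= j)."""
    if j < 0:
        return 0.0
    term, total = exp(-lam), exp(-lam)
    for i in range(1, j + 1):
        term *= lam / i          # e^{-lam} * lam^i / i!
        total += term
    return total

def order_stat_cdf(k, j, n, lam):
    """F_k(j) = P(X_(k) <= j) = sum_{i=k}^{n} C(n,i) F(j)^i (1 - F(j))^{n-i}."""
    F = poisson_cdf(j, lam)
    return sum(comb(n, i) * F**i * (1 - F)**(n - i) for i in range(k, n + 1))

def order_stat_pmf(k, j, n, lam):
    """P(X_(k) = j) = F_k(j) - F_k(j - 1) for integer-valued X."""
    return order_stat_cdf(k, j, n, lam) - order_stat_cdf(k, j - 1, n, lam)

# Example (illustrative values): P(X_(2) = 0) for n = 5, lambda = 1
print(order_stat_pmf(2, 0, 5, 1.0))
```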

Here we have to find P(X_{(2)} = 0) = \sum_{i=2}^{n} {n \choose i} \left[ (F(0))^{i} (1-F(0))^{n-i} - 0 \right],

since a Poisson random variable takes the values 0, 1, 2, \ldots, i.e. it takes every value < 0 with probability 0; that is why (F(j-0))^{i} (1-F(j-0))^{n-i} = 0 here for j = 0.

And F(0) = P(X_{1} \le 0) = P(X_{1} = 0) = e^{-\lambda} \frac{\lambda^{0}}{0!} = e^{-\lambda}, as X_{1} follows Poisson(\lambda).

So, (F(0))^{i} (1-F(0))^{n-i} = (e^{-\lambda})^{i} (1-e^{-\lambda})^{n-i}

Therefore, P(X_{(2)} = 0) = \sum_{i=2}^{n} {n \choose i} (e^{-\lambda})^{i} (1-e^{-\lambda})^{n-i}

= (e^{-\lambda} + 1 - e^{-\lambda})^{n} - {n \choose 0} (e^{-\lambda})^{0} (1-e^{-\lambda})^{n-0} - {n \choose 1} (e^{-\lambda})^{1} (1-e^{-\lambda})^{n-1} (completing the sum to i = 0, \ldots, n and subtracting the i = 0 and i = 1 terms; the completed sum equals 1 by the binomial theorem)

= 1 - (1-e^{-\lambda})^{n} - n e^{-\lambda} (1-e^{-\lambda})^{n-1}

= 1 - (1-e^{-\lambda})^{n-1} \left[ 1 - e^{-\lambda} + n e^{-\lambda} \right]

= 1 - (1-e^{-\lambda})^{n-1} \left[ 1 + (n-1) e^{-\lambda} \right] \ge 1 - n (1-e^{-\lambda})^{n-1}.

Since 1 + (n-1) e^{-\lambda} \le n \Longleftrightarrow e^{-\lambda} \le 1 \Longleftrightarrow e^{\lambda} \ge 1, which holds for n \ge 2 and \lambda > 0, the last inequality is true.

Hence, \mathrm{P}\left(X_{(2)}=0\right) \geq 1-n\left(1-e^{-\lambda}\right)^{n-1} (proved).
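As an optional sanity check (illustrative values n = 5 and \lambda = 1, not from the problem), one can simulate P(X_{(2)} = 0) and compare it with the closed form 1 - (1-e^{-\lambda})^{n-1}\left[ 1 + (n-1) e^{-\lambda} \right] derived above and with the lower bound 1 - n(1-e^{-\lambda})^{n-1}. A rough Python sketch:

```python
import random
from math import exp

def poisson_draw(rng, lam):
    """Sample one Poisson(lam) value by CDF inversion (fine for small lam)."""
    u, j, p, cdf = rng.random(), 0, exp(-lam), exp(-lam)
    while u > cdf:
        j += 1
        p *= lam / j
        cdf += p
    return j

def simulate_p_second_min_zero(n, lam, trials=100_000, seed=0):
    """Estimate P(X_(2) = 0): the second-smallest of n draws is 0
    exactly when at least two of the draws equal 0."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        zeros = sum(1 for _ in range(n) if poisson_draw(rng, lam) == 0)
        hits += zeros >= 2
    return hits / trials

n, lam = 5, 1.0
p = exp(-lam)
closed_form = 1 - (1 - p) ** (n - 1) * (1 + (n - 1) * p)
bound = 1 - n * (1 - p) ** (n - 1)
print(simulate_p_second_min_zero(n, lam), closed_form, bound)
```

The Monte Carlo estimate should sit close to the closed form and above the lower bound.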

(b) 0 \le P(X_{(2)} > 0) = 1 - P(X_{(2)} = 0) \le 1 - 1 + n\left(1-e^{-\lambda}\right)^{n-1} (using the inequality from part (a))

So, 0 \le P(X_{(2)} > 0) = 1 - P(X_{(2)} = 0) \le n\left(1-e^{-\lambda}\right)^{n-1} -----(1)

As 0 < 1 - e^{-\lambda} < 1 for \lambda > 0, we can write 1 - e^{-\lambda} = \frac{1}{a} for some a > 1. Hence \lim_{n\to\infty} n\left(1-e^{-\lambda}\right)^{n-1} = \lim_{n\to\infty} \frac{n}{a^{n-1}} = 0 (use L'Hôpital's rule, or note intuitively that as n tends to infinity the exponential function grows faster than any polynomial).

Now, taking the limit n \to \infty in (1), we get by the squeeze (or sandwich) theorem

\lim_{n\to\infty} P(X_{(2)} > 0) = 0
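For intuition, a short numerical check (with an assumed \lambda = 1, chosen only for illustration) shows how quickly the upper bound n\left(1-e^{-\lambda}\right)^{n-1} from (1) shrinks, which is what forces P(X_{(2)} > 0) \to 0:

```python
from math import exp

lam = 1.0  # assumed value, only for illustration
for n in (10, 50, 100, 500, 1000):
    # upper bound on P(X_(2) > 0) from inequality (1)
    bound = n * (1 - exp(-lam)) ** (n - 1)
    print(n, bound)
```

The printed bounds decrease rapidly towards 0, matching the limit obtained above.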
