ISI MStat PSB 2014 Problem 4 | The Machine's Failure

This is a very simple sample problem from ISI MStat PSB 2014 Problem 4. It is based on order statistics, but the subtleties are easy to miss if one is not comfortable with order statistics. Be careful!

Problem- ISI MStat PSB 2014 Problem 4


Consider a machine with three components whose times to failure are independently distributed as exponential random variables with mean \(\lambda\). The machine continues to work as long as at least two components work. Find the expected time to failure of the machine.

Prerequisites


Exponential Distribution

Order statistics

Basic counting

Solution:

As stated in the problem, let the three components of the machine be A, B and C, and let \(X_A, X_B\) and \(X_C\) be the survival times of the respective components. We are told that \(X_A, X_B\) and \(X_C\) follow \(exponential(\lambda) \) distributions, and clearly these random variables are i.i.d.

Now, here comes the trick! The machine stops when two or more of its components stop working. Here we sometimes get confused and start thinking combinatorially, forgetting that the basic counting of combinatorics lies in ordering! So let us order the lifetimes of the individual components: among \(X_A, X_B\) and \(X_C\) there exists an ordering, and writing them in order gives \(X_{(1)} \le X_{(2)} \le X_{(3)} \).

Now observe that after \(X_{(2)}\) units of time, the machine will stop!! (Are you sure? Think it over.)

So the expected time till the machine stops is just \(E(X_{(2)})\), but to find this we need the distribution of \(X_{(2)}\).

We have the pdf of \(X_{(2)}\) as, \(f_{(2)}(x)= \frac{3!}{(2-1)!(3-2)!} [P(X \le x)]^{2-1}[P(X>x)]^{3-2}f_X(x) \).

where \(f_X(x)\) is the pdf of the exponential distribution with mean \(\lambda\).

So, \(E(X_{(2)})= \int^{\infty}_0 x f_{(2)}(x)\,dx \), which turns out to be \(\frac{5\lambda}{6}\); I leave the verification to the reader, hence concluding my solution.
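The closed form is easy to sanity-check by simulation. Below is a minimal Monte Carlo sketch (the function name, seed, and parameter values are mine, not part of the problem): draw three i.i.d. exponential lifetimes with mean \(\lambda\) and record the second-smallest.

```python
import random

def machine_failure_time(lam, rng=random):
    # three i.i.d. exponential lifetimes with mean lam
    lifetimes = sorted(rng.expovariate(1.0 / lam) for _ in range(3))
    # the machine dies exactly when the second component fails
    return lifetimes[1]

random.seed(0)
lam, trials = 1.0, 200_000
est = sum(machine_failure_time(lam) for _ in range(trials)) / trials
# theory: E[X_(2)] = lam/3 + lam/2 = 5*lam/6
```

The theoretical value follows from the spacings of exponential order statistics: \(E(X_{(1)}) = \lambda/3\) and \(E(X_{(2)} - X_{(1)}) = \lambda/2\).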


Food For Thought

Now suppose you want to install an alarm system which will notify you some time before the machine wears out! What do you think your strategy should be? Given a strategy, you then replace the worn-out part of the machine within the time between the alarm ringing and the machine stopping, so that it keeps working uninterrupted. What is the expected time within which you must act?

Keep the machine running !!




ISI MStat 2016 Problem 5 | Order Statistics | PSB Sample

This is a beautiful problem from ISI MStat 2016 Problem 5 (sample) PSB based on order statistics. We provide a detailed solution with the prerequisites mentioned explicitly.

Problem- ISI MStat 2016 Problem 5

Let \( n \geq 2,\) and \( X_{1}, X_{2}, \ldots, X_{n}\) be independent and identically distributed Poisson \( (\lambda) \) random variables for some \( \lambda>0 .\) Let \( X_{(1)} \leq\) \( X_{(2)} \leq \cdots \leq X_{(n)}\) denote the corresponding order statistics.
(a) Show that \( \mathrm{P}\left(X_{(2)}=0\right) \geq 1-n\left(1-e^{-\lambda}\right)^{n-1}\)
(b) Evaluate the limit of \( \mathrm{P}\left(X_{(2)}>0\right)\) as the sample size \( n \rightarrow \infty \) .

Prerequisites


Poisson distribution

Order statistics

Squeeze (sandwich) theorem

Solution

(a) We are given that \( n \geq 2,\) and \( X_{1}, X_{2}, \ldots, X_{n}\) are independent and identically distributed Poisson \( (\lambda) \) random variables for some \( \lambda>0 \). Let \( X_{(1)} \leq X_{(2)} \leq \cdots \leq X_{(n)}\) denote the corresponding order statistics.

Let \(F(j)\) be the common CDF of \( X_{1}, X_{2}, \ldots, X_{n}\), i.e. the CDF of Poisson \( (\lambda) \).

Then the pmf of the k-th order statistic \( X_{(k)} \) is

\( P(X_{(k)} = j)= F_{k} (j)-F_{k} (j-0) \) , where \( F_{k} (j) =P(X_{(k)} \le j) \) is the CDF of the k-th order statistic and \( F_{k}(j-0) \) denotes its left-hand limit at \(j\).

\( F_{k} (j) = \sum_{i=k}^{n} \) \({n \choose i} \) \( {(F(j))}^{i} {(1-F(j))}^{n-i} \)

So, \( P(X_{(k)} = j) = \sum_{i=k}^{n} {n \choose i} [ {(F(j))}^{i} {(1-F(j))}^{n-i}-{(F(j-0))}^{i} {(1-F(j-0))}^{n-i}] \)
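Before specializing to \(k=2\) and \(j=0\), this pmf formula can be checked numerically. The sketch below (all names are mine; the Python standard library has no Poisson sampler, so a simple inversion sampler is included) compares the closed-form pmf of \(X_{(2)}\) for \(n=3\) against empirical frequencies.

```python
import math
import random
from collections import Counter

def order_stat_pmf(k, n, j, lam):
    """P(X_(k) = j) for i.i.d. Poisson(lam), via F_k(j) - F_k(j-1)."""
    def F(m):  # Poisson CDF at integer m
        if m < 0:
            return 0.0
        return sum(math.exp(-lam) * lam**i / math.factorial(i) for i in range(m + 1))
    def Fk(m):  # CDF of the k-th order statistic
        p = F(m)
        return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
    return Fk(j) - Fk(j - 1)

def poisson_sample(lam, rng):
    # inversion sampler for the Poisson distribution
    u, j, p = rng.random(), 0, math.exp(-lam)
    c = p
    while u > c:
        j += 1
        p *= lam / j
        c += p
    return j

rng = random.Random(1)
n, k, lam, trials = 3, 2, 1.5, 100_000
counts = Counter(sorted(poisson_sample(lam, rng) for _ in range(n))[k - 1]
                 for _ in range(trials))
# empirical frequencies counts[j]/trials should sit near order_stat_pmf(k, n, j, lam)
```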

Here we have to find \( P(X_{(2)} = 0)= \sum_{i=2}^{n} {n \choose i} [{(F(0))}^{i} {(1-F(0))}^{n-i} - 0] \),

since a Poisson random variable takes the values \(0, 1, 2, \ldots\), i.e. it takes any value \(< 0\) with probability \(0\). That is why \( {(F(j-0))}^{i} {(1-F(j-0))}^{n-i} = 0 \) here for \( j=0 \): we have \( F(0-0)=0 \), so every term with \( i \ge 2 \) vanishes.

And , \( F(0)=P(x \le 0) = P(X=0)={e}^{- \lambda} \frac{{\lambda}^{0}}{0!} ={e}^{- \lambda} \) , as X follows Poisson \( (\lambda) \) .

So, \( {(F(0))}^{i} {(1-F(0))}^{n-i}= {({e}^{- \lambda})}^{i} {(1-{e}^{- \lambda})}^{n-i} \)

Therefore , \( P(x_{(2)} = 0)= \sum_{i=2}^{n} {n \choose i} [{({e}^{- \lambda})}^{i} {(1-{e}^{- \lambda})}^{n-i} ] \)

\( = {({e}^{- \lambda}+1-{e}^{- \lambda})}^{n} - {n \choose 0}[{({e}^{- \lambda})}^{0} {(1-{e}^{- \lambda})}^{n-0} ]- {n \choose 1}[{({e}^{- \lambda})}^{1} {(1-{e}^{- \lambda})}^{n-1}] =1-{(1-{e}^{- \lambda})}^{n}- n {e}^{- \lambda} {(1-{e}^{- \lambda})}^{n-1} \)

\( =1-{(1-{e}^{- \lambda})}^{n-1}[1-{e}^{- \lambda} +n{e}^{- \lambda}] \)

\( =1-{(1-{e}^{- \lambda})}^{n-1}[1+(n-1){e}^{- \lambda}] \ge 1- n{(1-{e}^{- \lambda})}^{n-1} \) .

Since \( 1+(n-1){e}^{- \lambda} \le n \Longleftrightarrow {e}^{-\lambda} \le 1 \Longleftrightarrow {e}^{ \lambda} \ge 1 \), which holds for \( n \ge 2\) and \( \lambda >0 \), the inequality is true (proved).

Hence , \( \mathrm{P}\left(X_{(2)}=0\right) \geq 1-n\left(1-e^{-\lambda}\right)^{n-1}\) (proved )

(b) \( 0 \le P(x_{(2)} >0) =1-P(x_{(2)}= 0) \) \( \le 1-1+n\left(1-e^{-\lambda}\right)^{n-1}\) ( Using inequality in (a) )

So, \( 0 \le P(x_{(2)} >0) =1-P(x_{(2)}= 0) \) \( \le n\left(1-e^{-\lambda}\right)^{n-1}\) -----(1)

As \( 0< 1-{e}^{- \lambda} <1\) for \( \lambda >0 \), we may write \( 1-{e}^{- \lambda} = \frac{1}{a} \) for some \( a>1\). Hence \( \lim_{n\to\infty} n{\left(1-e^{-\lambda}\right)}^{n-1} = \lim_{n\to\infty} \frac{n}{a^{n-1}} =0 \) (proof: use L'Hôpital's rule, or note intuitively that as \(n\) tends to infinity, an exponential grows more rapidly than any polynomial).

Now taking the limit \( n \to \infty \) in (1), we get by the squeeze (or sandwich) theorem:

\( \lim_{n\to\infty} P(x_{(2)} >0) =0 \)
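The exact probability derived in part (a) and the lower bound can also be compared numerically. The short sketch below (function names are mine) encodes both expressions and illustrates the limit in part (b):

```python
import math

def p_second_order_stat_zero(n, lam):
    # exact value from the derivation above:
    # P(X_(2)=0) = 1 - (1 - e^{-lam})^{n-1} * (1 + (n-1) e^{-lam})
    q = 1 - math.exp(-lam)
    return 1 - q**(n - 1) * (1 + (n - 1) * math.exp(-lam))

def lower_bound(n, lam):
    # the bound to be proved: 1 - n (1 - e^{-lam})^{n-1}
    return 1 - n * (1 - math.exp(-lam))**(n - 1)
```

As a sanity check, for \(n=2\) the second order statistic is the maximum, so \(P(X_{(2)}=0) = P(X_1=0, X_2=0) = e^{-2\lambda}\), which the exact formula reproduces; and as \(n\) grows, \(P(X_{(2)}=0) \to 1\), matching part (b).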

Life Testing Experiment | ISI MStat 2017 PSB Problem 5

This is a problem from the ISI MStat 2017 Entrance Examination that tests how good your skills are at modeling a life-testing experiment using an exponential distribution.

The Problem:

The lifetime in hours of each bulb manufactured by a particular company follows an independent exponential distribution with mean \( \lambda \). We need to test the null hypothesis \( H_0: \lambda=1000 \) against \( H_1:\lambda=500 \).
A statistician sets up an experiment with \(50\) bulbs, with \(5\) bulbs in each of \(10\) different locations, to examine their lifetimes.

To get quick preliminary results, the statistician decides to stop the experiment as soon as one bulb fails at each location. Let \(Y_i\) denote the lifetime of the first bulb to fail at location \(i\). Obtain the most powerful test of \(H_0\) against \(H_1\) based on \(Y_1,Y_2,\ldots,Y_{10}\) and compute its power.

Prerequisites:

1. Properties of the Exponential/Gamma distributions.

2. Neyman-Pearson Lemma.

3. Order Statistics.

Proof:

As is clear from the arrangement of the bulbs, the first to fail (among the 5 at a given location) has the smallest lifetime among them.

That is, in more mathematical terms, for a location \( i \), we can write \( Y_i = \text{min}(X_{i1},X_{i2},..,X_{i5}) \).

Here, \(X_{ij} \) denotes the \( j \)-th unit in the \( i \)-th location, where \(i=1,2,\ldots,10 \) and \(j=1,2,\ldots,5\).

It is given that \( X_{ij} \sim \text{Exp}(\lambda) \).

Can you see that \( Y_i \) is exponential with mean \( \frac{\lambda}{5} \)? You may try to prove the following general result:

If \( X_1,\ldots,X_n \) is a random sample from an exponential distribution with mean \( \lambda \),

then \(X_{(1)}=\text{min}(X_1,\ldots,X_n) \) is exponential with mean \( \frac{\lambda}{n} \) (the failure rates add up).
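This minimum-of-exponentials fact is easy to verify by simulation. A quick sketch in the mean parameterization of the problem (the function name, seed, and parameter values are mine):

```python
import random

def min_lifetime(n, mean, rng):
    # minimum of n i.i.d. exponential lifetimes with the given mean
    return min(rng.expovariate(1.0 / mean) for _ in range(n))

rng = random.Random(2)
mean, n, trials = 1000.0, 5, 200_000
est = sum(min_lifetime(n, mean, rng) for _ in range(trials)) / trials
# theory: the minimum is exponential with mean mean/n = 200
```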

So, now we have \(Y_1,Y_2,\ldots,Y_{10} \) in hand, each exponential with mean \( \frac{\lambda}{5} \).

The joint pdf is therefore \( f(\mathbf{y} )={\left(\frac{5}{\lambda}\right)}^{10} e^{-\frac{5}{\lambda}\sum_{i=1}^{10} y_i} \).

For testing \( H_0: \lambda=1000 \) against \( H_1:\lambda=500 \), we use the Neyman Pearson Lemma.

We have the critical region of the most powerful test as \(\frac{f_{H_1}(\mathbf{y})}{f_{H_0}(\mathbf{y})} >c \)

which after simplification comes out to be \(\bar{Y} < K \), where \(K\) is an appropriate constant. (Note the direction of the inequality: the likelihood ratio is decreasing in \( \sum_i y_i \) because \(H_1\) corresponds to the smaller mean, so we reject \(H_0\) when the observed lifetimes are small.)

Also, see that \( \bar{Y} \sim \text{Gamma}(10, \frac{\lambda}{50}) \) (shape \(10\), scale \( \frac{\lambda}{50} \)), since \( \sum_{i=1}^{10} Y_i \) is a sum of \(10\) i.i.d. exponentials, each with mean \( \frac{\lambda}{5} \).

Can you use this fact to find the value of \(K\) using the size (\( \alpha\)) criterion ? (Exercise to the reader)

Also, find the power of the test.
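One way to carry out these last two steps numerically: since the shape parameter is the integer \(10\), the Gamma CDF has a closed (Erlang) form, so the cutoff can be found by bisection and the power evaluated directly. The sketch below is mine, not part of the official solution; it assumes each \(Y_i\) is exponential with mean \(\lambda/5\) (so \(\sum Y_i\) is Gamma with shape \(10\) and rate \(5/\lambda\)), rejects \(H_0\) for small \(\sum Y_i\) since \(H_1\) has the smaller mean, and uses an illustrative size \(\alpha = 0.05\) of my own choosing.

```python
import math

def erlang_cdf(x, k, rate):
    """P(S <= x) for S ~ Gamma(shape k, rate) with integer k (Erlang)."""
    if x <= 0:
        return 0.0
    partial = sum((rate * x)**i / math.factorial(i) for i in range(k))
    return 1.0 - math.exp(-rate * x) * partial

def critical_value(alpha, k, rate):
    # bisection for the c satisfying P(S <= c | H0) = alpha
    lo, hi = 0.0, 1e6
    for _ in range(200):
        mid = (lo + hi) / 2
        if erlang_cdf(mid, k, rate) < alpha:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# S = sum of the 10 Y_i; with bulb mean lambda, each Y_i has rate 5/lambda
alpha, k = 0.05, 10
c = critical_value(alpha, k, rate=5 / 1000)  # H0: lambda = 1000
power = erlang_cdf(c, k, rate=5 / 500)       # H1: lambda = 500
```

The same computation with \(\bar{Y}\) instead of \(\sum Y_i\) just rescales \(c\) by \(10\); the size and power are unchanged.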

Challenge Problem:

The exponential distribution is used widely to model lifetime of appliances. The following scenario is based on such a model.

Suppose electric bulbs have a lifetime distribution with pdf \(f(t)=\lambda e^{-\lambda t} \) where \( t \in [0, \infty) \) .

These bulbs are used individually for street lighting in a large number of posts. A bulb is replaced immediately after it burns out.

Let's break down the problem in steps.

(i) Starting from time \(t=0 \), the process is observed till \(t=T\). Can you calculate the expected number of replacements in a post during the interval \( (0,T) \)?

(ii) Hence, deduce \( g(t) \text{dt} \), the probability of a bulb being replaced in \( (t,t+ \text{dt}) \) for \( t < T \), irrespective of when the bulb was put in.

(iii) Next, suppose that at the end of the first interval of time \(T\), all bulbs which were put in the posts before time \(X < T \) and have not burned out are replaced by new ones, but the bulbs replaced after time \(X\) continue to be used, provided, of course, that they have not burned out.

Prove that with such a mixture of old and new bulbs, the probability of a bulb having an expected lifetime > \( \tau \) in the second interval of length \(T\) is given by

\( S_2(\tau)=\frac{1}{2}e^{-\lambda \tau}(1+ e^{-\lambda X}) \)

Also, try proving the general case where the lifetimes of the bulbs follow the pdf \(f(t)\) . Here, \(f(t)\) need not be the pdf of an exponential distribution .

You should be getting: \(S_2(\tau)=(1-p)S_1(\tau) + \int_{0}^{X} g(T-x)S_1(x)S_1(\tau +x) \text{dx}\), where \(\tau<T\),

where, \(p\) is the proportion of bulbs not replaced at time \(t=T\) and \(S_1(t)\) is the probability that a bulb has lifetime > \(t\).
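As a numerical check on part (i) of the challenge: with exponential lifetimes the replacement process is a Poisson process, so the expected number of replacements in \((0,T)\) should be \(\lambda T\). A quick simulation sketch (function name, seed, and parameter values are mine):

```python
import random

def replacements_in_interval(lam, T, rng):
    """Count burnouts in (0, T) when each bulb's lifetime is Exp(rate lam)."""
    t, count = 0.0, 0
    while True:
        t += rng.expovariate(lam)  # lifetime of the bulb currently in the post
        if t >= T:
            return count
        count += 1

rng = random.Random(3)
lam, T, trials = 2.0, 5.0, 100_000
est = sum(replacements_in_interval(lam, T, rng) for _ in range(trials)) / trials
# theory: E[count] = lam * T = 10
```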