
# Bayes comes to rescue | ISI MStat PSB 2007 Problem 7

This is a very beautiful sample problem from ISI MStat PSB 2007 Problem 7. It's a very simple problem that relies heavily on conditioning, and if you don't take it seriously, you will make things complicated. Fun to think about, go for it !!

## Problem- ISI MStat PSB 2007 Problem 7

Let $X$ and $Y$ be i.i.d. exponentially distributed random variables with mean $\lambda > 0$. Define $Z$ by:

$Z = \begin{cases} 1 & \text{if } X < Y \\ 0 & \text{otherwise} \end{cases}$

Find the conditional mean, $E(X|Z=1)$.

### Prerequisites

Conditional Distribution

Bayes Theorem

Exponential Distribution

## Solution:

This is a simple but very elegant problem that illustrates a unique and efficient technique for solving a class of problems which may seem analytically difficult.

Here, for $X$, $Y$ and $Z$ as defined in the question, let us first figure out what we need.

Sometimes, breaking a seemingly complex problem into simpler sub-problems makes the path to the final solution easier. In this problem, the sub-problems that I think will help us are: "What is the value of $P(X<Y)$ (equivalently, $P(Z=1)$)?", "What is the pdf of $X \mid X<Y$ (equivalently, of $X \mid Z=1$)?", and finally, "What is the conditional mean $E(X|Z=1)$?". We will answer these questions one by one.

For the very first question, "What is the value of $P(X<Y)$ (equivalently, $P(Z=1)$)?", the answer is relatively simple, and I leave it as an exercise !! The value one finds, if done correctly, is $\frac{1}{2}$. Verify it, and only then move forward!!
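As a quick sanity check (not part of the original solution), here is a short simulation sketch; the variable names are my own, and the probability should come out near $\frac{1}{2}$ regardless of the chosen mean:

```python
import random

# Simulate i.i.d. Exponential pairs with a common mean and estimate P(X < Y).
# By symmetry of i.i.d. continuous variables, the answer should be about 1/2.
random.seed(0)
lam = 2.0          # any mean > 0 works; the probability does not depend on it
n = 100_000
# random.expovariate takes the *rate*, so the rate is 1/lam for mean lam
count = sum(random.expovariate(1 / lam) < random.expovariate(1 / lam)
            for _ in range(n))
p_hat = count / n
print(p_hat)  # close to 0.5
```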

The second question is the most vital and beautiful part of the problem. We generally do this kind of problem using the general definition of conditional probability, which you can certainly try, but you will face some difficulties. These can be avoided entirely by using the continuous form of Bayes' rule, which we are not often encouraged to use !! I don't really know why, though !

Let us find the conditional CDF of $X \mid Z=1$:

$P(X \le x \mid Z=1) = \int^x_0 f_{X|X<Y}(t)\, dt, \qquad x>0,$

where $f_{X|X<Y}(x)$ is the conditional pdf we are interested in. Now we can apply Bayes' rule to $f_{X|X<Y}(x)$:

$f_{X|X<Y}(x) = \frac{P(Z=1|X=x)\, f_X(x)}{P(Z=1)} = \frac{P(Y>x)\, f_X(x)}{P(X<Y)} = \frac{e^{-\frac{x}{\lambda}} \cdot \frac{1}{\lambda} e^{-\frac{x}{\lambda}}}{\frac{1}{2}} = \frac{2}{\lambda} e^{-\frac{2x}{\lambda}}$

Plugging this into the CDF above, we can easily verify that $X|Z=1$ is exponentially distributed with mean $\frac{\lambda}{2}$. (We can't conclude this directly from the pdf because pdfs are not unique. Can you give such an example? Think about it !)
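As a quick verification, integrating the pdf gives the CDF explicitly:

$P(X \le x \mid Z=1) = \int^x_0 \frac{2}{\lambda} e^{-\frac{2t}{\lambda}}\, dt = 1 - e^{-\frac{2x}{\lambda}}, \qquad x>0,$

which is precisely the CDF of an exponential distribution with mean $\frac{\lambda}{2}$.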

Now that we have successfully answered the first two questions, it is easy to answer the last one: since $X|Z=1$ is exponential with mean $\frac{\lambda}{2}$, its mean is

$E(X|Z=1)=\frac{\lambda}{2}.$
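One can also check this answer numerically (again, my own verification sketch, not part of the original solution): sample pairs, keep $X$ only when $X<Y$, and average.

```python
import random

# Estimate E(X | X < Y) for i.i.d. Exponential(mean=lam) pairs.
# The derivation above predicts the answer lam / 2.
random.seed(1)
lam = 3.0
xs = []
for _ in range(200_000):
    x = random.expovariate(1 / lam)   # rate = 1/mean
    y = random.expovariate(1 / lam)
    if x < y:                         # keep only the Z = 1 cases
        xs.append(x)
mean_given_z1 = sum(xs) / len(xs)
print(mean_given_z1)  # close to lam/2 = 1.5
```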

Hence the solution concludes.

## Food For Thought

Let us provide an interesting problem before concluding.

There are $k+1$ machines in a shop, all engaged in the mass production of an item. The $i$th machine produces defectives with probability $\frac{i}{k}$, $i=0,1,2,\ldots,k$. A machine is selected at random, and the items it produces are repeatedly sampled. If the first $n$ products are all defective, show that the conditional probability that the $(n+1)$th sampled product will also be defective is approximately equal to $\frac{n+1}{n+2}$ when $k$ is large.

Can you show it? Give it a try !!
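If you want to check the limiting answer before attempting a proof, here is a small numerical sketch (my own, under the natural reading that the machine is chosen uniformly at random): the conditional probability is a ratio of two averages over the machine index.

```python
# P(first n+1 defective) / P(first n defective), averaging over a uniformly
# chosen machine i with defective probability i/k.  For large k these sums
# approximate integrals of p^(n+1) and p^n, giving (n+1)/(n+2).
k, n = 10_000, 5
probs = [i / k for i in range(k + 1)]
num = sum(p ** (n + 1) for p in probs)   # proportional to P(n+1 defectives)
den = sum(p ** n for p in probs)         # proportional to P(n defectives)
cond = num / den
print(cond, (n + 1) / (n + 2))  # both close to 6/7
```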
