
## ISI MStat PSB 2007 Problem 3 | Application of L’hospital Rule

This is a very beautiful sample problem from ISI MStat PSB 2007 Problem 3 based on the use of L'Hôpital's rule. Let's give it a try!!

## Problem– ISI MStat PSB 2007 Problem 3

Let $f$ be a function such that $f(0)=0$ and $f$ has derivatives of all orders. Show that $\lim _{h \to 0} \frac{f(h)+f(-h)}{h^{2}}=f''(0)$,
where $f''(0)$ is the second derivative of $f$ at 0.

### Prerequisites

Differentiability

Continuity

L'Hôpital's rule

## Solution :

Let $L= \lim _{h \to 0} \frac{f(h)+f(-h)}{h^{2}}$. This is a $\frac{0}{0}$ form, since $f(0)=0$ makes the numerator tend to $f(0)+f(0)=0$.

So here we can use L'Hôpital's rule, as $f$ is differentiable.

We get L= $\lim _{h \to 0} \frac{f'(h)-f'(-h)}{2h} = \lim _{h \to 0} \frac{(f'(h)-f'(0)) -(f'(-h)-f'(0))}{2h}$

= $\lim _{h \to 0} \frac{f'(h)-f'(0)}{2h} + \lim _{k \to 0} \frac{f'(k)-f'(0)}{2k}$, substituting $k=-h$ in the second term.

= $\frac{f''(0)}{2} + \frac{f''(0)}{2} = f''(0)$, using the definition of the derivative of $f'$ at 0. Hence done!
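The limit can also be sanity-checked numerically. The sketch below is an illustration, not part of the proof; the test function $f(x)=e^{x}-1$ is an arbitrary choice satisfying $f(0)=0$, with $f''(0)=1$.

```python
import math

# Illustrative test function (an assumption for this check): f(x) = e^x - 1,
# so f(0) = 0 and f''(0) = e^0 = 1.
def f(x):
    return math.exp(x) - 1.0

# The symmetric quotient (f(h)+f(-h))/h^2 should approach f''(0) = 1.
for h in [1e-1, 1e-2, 1e-3]:
    q = (f(h) + f(-h)) / h**2
    print(h, q)
```

For $h=10^{-3}$ the quotient already agrees with $f''(0)=1$ to several decimal places.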

## Food For Thought

Let $f:[0,1] \rightarrow[0,1]$ be a continuous function, and write $f^{(n)} := \underbrace{f \circ f \circ \cdots \circ f}_{n \text{ times}}$. Assume that there exists a positive integer $m$ such that $f^{(m)}(x)=x$ for all $x \in[0,1]$. Prove that $f(x)=x$ for all $x \in[0,1]$.


## ISI MStat PSB 2009 Problem 2 | Linear Difference Equation

This is a very beautiful sample problem from ISI MStat PSB 2009 Problem 2 based on the convergence of a sequence. Let's give it a try!!

## Problem– ISI MStat PSB 2009 Problem 2

Let $\{x_{n}: n \geq 0\}$ be a sequence of real numbers such that
$x_{n+1}=\lambda x_{n}+(1-\lambda) x_{n-1}, n \geq 1,$ for some $0<\lambda<1$
(a) Show that $x_{n}=x_{0}+(x_{1}-x_{0}) \sum_{k=0}^{n-1}(\lambda-1)^{k}$
(b) Hence, or otherwise, show that $x_{n}$ converges and find the limit.

### Prerequisites

Limit

Sequence

Linear Difference Equation

## Solution :

(a) We are given that $x_{n+1}=\lambda x_{n}+(1-\lambda) x_{n-1}, n \geq 1,$ for some $0<\lambda<1$

So, $x_{n+1} - x_{n} = -(1- \lambda)( x_n-x_{n-1})$ —- (1)

Again using (1) we have $( x_n-x_{n-1})= -(1- \lambda)( x_{n-1}-x_{n-2})$.

Now putting this in (1) we have $x_{n+1} - x_{n} = {(-(1- \lambda))}^2 ( x_{n-1}-x_{n-2})$.

So, proceeding like this we have $x_{n+1} - x_{n} = {(-(1- \lambda))}^n ( x_{1}-x_{0}) = {(\lambda - 1)}^n (x_1 - x_0)$ for all $n \geq 1$ —- (2)

So, from (2) we have $x_{n} - x_{n-1} = {(\lambda-1)}^{n-1} ( x_{1}-x_{0})$, $\cdots$, $x_2-x_1=(\lambda-1)(x_1-x_{0})$ and $x_1-x_{0}= {(\lambda-1)}^0 (x_{1}-x_{0})$.

Adding all the above $n$ equations (the left side telescopes) we have $x_{n}-x_{0}=(x_{1}-x_{0}) \sum_{k=0}^{n-1} {(\lambda-1)}^{k}$

Hence, $x_{n}=x_{0}+(x_{1}-x_{0}) \sum_{k=0}^{n-1}(\lambda-1)^{k}$ (proved).
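A quick numeric check of this closed form against the recursion itself; the values $\lambda=0.3$, $x_0=1$, $x_1=2$ are arbitrary illustrative choices.

```python
# Check x_n = x0 + (x1-x0) * sum_{k=0}^{n-1} (lam-1)^k against the recursion.
lam, x0, x1 = 0.3, 1.0, 2.0   # arbitrary illustrative values

# iterate the recurrence x_{n+1} = lam*x_n + (1-lam)*x_{n-1}
xs = [x0, x1]
for _ in range(20):
    xs.append(lam * xs[-1] + (1 - lam) * xs[-2])

# the closed form proved in (a)
def closed(n):
    return x0 + (x1 - x0) * sum((lam - 1) ** k for k in range(n))

for n in range(len(xs)):
    assert abs(xs[n] - closed(n)) < 1e-9
print("closed form matches recursion")
```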

(b) Summing the geometric series, we now have the explicit form $x_{n}=x_{0}+(x_{1}-x_{0}) \times \frac{1-{( \lambda -1)}^n}{1-(\lambda -1)} = x_{0}+(x_{1}-x_{0}) \times \frac{1-{( \lambda -1)}^n}{2-\lambda}$ —-(3)

Since $0<\lambda<1$ we have $-1 < \lambda -1 < 0$, so $\lim_{n\to\infty} {( \lambda - 1)}^{n} = 0$.

Hence, taking $\lim_{n\to\infty}$ on both sides of (3), we see that $x_{n}$ converges and $\lim_{n\to\infty} x_{n} = x_{0}+(x_{1}-x_{0}) \times \frac{1}{2 - \lambda}$.
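The limit can be confirmed numerically as well, again with the illustrative values $\lambda=0.3$, $x_0=1$, $x_1=2$ (so the predicted limit is $1 + 1/1.7$).

```python
# Iterate the recurrence far enough that (lambda-1)^n is negligible,
# then compare with the predicted limit x0 + (x1-x0)/(2-lambda).
lam, x0, x1 = 0.3, 1.0, 2.0   # arbitrary illustrative values
x_prev, x_cur = x0, x1
for _ in range(200):
    x_prev, x_cur = x_cur, lam * x_cur + (1 - lam) * x_prev

limit = x0 + (x1 - x0) / (2 - lam)
print(x_cur, limit)
```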

## Food For Thought

$m$ and $k$ are two natural numbers, and $a_{1}, a_{2}, \ldots, a_{m}$ and $b_{1}, b_{2}, \ldots, b_{k}$ are two sets of positive real numbers such that $a_{1}^{\frac{1}{n}}+a_{2}^{\frac{1}{n}}+\cdots+a_{m}^{\frac{1}{n}} = b_{1}^{\frac{1}{n}}+\cdots+b_{k}^{\frac{1}{n}}$

for all natural numbers $n$. Then prove that $m=k$ and $a_{1} a_{2} \ldots a_{m}=b_{1} b_{2} \ldots b_{k}$.


## ISI MStat PSB 2012 Problem 6 | Tossing a biased coin

This is a very beautiful sample problem from ISI MStat PSB 2012 Problem 6 based on conditional probability. Let's give it a try!!

## Problem– ISI MStat PSB 2012 Problem 6

There are two biased coins – one which has probability $1 / 4$ of showing
heads and $3 / 4$ of showing tails, while the other has probability $3 / 4$ of showing heads and $1 / 4$ of showing tails when tossed. One of the two coins is chosen at random and is then tossed 8 times.

(a) Given that the first toss shows heads, what is the probability that in the next 7 tosses there will be exactly 6 heads and 1 tail?
(b) Given that the first toss shows heads and the second toss shows tail, what is the probability that the next 6 tosses all show heads?

### Prerequisites

Basic Counting Principle

## Solution :

Let $A_1$: the coin with head probability 1/4 and tail probability 3/4 is chosen,
$A_2$: the coin with head probability 3/4 and tail probability 1/4 is chosen,
B: the first toss shows heads and the next 7 tosses show exactly 6 heads and 1 tail,
C: the first toss shows heads, the second toss shows tails, and the next 6 tosses all show heads.

(a) $P(B)=P(B|A_1)P(A_1) + P(B|A_2)P(A_2)$

Now, $P(B|A_1)= \frac{1}{4} \times {7 \choose 1} \times (\frac{1}{4})^{6} \times \frac{3}{4}$

since, given coin 1, the first toss shows heads with probability 1/4, and out of the next 7 tosses we choose which single toss shows the tail, which occurs with probability ${7 \choose 1} \times (\frac{1}{4})^{6} \times \frac{3}{4}$.

Similarly we can calculate $P(B|A_2)$, and $P(A_1)=P(A_2)= 1/2$, the probability of choosing either one of the 2 coins.

Therefore, $P(B)=P(B|A_1)P(A_1) + P(B|A_2)P(A_2)$

= $\frac{1}{4} \times {7 \choose 1} \times\left(\frac{1}{4}\right)^{6} \times \frac{3}{4} \times\frac{1}{2}+\frac{3}{4} \times {7 \choose 1} \times\left(\frac{3}{4}\right)^{6} \times \frac{1}{4} \times \frac{1}{2}$

Since the question asks for a probability conditional on the first toss showing heads, we finally divide by $P(\text{first toss heads}) = \frac{1}{2} \times \frac{1}{4}+\frac{1}{2} \times \frac{3}{4} = \frac{1}{2}$, so the required answer is $P(B)/P(\text{first toss heads}) = 2P(B)$.

(b) Similarly to (a) we get

$P(C)= \frac{1}{4} \times \frac{3}{4} \times (\frac{1}{4})^{6} \times \frac{1}{2}+\frac{3}{4} \times \frac{1}{4} \times\left(\frac{3}{4}\right)^{6} \times \frac{1}{2}$.

Here we don't need to choose anything, as the outcome of every toss is specified; we just sum over the two coins. Again, the required conditional answer is $P(C)$ divided by $P(\text{first toss heads, second toss tails}) = \frac{1}{2} \times \frac{1}{4} \times \frac{3}{4}+\frac{1}{2} \times \frac{3}{4} \times \frac{1}{4} = \frac{3}{16}$.
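As a cross-check, everything here can be computed exactly with rational arithmetic. The sketch below reproduces the two mixture probabilities, and the last two values divide by the probability of the conditioning event, which is what the questions actually ask for.

```python
from fractions import Fraction
from math import comb

half = Fraction(1, 2)
p1, p2 = Fraction(1, 4), Fraction(3, 4)   # head probabilities of the two coins

# P(first toss H, then exactly 6 H and 1 T in next 7) for head-probability p
def pB(p):
    return p * comb(7, 6) * p**6 * (1 - p)

P_B = half * pB(p1) + half * pB(p2)

# P(tosses start H, T, then 6 heads) for head-probability p
def pC(p):
    return p * (1 - p) * p**6

P_C = half * pC(p1) + half * pC(p2)
print(P_B, P_C)

# dividing by the probability of the conditioning event gives the
# conditional probabilities asked for in (a) and (b)
P_first_H = half * p1 + half * p2                     # = 1/2
P_HT = half * p1 * (1 - p1) + half * p2 * (1 - p2)    # = 3/16
print(P_B / P_first_H, P_C / P_HT)
```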

## Food For Thought

There are 10 boxes each containing 6 white and 7 red balls. Two different boxes are chosen at random, one ball is drawn simultaneously at random from each and transferred to the other box. Now a box is again chosen from the 10 boxes and a ball is chosen from it.Find out the probability of the ball being white.


## ISI MStat PSB 2013 Problem 3 | Number of distinct integers

This is a very beautiful sample problem from ISI MStat PSB 2013 Problem 3 based on the counting principle. Let's give it a try!!

## Problem– ISI MStat PSB 2013 Problem 3

Suppose integers are formed by taking one or more digits from the following: $2,2,3,3,4,5,5,5,6,7$. For example, 355 is a possible choice while 44 is not. Find the number of distinct integers that can be formed in which
(a) the digits are non-decreasing;
(b) the digits are strictly increasing.

### Prerequisites

Basic Counting Principle

## Solution :

(a) To find the number of integers with non-decreasing digits we can argue as follows.

First note that the relative order of the digits is forced, as they have to be non-decreasing; all we can choose is the number of times each particular digit occurs.

For example, 223556 is an integer with non-decreasing digits: here 2 occurs 2 times, 3 occurs 1 time, 4 does not occur, 5 occurs 2 times, 6 occurs 1 time, and finally 7 does not occur.
So, here we have 3 possible choices (0, 1 or 2 occurrences) for the digit 2, 3 possible choices for 3, 2 possible choices for 4, 4 possible choices for 5, and 2 possible choices each for 6 and 7.

Hence, the number of such integers $=3 \times 3 \times 2 \times 4 \times 2 \times 2-1 = 287$.

We subtract 1 to exclude the case where no digit occurs at all.

(b) To find the number of integers with strictly increasing digits we can go by the above method. Here each digit must be strictly greater than the preceding one, so no digit can repeat. Hence each of the six distinct digits $2,3,4,5,6,7$ has two choices (0 or 1 occurrences). Therefore the number of such integers $= 2^{6}-1 = 63$.

Again, we subtract 1 to exclude the case where no digit occurs at all.
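Both counts can be verified by brute-force enumeration over the multiset of digits; this is an independent check, not part of the argument.

```python
from itertools import combinations

digits = [2, 2, 3, 3, 4, 5, 5, 5, 6, 7]

# Choose every non-empty subset of positions, sort the chosen digits, and
# record the resulting tuple; distinct tuples = distinct non-decreasing integers.
nondecreasing = set()
for r in range(1, len(digits) + 1):
    for combo in combinations(digits, r):
        nondecreasing.add(tuple(sorted(combo)))

# strictly increasing = non-decreasing with no repeated digit
strictly_increasing = {s for s in nondecreasing if len(set(s)) == len(s)}

print(len(nondecreasing), len(strictly_increasing))  # expect 287 and 63
```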

## Food For Thought

A number is chosen randomly from all the 5-digit numbers. Find the probability that its digits form a non-decreasing sequence.

Hint 1: Find what is invariant here.

Hint 2: Use the formula for the number of non-negative integer solutions of an equation.


## ISI MStat PSB 2013 Problem 8 | Finding the Distribution of a Random Variable

This is a very beautiful sample problem from ISI MStat PSB 2013 Problem 8 based on finding the distribution of a random variable. Let's give it a try!!

## Problem– ISI MStat PSB 2013 Problem 8

Suppose $X_{1}$ is a standard normal random variable. Define
$X_{2}= \begin{cases} - X_{1} & \text{if } |X_{1}|<1 \\ X_{1} & \text{otherwise} \end{cases}$
(a) Show that $X_{2}$ is also a standard normal random variable.
(b) Obtain the cumulative distribution function of $X_{1}+X_{2}$ in terms of the cumulative distribution function of a standard normal random
variable.

### Prerequisites

Cumulative Distribution Function

Normal Distribution

## Solution :

(a) Let $F_{X_{2}}(x)$ be the distribution function of $X_{2}$; then we can say that

$F_{X_{2}}(x) = P( X_{2} \le x) = P( X_{2} \le x \mid |X_{1}| < 1) P( |X_{1}| <1) + P( X_{2} \le x \mid |X_{1}| > 1 ) P( |X_{1}| >1)$

(the event $|X_{1}|=1$ has probability 0, so it can be ignored)

= $P( - X_{1} \le x \mid |X_{1}| < 1)P( |X_{1}| <1) + P( X_{1} \le x \mid |X_{1}| > 1 ) P( |X_{1}| >1)$

= $P( - X_{1}\le x , |X_{1}| < 1 ) + P( X_{1} \le x , |X_{1}| > 1 )$

= $P( X_{1}\le x , |-X_{1}| < 1 ) + P( X_{1} \le x , |X_{1}| > 1 )$

Since $X_{1} \sim N(0,1)$, its distribution is symmetric about 0. So $X_{1}$ and $-X_{1}$ are identically distributed.

Therefore , $F_{X_{2}}(x) = P( X_{1}\le x , |X_{1}| < 1 ) + P( X_{1} \le x , |X_{1}| > 1 )$

=$P(X_{1} \le x ) = \Phi(x)$

Hence , $X_{2}$ is also a standard normal random variable.

(b) Let , $Y= X_{1} + X_{2} = \begin{cases} 0 & \text{if } |X_{1}|<1 \\ 2X_{1} & \text{ otherwise } \end{cases}$

Distribution function $F_{Y}(y) = P(Y \le y)$

=$P(Y \le y \mid |X_{1}| < 1) P(|X_{1}| <1) + P( Y\le y \mid |X_{1}| >1)P(|X_{1}|>1)$

= $P( 0 \le y , -1 \le X_{1} \le 1 ) + P( 2X_{1} \le y , ( X_{1} >1 \cup X_{1}<-1))$

= $P(0 \le y , -1 \le X_{1} \le 1 ) + P( X_{1} \le \frac{y}{2} , X_{1} > 1) + P( X_{1} \le \frac{y}{2} , X_{1} < -1)$

= $P(0 \le y , -1 \le X_{1} \le 1 ) + P( 1< X_{1} \le \frac{y}{2}) + P( X_{1} \le \min\{ \frac{y}{2} , -1 \} )$

= $\begin{cases} P( -1 \le X_{1} \le 1 ) + P( 1< X_{1} \le \frac{y}{2}) + P( X_{1} \le -1) & y \ge 2 \\ P( -1 \le X_{1} \le 1 ) + P( X_{1} \le -1) & 0 \le y < 2 \\ P( X_{1} \le -1) & -2 \le y < 0 \\ P( X_{1} \le \frac{y}{2}) & y<-2 \end{cases}$

= $\begin{cases} \Phi( \frac{y}{2} ) & y<-2 \\ \Phi(-1) & -2 \le y < 0 \\ \Phi(1) & 0 \le y <2 \\ \Phi(\frac{y}{2} ) & y \ge 2 \end{cases}$ .
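Both parts lend themselves to a Monte Carlo sanity check: simulate $X_1$, build $X_2$ and $Y = X_1 + X_2$, and compare empirical CDFs with the formulas above (e.g. $F_Y(1)=\Phi(1)$ and $F_Y(-1)=\Phi(-1)$). The sample size and seed are arbitrary choices.

```python
import random, math

random.seed(0)

def Phi(x):   # standard normal CDF
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

n = 200_000
x2s, ys = [], []
for _ in range(n):
    x1 = random.gauss(0, 1)
    x2 = -x1 if abs(x1) < 1 else x1   # the definition of X_2
    x2s.append(x2)
    ys.append(x1 + x2)                # Y = X_1 + X_2

# (a) X_2 should be standard normal: check its empirical CDF at 0.5
print(sum(x <= 0.5 for x in x2s) / n, Phi(0.5))

# (b) F_Y(1) should equal Phi(1), and F_Y(-1) should equal Phi(-1)
print(sum(y <= 1 for y in ys) / n, Phi(1))
print(sum(y <= -1 for y in ys) / n, Phi(-1))
```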

## Food For Thought

Find the distribution function of $2X_{1}-X_{2}$ in terms of the cumulative distribution function of a standard normal random variable.


## ISI MStat PSB 2009 Problem 5 | Finding the Distribution of a Random Variable

This is a very beautiful sample problem from ISI MStat PSB 2009 Problem 5 based on finding the distribution of a random variable. Let's give it a try!!

## Problem– ISI MStat PSB 2009 Problem 5

Suppose $F$ and $G$ are continuous and strictly increasing distribution
functions. Let $X$ have distribution function $F$ and $Y=G^{-1}( F(X))$
(a) Find the distribution function of Y.
(b) Hence, or otherwise, show that the joint distribution function of $(X, Y),$ denoted by $H(x, y),$ is given by $H(x, y)=\min (F(x), G(y))$.

### Prerequisites

Cumulative Distribution Function

Inverse of a function

Minimum of two function

## Solution :

(a) Let $F_{Y}(y)$ be the cumulative distribution function of $Y=G^{-1}(F(X))$.
Then, $F_{Y}(y)=P(Y \le y) =P(G^{-1}(F(X)) \le y)$
=$P(F(X) \le G(y))$

[ applying G to both sides; since G is strictly increasing, the inequality is preserved ]
= $P(X \le F^{-1}(G(y)))$

[ applying $F^{-1}$ to both sides; since F is a continuous, strictly increasing distribution function, $F^{-1}$ exists and the inequality is preserved ]

=$F(F^{-1}(G(y)))$ [ since F is the distribution function of X ]
=G(y)

Therefore the cumulative distribution function of $Y=G^{-1}(F(X))$ is $G$.

(b) The joint distribution function of $(X, Y)$ is $H(x, y)=P(X \leq x, Y \leq y)$, so we have

$H(x, y)=P(X \leq x, Y \leq y) =P(X \leq x, G^{-1}(F(X)) \leq y) =P(X \leq x, F(X) \leq G(y))$

=$P(F(X) \leq F(x), F(X) \leq G(y))$

[ since F is strictly increasing, $X \le x$ holds if and only if $F(X) \le F(x)$ ]
= $P(F(X) \leq \min \{F(x), G(y)\}) =P(X \leq F^{-1}(\min \{F(x), G(y)\}))$

=$F(F^{-1}(\min \{F(x), G(y)\})) =\min \{F(x), G(y)\}$ [ since F is the CDF of X ]

Therefore , the joint distribution function of $(X, Y),$ denoted by $H(x, y),$ is given by $H(x, y)=\min (F(x), G(y))$
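The identity $H(x,y)=\min(F(x), G(y))$ can be checked by simulation for a concrete pair of distribution functions. The choice below ($F$ the Exp(1) CDF, $G$ the Exp(2) CDF, evaluation point $(0.9, 0.6)$) is an arbitrary illustration.

```python
import random, math

random.seed(1)

# Illustrative choice (an assumption for this check): F is the Exp(1) CDF and
# G is the Exp(2) CDF; both are continuous and strictly increasing on (0, inf).
def F(x): return 1 - math.exp(-x)
def G(y): return 1 - math.exp(-2 * y)
def G_inv(u): return -math.log(1 - u) / 2   # inverse of G on (0, 1)

n = 200_000
x0, y0 = 0.9, 0.6   # arbitrary evaluation point (x, y)
hits = 0
for _ in range(n):
    X = random.expovariate(1.0)   # X has distribution function F
    Y = G_inv(F(X))               # the construction in the problem
    hits += (X <= x0 and Y <= y0)

print(hits / n, min(F(x0), G(y0)))
```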

## Food For Thought

Find the distribution function of $Y=G^{-1}( F(X))$ where $G$ is a continuous and strictly decreasing function.


## ISI MStat PSB 2004 Problem 7 | Finding the Distribution of a Random Variable

This is a very beautiful sample problem from ISI MStat PSB 2004 Problem 7 based on finding the distribution of a random variable. Let's give it a try!!

## Problem– ISI MStat PSB 2004 Problem 7

Suppose X has a normal distribution with mean 0 and variance 25 . Let Y be an independent random variable taking values -1 and 1 with
equal probability. Define $S=X Y+\frac{X}{Y}$ and $T=X Y-\frac{X}{Y}$
(a) Find the probability distribution of $S$.
(b) Find the probability distribution of $(\frac{S+T}{10})^{2}$.

### Prerequisites

Cumulative Distribution Function

Normal distribution

## Solution :

(a) We can write $S = \begin{cases} 2X & \text{if } Y=1 \\ -2X & \text{if } Y=-1 \end{cases}$

Let Cumulative distribution function of S be denoted by $F_{S}(s)$ . Then ,

$F_{S}(s) = P(S \le s) = P(S \le s | Y=1)P(Y=1) + P(S \le s| Y=-1)P(Y=-1) = P(2X \le s) \times \frac{1}{2} + P(-2X \le s) \times \frac{1}{2}$ —-(1)

Here it is given that Y takes the values 1 and -1 with equal probability, so $P(Y=1)=P(Y=-1)= \frac{1}{2}$.

Now, as $X \sim N(0, 5^2)$, the distribution of X is symmetric about 0. Thus X and -X are identically distributed.

Thus from (1) we get $F_{S}(s) = P(X \le s/2 ) \times \frac{1}{2} + P(-X \le s/2) \times \frac{1}{2} = P(X \le s/2 ) \times \frac{1}{2} + P(X \le s/2) \times \frac{1}{2} = P( X \le s/2) = P(\frac{X-0}{5} \le \frac{s}{10} ) = \Phi(\frac{s}{10})$

Hence $S \sim N(0,{10}^2)$.

(b) Let $K=(\frac{S+T}{10})^{2}$. Since $S+T = 2XY$, we have $K = \frac{(2XY)^2}{10^2} = \frac{4X^{2}Y^{2}}{100} = \frac{X^{2}}{25}$, because $Y^{2}=1$. (Note that K does not depend on Y at all.)

Let the C.D.F. of K be $F_{K}(k)$. For $k \ge 0$,

$F_{K}(k) = P(K \le k ) = P( \frac{X^{2}}{25} \le k ) = P( -5 \sqrt{k} \le X \le 5 \sqrt{k} ) = \Phi(\sqrt{k}) - \Phi(-\sqrt{k})$, since $\frac{X}{5} \sim N(0,1)$.

In other words, $K = (\frac{X}{5})^{2}$ is the square of a standard normal random variable, i.e. $K \sim \chi^{2}_{1}$.
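A simulation sanity check of both parts; the sample size and seed are arbitrary. $S$ should match $N(0,10^2)$ (so $F_S(10)=\Phi(1)$), and $K=(\frac{S+T}{10})^2 = \frac{X^2}{25}$ should match $\chi^2_1$ (so $F_K(1)=2\Phi(1)-1$).

```python
import random, math

random.seed(2)

def Phi(x):   # standard normal CDF
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

n = 200_000
s_vals, k_vals = [], []
for _ in range(n):
    X = random.gauss(0, 5)        # X ~ N(0, 25)
    Y = random.choice([-1, 1])    # Y = +/-1 with equal probability
    S = X * Y + X / Y
    T = X * Y - X / Y
    s_vals.append(S)
    k_vals.append(((S + T) / 10) ** 2)

# S should behave like N(0, 10^2): F_S(10) = Phi(1)
print(sum(s <= 10 for s in s_vals) / n, Phi(1))

# K should behave like chi-square with 1 df: F_K(1) = 2*Phi(1) - 1
print(sum(k <= 1 for k in k_vals) / n, 2 * Phi(1) - 1)
```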

## Food For Thought

Find the distribution of T .


## ISI MStat PSB 2004 Problem 4 | Calculating probability using Uniform Distribution

This is a very beautiful sample problem from ISI MStat PSB 2004 Problem 4 based on finding a probability using the uniform distribution. Let's give it a try!!

## Problem– ISI MStat PSB 2004 Problem 4

Two policemen are sent to watch a road that is $1 \mathrm{km}$ long. Each of the two policemen is assigned a position on the road which is chosen according to a uniform distribution along the length of the road and independent of the other’s position. Find the probability that the
policemen will be less than 1/4 kilometer apart when they reach their assigned posts.

### Prerequisites

Uniform Distribution

Basic geometry

## Solution :

Let X be the position of one policeman and Y the position of the other policeman on the road of 1 km length.

As each position is chosen according to a uniform distribution along the length of the road, independently of the other's position, we can say that $X \sim U(0,1)$, $Y \sim U(0,1)$, and X, Y are independent.

Now we have to find the probability that the policemen will be less than 1/4 kilometer apart when they reach their assigned posts , which is

nothing but $P(|X-Y|< \frac{1}{4} )$ .

So, let's calculate the probability $P(|X-Y|< \frac{1}{4} )$; here some basic geometry will help to calculate it easily!

In general we have $0<X<1$ and $0<Y<1$, and since the joint density is uniform on the unit square, probabilities are just areas: the total probability is the area of the $1 \times 1$ square.

The favourable region is the band $|X-Y|<1/4$ inside the square, i.e. the square minus the two corner triangles (one above and one below the diagonal), each right-angled with legs of length $\frac{3}{4}$. So the favourable area = $1 \times 1 - 2 \times \frac{1}{2} \times \frac{3}{4} \times \frac{3}{4} = 1- \frac{9}{16}$.

Therefore $P(|X-Y|<1/4)= \frac{1-9/16}{1} = \frac{7}{16}$
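The area computation is easy to confirm by simulation:

```python
import random

random.seed(3)

# Drop two independent U(0,1) positions and count how often they are
# less than 1/4 apart; compare with the exact answer 7/16 = 0.4375.
n = 200_000
hits = sum(abs(random.random() - random.random()) < 0.25 for _ in range(n))
print(hits / n, 7 / 16)
```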

## Food For Thought

Calculate the same probability when the road has length $b-a$, with $b>a$ both positive real numbers.


## ISI MStat PSB 2005 Problem 2 | Calculating probability using Binomial Distribution

This is a very beautiful sample problem from ISI MStat PSB 2005 Problem 2 based on finding a probability using the binomial distribution. Let's give it a try!!

## Problem– ISI MStat PSB 2005 Problem 2

Let $X$ and $Y$ be independent random variables with X having a binomial distribution with parameters 5 and $1 / 2$ and $Y$ having a binomial distribution with parameters 7 and $1 / 2 .$ Find the probability that $|X-Y|$ is even.

### Prerequisites

Binomial Distribution

Binomial Expansion

Parity Check

## Solution :

Given $X \sim$ Bin(5,1/2) and $Y \sim$ Bin(7,1/2), and they are independent.

Now, we have to find $P(|X-Y| \text{ is even})$.

$|X-Y|$ is even if and only if X and Y are both even or both odd.

Therefore $P(|X-Y| \text{ is even})=P(X \text{ even}, Y \text{ even}) + P(X \text{ odd}, Y \text{ odd})$

P(X=even , Y= even ) =$( {5 \choose 0} {(\frac{1}{2})}^5 + {5 \choose 2} {(\frac{1}{2})}^5 + \cdots + {5 \choose 4} {(\frac{1}{2})}^5 )( {7 \choose 0} {(\frac{1}{2})}^7 + {7 \choose 2} {(\frac{1}{2})}^7 + \cdots + {7 \choose 6} {(\frac{1}{2})}^7)$

=$({(\frac{1}{2})}^5 \times \frac{2^5}{2})({(\frac{1}{2})}^7 \times \frac{2^7}{2})$, using $\sum_{k \text{ even}} {n \choose k} = 2^{n-1}$

= $\frac{1}{4}$

Similarly, one can find $P(X \text{ odd}, Y \text{ odd})$, which also comes out to be $\frac{1}{4}$.

Hence, $P(|X-Y| \text{ is even}) = \frac{1}{4}+\frac{1}{4} = \frac{1}{2}$.
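The parity argument can be confirmed exactly with rational arithmetic:

```python
from fractions import Fraction
from math import comb

half = Fraction(1, 2)
pX = [comb(5, k) * half**5 for k in range(6)]   # pmf of X ~ Bin(5, 1/2)
pY = [comb(7, k) * half**7 for k in range(8)]   # pmf of Y ~ Bin(7, 1/2)

# |X - Y| is even exactly when X and Y have the same parity
p_even = sum(pX[i] * pY[j]
             for i in range(6) for j in range(8)
             if (i - j) % 2 == 0)
print(p_even)   # 1/2
```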

## Food For Thought

Try to find $P(X-Y \text{ is odd})$ under the same conditions as given in the above problem.


## ISI MStat PSB 2013 Problem 2 | Application of sandwich Theorem

This is a very beautiful sample problem from ISI MStat PSB 2013 Problem 2 based on the use of the Sandwich Theorem. Let's give it a try!!

## Problem– ISI MStat PSB 2013 Problem 2

Let f be a real valued function satisfying $|f(x)-f(a)| \leq C|x-a|^{\gamma}$ for all $x$, for some $\gamma>0$ and $C>0$.
(a) If $\gamma=1,$ show that f is continuous at a
(b) If $\gamma>1,$ show that f is differentiable at a

### Prerequisites

Differentiability

Continuity

Limit

Sandwich Theorem

## Solution :

(a) We are given that $|f(x)-f(a)| \leq C|x-a|$ for some $C>0$.

We have to show that f is continuous at x=a. For this it's enough to show that $\lim_{x\to a} f(x)=f(a)$.

$|f(x)-f(a)| \leq C|x-a| \Rightarrow f(a)-C|x-a| \le f(x) \le f(a) + C|x-a|$

Now taking the limit $x \to a$ we have $\lim_{x\to a} (f(a)-C|x-a|) \le \lim_{x\to a} f(x) \le \lim_{x\to a} (f(a) + C|x-a|)$

Since $\lim_{x\to a} -C|x-a| = \lim_{x\to a} C|x-a|=0$, the Sandwich theorem gives $\lim_{x\to a} f(x) = f(a)$.

Hence f is continuous at x=a, as required.

(b) Here we have to show that f is differentiable at x=a; for this it's enough to show that $\lim_{x\to a} \frac{f(x)-f(a)}{x-a}$ exists.

We are given that , $|f(x)-f(a)| \leq C|x-a|^{\gamma}$ for some $\gamma>1$ and $C>0$ ,

which implies $|\frac{f(x)-f(a)}{x-a} | \le C|x-a|^{\gamma -1}$

$\Rightarrow -C|x-a|^{\gamma -1} \le \frac{f(x)-f(a)}{x-a} \le C|x-a|^{\gamma -1}$

Now taking $\lim_{x\to a}$ and applying the Sandwich theorem (since $\lim_{x\to a} C|x-a|^{\gamma -1} = \lim_{x\to a} -C|x-a|^{\gamma -1} = 0$ for $\gamma >1$), we get $\lim_{x\to a}\frac{f(x)-f(a)}{x-a} = 0$, i.e. $f'(a)=0$.

Hence f is differentiable at x=a, as required.
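Part (b) can be illustrated numerically with $f(x)=|x|^{3/2}$, an arbitrary example satisfying the hypothesis at $a=0$ with $C=1$ and $\gamma = 3/2 > 1$; the difference quotient at 0 should tend to $f'(0)=0$.

```python
# f(x) = |x|^(3/2) satisfies |f(x) - f(0)| <= |x - 0|^(3/2),
# so by part (b) we expect f'(0) = 0.
def f(x):
    return abs(x) ** 1.5

# difference quotient at a = 0 for shrinking h
for h in [1e-1, 1e-2, 1e-3, 1e-4]:
    print(h, (f(h) - f(0)) / h)   # tends to 0
```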

## Food For Thought

Let $f : \mathbb{R} \to \mathbb{R}$ be such that $|f(x)-f(y)| \le k|x-y|$ for some $k \in (0,1)$ and all $x,y \in \mathbb{R}$. Show that f must have a unique fixed point.