A mind-blowing problem! This is my favourite application of the binary-search idea and the nested intervals theorem. The same technique works in many other real analysis problems, so follow it carefully:
If c=a and d=b already satisfy the conditions, we are done. Otherwise there is an x in (a,b) such that f(x) does not lie between f(a) and f(b). Two cases arise:
(i) f(x)>f(b). Then f(a)<f(b)<f(x), so by the intermediate value theorem there is a b1 in (a,x) such that f(b1)=f(b). In this case, let a1=a.
(ii) f(x)<f(a). Then f(x)<f(a)<f(b), so again by the intermediate value theorem there is an a1 in (x,b) such that f(a1)=f(a). In this case, let b1=b.
Repeat this process on [a1,b1] to get a2 and b2, and so on, until you reach an interval [a(n),b(n)] for some n such that f(a(n))=f(a), f(b(n))=f(b), and f(x) lies in (f(a),f(b)) for all x in (a(n),b(n)). Letting c=a(n) and d=b(n), we are done.
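Here is a rough numerical sketch of this shrinking process in Python (all names are mine; a grid scan finds a violating point, and bisection stands in for the appeals to the intermediate value theorem — an illustration, not a proof):

```python
import math

def shrink_interval(f, a, b, tol=1e-9, grid=10001):
    """Mimic the shrinking process numerically: scan (c, d) for a sample
    point where f leaves [f(a), f(b)] and cut it off by bisection."""
    fa, fb = f(a), f(b)            # f(a) < f(b); these values stay fixed
    c, d = a, b
    while True:
        # look for a violating point x in the open interval (c, d)
        xs = (c + (d - c) * i / (grid - 1) for i in range(1, grid - 1))
        x = next((x for x in xs if f(x) > fb or f(x) < fa), None)
        if x is None:
            return c, d            # no violation at this resolution
        if f(x) > fb:
            # case (i): f(c) < fb < f(x); bisect for b1 in (c, x) with f(b1) = fb
            lo, hi = c, x
            while hi - lo > tol:
                mid = (lo + hi) / 2
                lo, hi = (mid, hi) if f(mid) < fb else (lo, mid)
            d = lo                 # new right endpoint, f(d) ~ f(b)
        else:
            # case (ii): f(x) < fa < f(d); bisect for a1 in (x, d) with f(a1) = fa
            lo, hi = x, d
            while hi - lo > tol:
                mid = (lo + hi) / 2
                lo, hi = (lo, mid) if f(mid) > fa else (mid, hi)
            c = hi                 # new left endpoint, f(c) ~ f(a)

# a non-monotone example with f(0) < f(1.2): the first hump overshoots f(1.2)
f = lambda x: math.sin(3 * x) + x
c, d = shrink_interval(f, 0.0, 1.2)
```

On this example the first pass cuts the interval down past the hump, after which every value of f on (c,d) lies between f(a) and f(b), up to the grid resolution and tolerance.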
Suppose the algorithm never stops, that is, you never arrive at an interval [a(n),b(n)] satisfying the given conditions. Then by the nested intervals theorem there is a t in [a,b] such that a(n) → t and b(n) → t as n → infinity. By continuity, f(a(n)) → f(t) and f(b(n)) → f(t). But f(a(n)) and f(b(n)) are the constant sequences f(a) and f(b), so their limits are f(a) and f(b) respectively. Hence f(t)=f(a)=f(b), contradicting f(a)<f(b).
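In symbols, the limit step (granting, as above, that both endpoint sequences converge to the same t) is just:

```latex
f(t) = \lim_{n\to\infty} f(a_n) = \lim_{n\to\infty} f(a) = f(a),
\qquad
f(t) = \lim_{n\to\infty} f(b_n) = \lim_{n\to\infty} f(b) = f(b),
```

which gives f(t)=f(a)=f(b), contradicting f(a)<f(b).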
Still an amateur at analysis, so please point out any errors. Cheers!