
# number theory and inequalities


#22510
space time (Participant)

Let $a$ and $b$ be natural numbers, let $n > 2$, and let $d$ be a whole number with $0 \le d \le ab$. Prove that the equation

$(a^2 + b^2 - d)^n = (a^n + b^n)^2$

has no solutions in natural numbers.
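The claim can be sanity-checked by a brute-force search over small parameters before attempting a proof. The search bounds below are arbitrary illustrative choices, not part of the problem statement:

```python
# Brute-force check: look for (a, b, n, d) with n > 2 and 0 <= d <= ab
# satisfying (a^2 + b^2 - d)^n == (a^n + b^n)^2.
# The bounds max_ab and max_n are illustrative choices only.
def has_solution(max_ab=20, max_n=8):
    for a in range(1, max_ab + 1):
        for b in range(1, max_ab + 1):
            for n in range(3, max_n + 1):  # n > 2
                rhs = (a**n + b**n) ** 2
                for d in range(0, a * b + 1):  # 0 <= d <= ab
                    if (a**2 + b**2 - d) ** n == rhs:
                        return (a, b, n, d)
    return None

print(has_solution())  # expect None: no counterexample in this range
```

Finding no counterexample in a finite range proves nothing, of course, but it is consistent with the statement.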

#23205
Cheenta Support (Participant)

If $n$ is odd (so $n \ge 3$), write $m = a^2 + b^2 - d$. Since $m^n = (a^n + b^n)^2$ and $\gcd(n, 2) = 1$, every prime exponent in $m$ must be even, so $m$ is a perfect square, say $m = t^2$, and then $a^n + b^n = t^n$. By Fermat's Last Theorem, no positive integer solutions exist.

If $n$ is even, write $n = 2k$. Taking positive square roots of both sides gives $(a^2 + b^2 - d)^k = a^{2k} + b^{2k}$. Setting $x = a^2$ and $y = b^2$, the equation becomes $(x + y - d)^k = x^k + y^k$. For $k \ge 3$, Fermat's Last Theorem again rules out positive integer solutions.

It remains to check $k = 1$ and $k = 2$.

For $k = 1$ (that is, $n = 2$): the equation becomes $x + y - d = x + y$, which forces $d = 0$, and then every pair $a, b$ is a solution. This is exactly why the problem requires $n > 2$; alternatively, if the range of $d$ excludes $0$, there are no solutions here.

For $k = 2$ (that is, $n = 4$): the equation becomes $(x + y - d)^2 = x^2 + y^2$, i.e. $a^4 + b^4 = (a^2 + b^2 - d)^2$, so $a^4 + b^4$ would have to be a perfect square. By Fermat's classical infinite-descent result, $u^4 + v^4 = w^2$ has no positive integer solutions, so this case is impossible as well.
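The $n = 4$ case hinges on $a^4 + b^4$ never being a perfect square for positive integers $a, b$ (Fermat's descent result for $u^4 + v^4 = w^2$). A small numerical check, with an arbitrary illustrative bound, agrees:

```python
from math import isqrt

# Search for positive (a, b) with a^4 + b^4 a perfect square.
# By Fermat's descent argument there should be none; the bound
# is an arbitrary illustrative choice.
def fourth_power_square_solutions(bound=200):
    hits = []
    for a in range(1, bound + 1):
        for b in range(1, bound + 1):
            s = a**4 + b**4
            r = isqrt(s)
            if r * r == s:
                hits.append((a, b))
    return hits

print(fourth_power_square_solutions())  # expect []: no hits in this range
```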

This solution was provided by our teaching assistant, Tarit Goswami.
