Posted by Klavot on January 20, 2007, at 16:18:58
In reply to Anyone good at number theory?, posted by linkadge on January 11, 2007, at 17:17:02
> If you are given that gcd(a,b) = 1, what is the value of gcd(a^2 + b^2, a+b)?
>
> I've been struggling for hours.
>
> Linkadge

Let p be a prime dividing both a^2 + b^2 and a+b. Then a^2 + b^2 = mp and a+b = np for some integers m, n. Since a^2 + b^2 = (a+b)^2 - 2ab, we have mp = n^2p^2 - 2ab, i.e. 2ab = (n^2p - m)p.
Thus p divides 2ab, so p = 2 or p divides a or p divides b.

Suppose p divides a. Since p divides a+b, p divides b as well, contradicting gcd(a,b) = 1. Hence p does not divide a, and likewise p does not divide b. Therefore p = 2.
Thus we have shown that if a prime p divides both a+b and a^2 + b^2 then p=2.
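As a quick sanity check (not part of the original argument), here is a brute-force Python sketch confirming that for coprime a, b the only prime that can divide both a+b and a^2 + b^2 is 2. The helper common_prime_divisors is my own illustrative function, not anything from the thread:

```python
from math import gcd

def common_prime_divisors(x, y):
    # Return the list of primes dividing both x and y,
    # i.e. the prime factors of gcd(x, y), by trial division.
    g = gcd(x, y)
    primes = []
    d = 2
    while d * d <= g:
        if g % d == 0:
            primes.append(d)
            while g % d == 0:
                g //= d
        d += 1
    if g > 1:
        primes.append(g)
    return primes

# For every coprime pair in a small range, the common prime
# divisors of a+b and a^2 + b^2 should be [] or [2].
for a in range(1, 40):
    for b in range(1, 40):
        if gcd(a, b) == 1:
            assert common_prime_divisors(a + b, a * a + b * b) in ([], [2])
print("checked: the only common prime divisor is 2")
```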
Regarding the possible parities of a and b, note that a and b cannot both be even, since gcd(a,b) = 1. That leaves two cases.

Case 1: a odd, b odd. Then a+b and a^2 + b^2 are both even, so 2 divides gcd(a^2 + b^2, a+b). By the above, 2 is the only prime that can divide this gcd, so the gcd is a power of 2. But odd squares are congruent to 1 mod 4, so a^2 + b^2 = 2 (mod 4), and hence 4 does not divide a^2 + b^2. Therefore gcd(a^2 + b^2, a+b) = 2.

Case 2: one of a, b odd and the other even (say a odd, b even). Then a+b and a^2 + b^2 are both odd. Since the only candidate prime divisor of the gcd is 2, it follows that gcd(a^2 + b^2, a+b) = 1.
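The two cases can also be verified numerically. This short Python sketch (my addition, just for confirmation) checks the claimed answer over all coprime pairs in a small range:

```python
from math import gcd

# Verify: for coprime (a, b), gcd(a^2 + b^2, a + b) is
# 2 when a and b are both odd, and 1 otherwise.
for a in range(1, 60):
    for b in range(1, 60):
        if gcd(a, b) != 1:
            continue
        g = gcd(a * a + b * b, a + b)
        if a % 2 == 1 and b % 2 == 1:   # Case 1: both odd
            assert g == 2, (a, b, g)
        else:                           # Case 2: exactly one even
            assert g == 1, (a, b, g)
print("verified both cases")
```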
Klavot
thread:721428
URL: http://www.dr-bob.org/babble/social/20070112/msgs/724537.html