How do you prove the factor theorem?
I'm assuming you mean the following:
"Let f(x) be a polynomial with coefficients in the real numbers (or complex numbers or any field for that matter), then x-a divides f(x) iff a is a root of f(x)."
This is a specific instance of a more general fact:
"The remainder when f(x) is divided by x-a is f(a)."
This requires knowledge of the division algorithm for polynomials:
"Let f(x), g(x) be polynomials with coefficients in the real numbers. There exist unique polynomials q(x), r(x) with deg r(x) < deg g(x), f(x) = g(x)q(x)+r(x)."
To prove the general fact from this, we let g(x) = x-a. Then applying the division algorithm, there are polynomials q(x), r(x) with deg r(x) < deg(x-a) and f(x) = (x-a)q(x) + r(x). Since deg(x-a) = 1, r(x) must be a constant polynomial, say r(x) = b. Thus f(x) = (x-a)q(x) + b. Plugging in x = a, we get f(a) = (a-a)q(a) + b = b, so r(x) = f(a). We then see f(a) = 0 iff the remainder is 0 iff x-a divides f(x).
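To see the remainder fact in action, here's a short Python sketch (my own illustration, not from the thread). It divides f(x) by x-a via synthetic division, which is just Horner's scheme; the value left over at the end is both the remainder and f(a).

```python
def divmod_linear(coeffs, a):
    """Divide f(x) by (x - a) via synthetic division.
    coeffs lists f's coefficients from highest to lowest degree.
    Returns (quotient coefficients, remainder)."""
    running = []
    r = 0
    for c in coeffs:
        r = r * a + c      # Horner step: fold in the next coefficient
        running.append(r)
    remainder = running.pop()  # last value is the remainder, i.e. f(a)
    return running, remainder

# f(x) = x^3 - 6x^2 + 11x - 6 = (x-1)(x-2)(x-3)
f = [1, -6, 11, -6]
q, r = divmod_linear(f, 2)   # remainder is f(2) = 0, so x-2 divides f(x)
```

Here q comes out as [1, -4, 3], i.e. x<sup>2</sup> - 4x + 3 = (x-1)(x-3), consistent with the factorization.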
The polynomial division algorithm isn't too hard to prove--you can do it pretty easily by induction, say. I can run through this if you'd like.
Yes, please.
Yes, please.
So let f(x) have degree n and g(x) have degree m, where g(x) is not the zero polynomial. If n < m (or if f(x) is the zero polynomial), then we let q(x) = 0 and r(x) = f(x). Otherwise, n ≥ m. Suppose that
f(x) = a<sub>n</sub>x<sup>n</sup> + a<sub>n-1</sub>x<sup>n-1</sup> + … + a<sub>1</sub>x + a<sub>0</sub>
and
g(x) = b<sub>m</sub>x<sup>m</sup> + b<sub>m-1</sub>x<sup>m-1</sup> + … + b<sub>1</sub>x + b<sub>0</sub>
Now note that, whatever q(x) is, we must have deg(q(x)) = n-m. This is because whatever r(x) is, it's supposed to have degree less than m, and hence less than n. The sum of two polynomials has degree at most the maximum of the degrees of the summands, so for f(x) = g(x)q(x) + r(x) to have degree n, g(x)q(x) must have degree n. Since deg(g(x)q(x)) = deg(g(x)) + deg(q(x)) = m + deg(q(x)), this forces deg(q(x)) = n-m. So we'll solve for the coefficients of q(x). q(x) has the form
q(x) = c<sub>n-m</sub>x<sup>n-m</sup> + c<sub>n-m-1</sub>x<sup>n-m-1</sup> + … + c<sub>1</sub>x + c<sub>0</sub>
Multiplying g(x) and q(x), we obtain
g(x)q(x) = d<sub>n</sub>x<sup>n</sup> + d<sub>n-1</sub>x<sup>n-1</sup> + … + d<sub>1</sub>x + d<sub>0</sub>
where
d<sub>k</sub> = Σ<sub>i+j=k</sub> b<sub>i</sub>c<sub>j</sub>
Here, I use the convention that b<sub>i</sub> = 0 unless 0 ≤ i ≤ m and c<sub>j</sub> = 0 unless 0 ≤ j ≤ n-m. Since we want r(x) = f(x) - q(x)g(x) to have degree less than m, we want f(x) and q(x)g(x) to agree on all powers x<sup>k</sup> with m ≤ k ≤ n, i.e. we want a<sub>k</sub> = d<sub>k</sub> for m ≤ k ≤ n. Thus we have n-m+1 unknowns and n-m+1 equations
a<sub>n</sub> = b<sub>m</sub>c<sub>n-m</sub>
a<sub>n-1</sub> = b<sub>m</sub>c<sub>n-m-1</sub> + b<sub>m-1</sub>c<sub>n-m</sub>
⋮
a<sub>m</sub> = b<sub>m</sub>c<sub>0</sub> + b<sub>m-1</sub>c<sub>1</sub> + … + b<sub>2m-n</sub>c<sub>n-m</sub>
(Note that we may have n-m < m, and so any c<sub>k</sub> with k > n-m is 0.) If you know a thing or two about linear algebra, you'll see that you can set this up as a linear equation Bc = a, where B is a square, lower-triangular matrix with entries b<sub>m</sub> going down the diagonal. Since deg(g(x)) = m, we have b<sub>m</sub> ≠ 0, and so this system has a unique solution.
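To make the linear-algebra route concrete, here is a small hypothetical instance set up with NumPy (the polynomials and variable names are my own, not from the thread): divide f(x) = 2x<sup>3</sup> + 3x<sup>2</sup> - x + 5 by g(x) = x<sup>2</sup> + x + 1, so n = 3, m = 2, and the unknowns are c<sub>1</sub>, c<sub>0</sub>.

```python
import numpy as np

# Equations a_k = sum_{i+j=k} b_i c_j for k = n, ..., m give Bc = a,
# with B lower-triangular and b_m = 1 down the diagonal.
a = np.array([2.0, 3.0])       # a_3, a_2
B = np.array([[1.0, 0.0],      # a_3 = b_2 c_1
              [1.0, 1.0]])     # a_2 = b_1 c_1 + b_2 c_0
c = np.linalg.solve(B, a)      # c = [c_1, c_0] = [2, 1], so q(x) = 2x + 1
```

Indeed (x<sup>2</sup> + x + 1)(2x + 1) = 2x<sup>3</sup> + 3x<sup>2</sup> + 3x + 1, and subtracting this from f(x) leaves r(x) = -4x + 4, which has degree 1 < m = 2, as required.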
Even if you don't know linear algebra, you should know simultaneous equations, and this is enough to see what to do: use the first equation to solve for c<sub>n-m</sub>, and then use this new information in the next equation to solve for c<sub>n-m-1</sub>, as the only other unknown in this equation is c<sub>n-m</sub>, and we figured this out already. Now use this new information in the third equation, substituting in the values for the two coefficients we already know to solve for c<sub>n-m-2</sub>. Continue in this fashion--each successive equation involves one new unknown, the rest already having been solved for, and you can always solve for the new unknown, as it's always multiplied by something nonzero--namely, b<sub>m</sub>.
Since we can solve this system uniquely, a unique q(x) exists such that f(x) - g(x)q(x) has degree less than m. Now let r(x) = f(x) - g(x)q(x).
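The successive-substitution procedure above is exactly polynomial long division. A minimal Python sketch (the function name is my own, not from the thread), with coefficient lists ordered from highest degree down:

```python
def poly_divmod(f, g):
    """Divide f by g (coefficient lists, highest degree first).
    Returns (q, r) with f = g*q + r and deg r < deg g, solving for
    q's coefficients top-down as in the back-substitution argument."""
    assert g and g[0] != 0          # leading coefficient b_m is nonzero
    f = list(f)
    n, m = len(f) - 1, len(g) - 1
    if n < m:
        return [0], f
    q = [0.0] * (n - m + 1)
    for k in range(n - m + 1):
        q[k] = f[k] / g[0]          # solve for the next coefficient of q
        for i in range(m + 1):
            f[k + i] -= q[k] * g[i] # subtract off that multiple of g(x)
    return q, f[n - m + 1:]         # what's left has degree < m

# f(x) = x^3 - 6x^2 + 11x - 6 divided by g(x) = x^2 - 3x + 2
q, r = poly_divmod([1, -6, 11, -6], [1, -3, 2])  # q(x) = x - 3, r(x) = 0
```

Each pass of the outer loop is one step of the substitution: the division by g[0] is always legal precisely because b<sub>m</sub> ≠ 0.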
Uniqueness is guaranteed by the linear-algebraic considerations above, but we can also show it directly: suppose (q(x), r(x)) and (q'(x), r'(x)) are two solutions to the division algorithm. Then
q(x)g(x) + r(x) = q'(x)g(x) + r'(x)
Thus
(q(x) - q'(x))g(x) = r'(x) - r(x)
Now r'(x) - r(x) has degree less than the degree of g(x), but this equation shows that g(x) divides it. Thus r'(x) - r(x) must be 0, i.e. r'(x) = r(x), and since g(x) ≠ 0, this immediately implies q(x) = q'(x).
Thanks, sorry I made a double post.
No biggie. ;)
----------------
PS: This is my 1000<sup><u>th</u></sup> post!