Thread: Systems of Linear Equations and Linear Algebra

1. I'm sure most posters on this forum can solve systems of linear equations in multiple variables using the classical "substitution" method. Are there methods deeper within linear algebra that allow for deriving a solution faster than the tedious "solving and plugging in" technique?

For members who aren't too familiar with linear equations, they are simply polynomial equations in which every variable appears with an exponent of at most one. The substitution method I described above works like this (using a rather elementary linear equation; there are much more difficult ones):

4 + 3x = 16
-4        -4
------------
3x = 12
/3      /3
------------
x = 4

{4}
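To make the "solving and plugging in" idea concrete for more than one variable, here is a quick sketch in Python (my own example numbers, not from the thread) of substitution on a two-equation system:

```python
# Solve by substitution (illustrative system, not from the thread):
#   x + 2y = 10
#   3x - y = 2
# From the first equation: x = 10 - 2y.
# Substitute into the second: 3(10 - 2y) - y = 2  ->  30 - 7y = 2  ->  y = 4.
y = (3 * 10 - 2) / (3 * 2 + 1)   # y = 28 / 7 = 4
x = 10 - 2 * y                   # x = 10 - 8 = 2
print(x, y)                      # 2.0 4.0
```

The point of the original question is that this by-hand bookkeeping gets tedious quickly as the number of variables grows.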

3. Of course Gaussian elimination is pretty much "Linear Algebra 101"...

http://en.wikipedia.org/wiki/Gaussian_elimination
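For anyone curious what the algorithm looks like in practice, here is a minimal sketch in plain Python (with partial pivoting for numerical safety; example numbers are my own):

```python
# A minimal Gaussian elimination with partial pivoting (illustrative sketch).
def gauss_solve(A, b):
    n = len(A)
    # Build the augmented matrix [A | b] so row operations hit both sides.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # Partial pivoting: swap in the row with the largest entry in this column.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate this column from every row below the pivot row.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back substitution on the resulting upper-triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

# Example: the system  x + 2y = 10,  3x - y = 2  has solution x = 2, y = 4.
print(gauss_solve([[1, 2], [3, -1]], [10, 2]))  # [2.0, 4.0]
```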

I need to go to work so I can't discuss the topic further at the moment...

4. Originally Posted by Ellatha
I'm sure most posters on this forum can solve systems of linear equations in multiple variables using the classical "substitution" method. Are there methods deeper within linear algebra that allow for deriving a solution faster than the tedious "solving and plugging in" technique?

For members who aren't too familiar with linear equations, they are simply polynomial equations in which every variable appears with an exponent of at most one. The substitution method I described above works like this (using a rather elementary linear equation; there are much more difficult ones):

4 + 3x = 16
-4        -4
------------
3x = 12
/3      /3
------------
x = 4

{4}

Any way you cut it, assuming that you are working with a system that actually has a unique solution, the problem comes down to inverting a matrix.

A system of linear equations is equivalent to a single vector equation Ax = y, where A is the matrix of coefficients, x is the vector of "unknowns" and y is the vector of "givens".
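In practice the Ax = y form is exactly what numerical libraries work with. A quick NumPy sketch (example numbers are my own, not from the thread):

```python
import numpy as np

# The system  x + 2y = 10,  3x - y = 2  written as the single equation A x = y:
A = np.array([[1.0, 2.0],
              [3.0, -1.0]])
y = np.array([10.0, 2.0])

# np.linalg.solve factors A rather than forming the inverse explicitly,
# which is both faster and numerically safer than computing inv(A) @ y.
x = np.linalg.solve(A, y)
print(x)  # [2. 4.]
```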

There are techniques for finding the inverse matrix. General techniques are not that different from Gaussian elimination in terms of the computational steps. There are some special techniques for cases in which A has special properties, for instance if it has lots of 0's (sparse matrices). You might try googling "sparse matrix techniques".

In general, computing matrix inverses is a pain in the neck.

There are general ways to compute inverses involving calculating a bunch of determinants (the classical adjoint divided by the determinant), but that is also very tedious and not very efficient from a computational perspective.
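For the 2x2 case the classical-adjoint formula can be written out directly; here is a sketch (example numbers are mine) that also hints at why it scales badly:

```python
# Inverse of a 2x2 matrix via the classical adjoint divided by the determinant:
#   inv([[a, b], [c, d]]) = (1/det) * [[d, -b], [-c, a]],  where det = a*d - b*c
def inverse_2x2(a, b, c, d):
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular; no unique solution")
    return [[d / det, -b / det],
            [-c / det, a / det]]

# Applying it to A = [[1, 2], [3, -1]] and y = (10, 2) recovers x = 2, y = 4.
# Note that for an n x n matrix, each entry of the adjoint is itself an
# (n-1) x (n-1) determinant, which is why this approach is so inefficient.
inv = inverse_2x2(1, 2, 3, -1)
x = inv[0][0] * 10 + inv[0][1] * 2
y = inv[1][0] * 10 + inv[1][1] * 2
print(x, y)  # 2.0 4.0
```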

5. Thanks for that Dr. Rocket. Do you think learning Gaussian elimination is worth the effort in that case, or should I stick to solving and plugging in?

6. For a system with three or more unknowns, I'm thinking that Gaussian elimination would be way easier. It's really not hard, so you should go ahead and learn it. Always good to see new things.

7. Originally Posted by Ellatha
Thanks for that Dr. Rocket. Do you think learning Gaussian elimination is worth the effort in that case, or should I stick to solving and plugging in?
I think you ought to learn some linear algebra. Solving systems of equations, per se, is really boring. Linear algebra has a lot more to offer than just Gaussian elimination.
