I am trying to read up on linear algebra in order to decide whether to study it.
Could anyone tell me in simple words what it is all about and what its main uses are?
Thanks
Linear algebra is usually taken after the first two courses in calculus (differentiation, integration, infinite series, sequences, parametric and polar equations) and before or alongside multivariable calculus (differentiation and integration with more than one variable). That said, linear algebra is quite unlike calculus: calculus deals with curves and curved regions, whereas linear algebra is the study of straight lines and planes. At the school I went to, it was primarily about solving systems of equations using matrices.

There is some really counterintuitive material, like the abstract description of linear (vector) spaces, that takes some getting used to at first. It isn't really easy. I was a math major at the time, and I found it the hardest of all the classes I took (all the way through differential equations) because it was unlike any of the others: more proof-based. However, it is a bit of a harbinger of things to come if you continue in math.

It's used for all sorts of things, because it is one of the oldest and most powerful approaches in mathematics. Its applications often arise when it is coupled with calculus to solve systems.
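To make the "solving systems of equations with matrices" part concrete, here is a minimal sketch in Python with numpy; the library and the particular system are illustrative choices of mine, not something from the post above:

```python
import numpy as np

# Solve the system  2x + y = 5,  x - 3y = -1  by writing it
# as a single matrix equation  A x = b.
A = np.array([[2.0,  1.0],
              [1.0, -3.0]])
b = np.array([5.0, -1.0])

x = np.linalg.solve(A, b)   # uses an LU factorization internally
print(x)                    # [2. 1.]  ->  x = 2, y = 1
```

The point is that the whole system becomes one object (the matrix), which is what lets linear algebra reason about all such systems at once.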
Thanks for the excellent explanation. If its main purpose is to solve systems of linear equations, in what way is it more efficient than the various classical methods (Gaussian elimination, etc.)?
One more question : is it also related to eigenvalues?
The Gaussian method is a specific application of the much more general study of linear algebra; the latter gives a rigorous rationale for all the specific methods you might have encountered previously. Linear algebra doesn't just deal with linear equations, but more generally with vector spaces and, more importantly, with mappings between them. As you probably know, vector spaces are crucial to a later study of differential geometry.
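For the curious, here is a toy sketch of Gaussian elimination itself, just to show what that "specific method" looks like as an algorithm. It is a teaching example of mine (numpy, partial pivoting, no handling of singular matrices), not code from the thread:

```python
import numpy as np

def gaussian_elimination(A, b):
    """Solve A x = b by forward elimination and back-substitution."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n - 1):
        # Partial pivoting: bring the largest remaining pivot to row k.
        p = k + int(np.argmax(np.abs(A[k:, k])))
        A[[k, p]] = A[[p, k]]
        b[[k, p]] = b[[p, k]]
        # Eliminate the entries below the pivot.
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # Back-substitution on the resulting upper-triangular system.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[ 2.0,  1.0, -1.0],
              [-3.0, -1.0,  2.0],
              [-2.0,  1.0,  2.0]])
b = np.array([8.0, -11.0, -3.0])
print(gaussian_elimination(A, b))   # [ 2.  3. -1.]
print(np.linalg.solve(A, b))        # same answer, for comparison
```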
Yes, indeed. One more question : is it also related to eigenvalues?
Yes, it is. Take some arbitrary ( square ) matrix, and a vector of the same dimension. Now multiply the vector by the matrix; this yields another vector. If that new vector has the same or the exact opposite direction as the original one, then it is an eigenvector of the matrix, and the ratio of their lengths is the corresponding eigenvalue. Nothing really to do with perspectives, though; it is simply an operation which associates with a matrix a set of scalars, which are then called the matrix's eigenvalues.
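A quick numerical illustration of that definition, assuming numpy is available; the matrix here is an arbitrary choice of mine:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)                  # [3. 1.] for this matrix

lam = eigenvalues[0]
v = eigenvectors[:, 0]              # eigenvector belonging to lam
print(np.allclose(A @ v, lam * v))  # True: A v points along v, scaled by lam
```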
You can use a 4x4 matrix to calculate the perspective transformation of a set of points. (I don't know if this has anything to do with linear algebra, but I mention it because logic is one of those who seem to latch onto a single word and then extrapolate like crazy.)
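A minimal sketch of that idea, assuming homogeneous coordinates: project a point through the origin onto the plane z = d. The matrix P and the value of d are illustrative choices of mine, not something quoted from the thread:

```python
import numpy as np

d = 2.0
P = np.array([[1.0, 0.0, 0.0,     0.0],
              [0.0, 1.0, 0.0,     0.0],
              [0.0, 0.0, 1.0,     0.0],
              [0.0, 0.0, 1.0 / d, 0.0]])

point = np.array([3.0, 4.0, 6.0, 1.0])  # (x, y, z) = (3, 4, 6) in homogeneous form
h = P @ point                           # -> [3. 4. 6. 3.]
projected = h / h[3]                    # perspective divide by w
print(projected[:3])                    # [1.  1.333...  2.]  on the plane z = 2
```

This does connect to linear algebra: the 4x4 matrix multiplication is a linear map, and the final divide by w is the one non-linear step, which is exactly why graphics works in homogeneous coordinates.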
A matrix represents a linear transformation of some kind ( e.g. rotation, scaling etc etc ). The determinant of that matrix is the factor by which the original object's area or volume gets scaled in that process.
Take for example a parallelogram spanned by two vectors, which represents a well defined area. Now apply some transformation to it, represented by a matrix ( e.g. you could rotate it, stretch it, etc ); the resulting new parallelogram will in general have a different area. The two areas, before and after the transformation, are related by the determinant of the transformation matrix, which acts as a multiplicative factor. This becomes very important later on when considering coordinate transformations under integrals, i.e. the generalized volume element.
Now, this is a very simplistic way to think of it, but it does convey the general idea.
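Here is a small numerical check of that scaling picture, assuming numpy; the transformation below is a made-up example (a shear plus scaling), not a matrix taken from the post above:

```python
import numpy as np

def area(u, v):
    # Area of the parallelogram spanned by 2D vectors u and v.
    return abs(u[0] * v[1] - u[1] * v[0])

M = np.array([[3.0, 1.0],
              [0.0, 2.0]])

u = np.array([1.0, 0.0])     # the unit square, spanned by u and v
v = np.array([0.0, 1.0])

print(area(u, v))            # 1.0  before the transformation
print(area(M @ u, M @ v))    # 6.0  after the transformation
print(np.linalg.det(M))      # 6.0  -- the determinant is that factor
```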
Actually, come to think of it, there is an even simpler visualisation : if you consider the columns of a matrix as vectors, then the determinant of the matrix is the area / volume spanned by those vectors. It is that simple. Consider for example :

$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$$

The determinant det(A) is then the area of the parallelogram spanned by the vectors $\begin{pmatrix} a \\ c \end{pmatrix}$ and $\begin{pmatrix} b \\ d \end{pmatrix}$, which is

$$\det(A) = ad - bc$$
You will find this example and others, along with some good drawings, on the Wiki page for "Determinant" :
Determinant - Wikipedia, the free encyclopedia
That is great, Markus, thank you!
One more question, if you please: what is the meaning of a 4 x 4 determinant?
No problem. The meaning of higher-dimensional determinants follows the same principle :
2x2 - Spanned by 2 vectors - area ( parallelogram )
3x3 - Spanned by 3 vectors - volume ( parallelepiped )
4x4 - Spanned by 4 vectors - 4D hypervolume
5x5 - Spanned by 5 vectors - 5D hypervolume
...

And so on. I think you get the idea!
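The same column-vector picture can be checked numerically in any dimension; the diagonal matrices below are arbitrary examples of mine, chosen so the volumes are obvious:

```python
import numpy as np

A3 = np.diag([2.0, 3.0, 4.0])        # a 2 x 3 x 4 box
print(np.linalg.det(A3))             # 24.0 -> its volume

A4 = np.diag([2.0, 3.0, 4.0, 5.0])   # four orthogonal 4D vectors
print(np.linalg.det(A4))             # 120.0 -> a 4D hypervolume
```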
I am not familiar with the term linear algebra.
I just want to know whether differentiation and integration are exactly inverse processes with respect to any algebraic expression in a finite function set.
Linear algebra deals with vector spaces and linear mappings; as such much of it is concerned with matrices, vectors, and the corresponding eigenvalues :
Linear algebra - Wikipedia, the free encyclopedia