Thread: Determinant of a matrix

  #1 Determinant of a matrix, by jameswell (Forum Freshman)
    Hi there,
    I'd like to know if anyone can "prove" (or explain) why the following assertion is true :

    det(matrix) = 0 <==> rows or columns are linearly independent

    Cheers


  #2 wallaby (Forum Professor, Australia)
    Quote Originally Posted by jameswell View Post
    Hi there,
    I'd like to know if anyone can "prove" (or explain) why the following assertion is true :

    det(matrix) = 0 <==> rows or columns are linearly independent

    Cheers
    Actually that assertion is false: the determinant of a matrix is zero if, and only if, the columns are linearly DEPENDENT. You should try proving it for yourself first, or show us what you've done if you already have tried. But I'll give you a hint: it has to do with the reduced row echelon form of the matrix.


  #3 jameswell (Forum Freshman)
    Yes of course, my mistake: I forgot the slash after the "equals" sign to mean "not equal to".
    Anyway, I've tried doing it with a 3 by 3 matrix as follows:

    a b c
    d e f
    g h i

    But I can't transform it to its row echelon form, because I would have to multiply a row by one of the entries.

    For example, I would multiply row 1 by d so the matrix would become:

    ad bd cd
    d e f
    g h i

    and then I would multiply row 2 by a to get:

    ad bd cd
    ad ae af
    g h i


    After this, I would subtract row 1 from row 2:

    ad bd cd
    0 ae-bd af-cd
    g h i

    The problem is that all this is only allowed if I'm sure that a and d aren't equal to 0; otherwise the "new" matrices have different determinants.

    I've also tried using simultaneous equations to "find out" what conditions are required so that none of the rows or columns are linear combinations of the others, and I had exactly the same problem.

    So I can't think of anything else that generalises the determinant to any square matrix.
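    For what it's worth, the case analysis can be avoided by tracking what each operation does to the determinant: scaling a row multiplies the determinant by the scale factor, and subtracting one row from another leaves it unchanged, so the manipulation above changes the determinant only by the known factor a*d. A quick symbolic check of this (a sketch, assuming SymPy is available):

        import sympy as sp

        a, b, c, d, e, f, g, h, i = sp.symbols('a b c d e f g h i')
        A = sp.Matrix([[a, b, c], [d, e, f], [g, h, i]])

        B = A.copy()
        B[0, :] = d * B[0, :]        # multiply row 1 by d: det scales by d
        B[1, :] = a * B[1, :]        # multiply row 2 by a: det scales by a
        B[1, :] = B[1, :] - B[0, :]  # subtract row 1 from row 2: det unchanged

        # det(B) equals a*d*det(A) identically, with no case analysis needed
        print(sp.simplify(B.det() - a * d * A.det()))  # -> 0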
  #4 Markus Hanke (Moderator, Ireland)
    Quote Originally Posted by jameswell View Post
    Hi there,
    I'd like to know if anyone can "prove" (or explain) why the following assertion is true :

    det(matrix) = 0 <==> rows or columns are linearly independent

    Cheers
    I agree with wallaby in letting you work out the formal proof yourself.
    As for an explanation of why the determinant is zero if the columns are dependent, remember the geometrical visualization of what a determinant is: its absolute value is simply the area/volume spanned by the column vectors. If those vectors are linearly dependent, then that area/volume is obviously zero, because the vectors all lie along the same line (or in the same plane).
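    A tiny numerical illustration of this (a sketch, assuming NumPy is available): when one column is a multiple of the other, the parallelogram they span collapses and the determinant vanishes.

        import numpy as np

        A = np.array([[1.0, 2.0],
                      [3.0, 6.0]])  # second column = 2 * first column

        print(np.linalg.det(A))     # -> 0.0 (up to floating-point noise)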
  #5 jameswell (Forum Freshman)
    Believe me, I've tried everything I can think of to work it out for myself; that's how I would have wanted to find it out.
    The geometrical visualization is a good point, and I had realized that, except that it can only be visualized in two or three dimensions.
    And I'm sure there must be a more algebraic way.
    Do you actually know it? Or are you kind of guessing it can be worked out? Maybe what I did in my previous post is the only way, and therefore the problem needs to be dealt with in all the different possible cases, namely "a = 0" and "a ≠ 0", etc.
  #6 wallaby (Forum Professor, Australia)
    In theory the method you presented in your post above, which it seems was to actually perform the row reduction of a general 3x3 matrix, could work (but there would be some complicated algebra involved). An easier method would be to recall, if you've learnt this before, that the row reduced matrix U can be obtained from some matrix A by a product of elementary matrices. These elementary matrices are defined so that multiplying the matrix A by one of them is the equivalent of carrying out a row operation (swapping, scaling, subtraction). So if the kth row operation performed on the matrix A to yield the row reduced form is denoted by $E_k$, then we may represent the matrix U by

    $U = E_k \cdots E_2 E_1 A$.

    You can take my word for it or find more information here, but the determinants of these elementary matrices are non-zero (proving this would probably be a good exercise to run through). So now, using the properties of determinants and the equation I provided above, you should be able to fill in the blanks; just remember what the matrix U will look like if the columns are linearly independent and what this means for the determinant.
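    A minimal numerical sketch of this idea (assuming NumPy is available; the matrices below are illustrative, not from the post): each kind of row operation is left-multiplication by an elementary matrix, and each such matrix has a non-zero determinant, so row reduction can only change det(A) by non-zero factors.

        import numpy as np

        I = np.eye(3)

        E_swap = I[[1, 0, 2], :]            # swap rows 1 and 2:   det = -1
        E_scale = np.diag([1.0, 5.0, 1.0])  # scale row 2 by 5:    det =  5
        E_sub = I.copy()
        E_sub[2, 0] = -2.0                  # row 3 -= 2 * row 1:  det =  1

        for E in (E_swap, E_scale, E_sub):
            print(np.linalg.det(E))         # -1.0, 5.0, 1.0, all non-zero

        A = np.array([[2.0, 1.0, 0.0],
                      [1.0, 3.0, 1.0],
                      [0.0, 1.0, 4.0]])
        U = E_sub @ E_scale @ E_swap @ A    # A after the three row operations
        print(np.isclose(np.linalg.det(U),
                         -1.0 * 5.0 * 1.0 * np.linalg.det(A)))  # -> True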
  #7 jameswell (Forum Freshman)
    Quote Originally Posted by wallaby View Post
    In theory the method you presented in your post above, which it seems was to actually perform the row reduction of a general 3x3 matrix, could work (but there would be some complicated algebra involved). An easier method would be to recall, if you've learnt this before, that the row reduced matrix U can be obtained from some matrix A by a product of elementary matrices. These elementary matrices are defined so that multiplying the matrix A by one of them is the equivalent of carrying out a row operation (swapping, scaling, subtraction). So if the kth row operation performed on the matrix A to yield the row reduced form is denoted by $E_k$, then we may represent the matrix U by

    $U = E_k \cdots E_2 E_1 A$.

    You can take my word for it or find more information here, but the determinants of these elementary matrices are non-zero (proving this would probably be a good exercise to run through). So now, using the properties of determinants and the equation I provided above, you should be able to fill in the blanks; just remember what the matrix U will look like if the columns are linearly independent and what this means for the determinant.
    If this has anything to do with linear maps or basis changes, it'll take me a bit of thinking before I can understand it, because it's still a bit new to me, but I'll try to work it out!
    Thanks
  #8 Markus Hanke (Moderator, Ireland)
    Quote Originally Posted by jameswell View Post
    Believe me, I've tried everything I can think of to work it out for myself; that's how I would have wanted to find it out.
    The geometrical visualization is a good point, and I had realized that, except that it can only be visualized in two or three dimensions.
    And I'm sure there must be a more algebraic way.
    Do you actually know it? Or are you kind of guessing it can be worked out? Maybe what I did in my previous post is the only way, and therefore the problem needs to be dealt with in all the different possible cases, namely "a = 0" and "a ≠ 0", etc.
    Well, I would do it simply this way:

    1. Proposition: the determinant is zero if the column vectors are linearly dependent.
    2. If two column vectors A and B are linearly dependent, then their cross product A x B is zero.
    3. If the cross product A x B is zero, then so is the determinant formed by taking A and B as column vectors of a sub-matrix
    4. Repeat (2) and (3) for each combination of column vectors in the matrix
    5. If all sub-determinants above are zero, then so is the total determinant of the original matrix.

    Not sure if this is mathematically rigorous (I am not a mathematician), but that would be my approach to proving this.
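    A quick numerical check of steps (2) and (3) (a sketch, assuming NumPy is available): the components of the cross product are, up to sign, exactly the 2x2 sub-determinants formed from the two columns.

        import numpy as np

        A = np.array([1.0, 2.0, 3.0])
        B = 4.0 * A                  # B is linearly dependent on A

        print(np.cross(A, B))        # -> [0. 0. 0.]

        # every 2x2 sub-determinant built from rows of (A, B) vanishes too
        for r1 in range(3):
            for r2 in range(r1 + 1, 3):
                sub = np.array([[A[r1], B[r1]],
                                [A[r2], B[r2]]])
                print(np.linalg.det(sub))  # -> 0.0 each time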
  #9 jameswell (Forum Freshman)
    Quote Originally Posted by Markus Hanke View Post
    Quote Originally Posted by jameswell View Post
    Believe me, I've tried everything I can think of to work it out for myself; that's how I would have wanted to find it out.
    The geometrical visualization is a good point, and I had realized that, except that it can only be visualized in two or three dimensions.
    And I'm sure there must be a more algebraic way.
    Do you actually know it? Or are you kind of guessing it can be worked out? Maybe what I did in my previous post is the only way, and therefore the problem needs to be dealt with in all the different possible cases, namely "a = 0" and "a ≠ 0", etc.
    Well, I would do it simply this way :

    1. Proposition: the determinant is zero if the column vectors are linearly dependent.
    2. If two column vectors A and B are linearly dependent, then their cross product A x B is zero.
    3. If the cross product A x B is zero, then so is the determinant formed by taking A and B as column vectors of a sub-matrix
    4. Repeat (2) and (3) for each combination of column vectors in the matrix
    5. If all sub-determinants above are zero, then so is the total determinant of the original matrix.

    Not sure if this is mathematically rigorous (I am not a mathematician), but that would be my approach to proving this.
    Hey, that's a cool way of seeing it for someone who isn't a mathematician. There is one small problem though:
    As you said, if the three sub-determinants are zero, then the determinant of the matrix is obviously zero.
    But the determinant of the matrix could very well be zero with two or even three of the sub-determinants not being zero.
    So your proof is good for showing that in some cases linear dependence gives a zero determinant, but it doesn't show anything for the cases where the determinant is zero without the three sub-determinants being zero.
    And it doesn't go the other way, saying that if the determinant is zero then there is linear dependence.
    But nice thinking!
  #10 wallaby (Forum Professor, Australia)
    Quote Originally Posted by jameswell View Post
    If this has anything to do with linear maps or basis changes, it'll take me a bit of thinking before I can understand it, because it's still a bit new to me, but I'll try to work it out!
    Thanks
    You don't need to know much of anything about basis changes, although I realise that I have been assuming a few things about what you know. A set of vectors $v_i$, for $i = 1, \dots, n$, will be linearly independent iff the equation

    $c_1 v_1 + c_2 v_2 + \cdots + c_n v_n = 0$

    implies that all of the coefficients $c_i$ are equal to zero.

    We can express that equation above in matrix form, $A\mathbf{c} = \mathbf{0}$, where the columns of A are the vectors $v_i$. Now if the determinant of A is not equal to zero then the matrix will have a multiplicative inverse $A^{-1}$. Thus the solution to the matrix equation, or the linear system, will be $\mathbf{c} = A^{-1}\mathbf{0} = \mathbf{0}$. What this means for us is that if $\det(A) \neq 0$ then no vector in the set can be written as a linear combination of the others.

    So what was all of that nonsense about reduced row echelon form? Well, if you're like me and can never remember why $\det(A) = 0$ implies that A has no multiplicative inverse, then the above won't seem very satisfying, not that it would anyway. (Basically I changed my mind about what was the easiest way to prove the initial assertion and am trying to cover my ass.)
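    A numerical sketch of this argument (assuming NumPy; the matrix is just an example): when det(A) is non-zero, the only solution of A c = 0 is c = 0, so no column can be a combination of the others.

        import numpy as np

        A = np.array([[2.0, 1.0, 0.0],
                      [1.0, 3.0, 1.0],
                      [0.0, 1.0, 4.0]])
        print(np.linalg.det(A))              # non-zero, here 18.0

        c = np.linalg.solve(A, np.zeros(3))  # unique solution of A c = 0
        print(c)                             # -> [0. 0. 0.]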
  #11 Markus Hanke (Moderator, Ireland)
    Quote Originally Posted by jameswell View Post
    But the determinant of the matrix could very well be zero with two or even three of the sub-determinants not being zero.
    Interesting. Do you have an example of such a matrix?
  #12 jameswell (Forum Freshman)
    Quote Originally Posted by wallaby View Post
    Now if the determinant of 'A' is not equal to zero then the matrix will have a multiplicative inverse $A^{-1}$.
    How do you prove that? Could I do it easily with simultaneous equations or something of that kind? Otherwise, very satisfying explanation!
  #13 jameswell (Forum Freshman)
    Quote Originally Posted by Markus Hanke View Post
    Quote Originally Posted by jameswell View Post
    But the determinant of the matrix could very well be zero with two or even three of the sub-determinants not being zero.
    Interesting. Do you have an example for such a matrix ?
    Well, if you start with det(A) = 0 = 1 + 1 - 2 = 1*[0*1 - (-1)*1] - 1*[1*1 - (-1)*(-2)] + (-2)*[1*1 - 0*(-2)], you could say that it's the determinant of this matrix:

    1 1 -2
    1 0 -1
    -2 1 1


    And then you can multiply that matrix by whatever you want, or you can even multiply rows or columns of it by whatever you want, and the determinant will still be zero, so there are loads of matrices of this kind.
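    A quick check of this example (a sketch, assuming NumPy is available): the full determinant vanishes, because the third column is minus the sum of the first two, yet the 2x2 sub-determinants are not all zero.

        import numpy as np

        A = np.array([[ 1.0, 1.0, -2.0],
                      [ 1.0, 0.0, -1.0],
                      [-2.0, 1.0,  1.0]])
        print(np.linalg.det(A))          # -> 0.0 (up to rounding)

        # e.g. the top-left 2x2 block has determinant 1*0 - 1*1 = -1
        print(np.linalg.det(A[:2, :2]))  # -> -1.0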
  #14 wallaby (Forum Professor, Australia)
    Quote Originally Posted by jameswell View Post
    Quote Originally Posted by wallaby View Post
    Now if the determinant of 'A' is not equal to zero then the matrix will have a multiplicative inverse $A^{-1}$.
    How do you prove that? Could I do it easily with simultaneous equations or something of that kind? Otherwise, very satisfying explanation!
    This is one of those things that you'll see a proof of once and then just take as fact thereafter; as a result the details of a rigorous proof are a bit fuzzy, but I think the following can provide a reasonably compelling argument for why it's true.

    Given an n x n matrix A, some matrix $A^{-1}$ will be a multiplicative inverse of A if the identity $A A^{-1} = I$ holds. By the properties of determinants it follows that

    $\det(A)\,\det(A^{-1}) = \det(I) = 1$, and so $\det(A^{-1}) = 1/\det(A)$.

    In the event that $\det(A) = 0$ we will not be able to define a determinant for $A^{-1}$. If $A^{-1}$ exists then its elements should be finite in value and we should be able to calculate a finite determinant, so an indeterminate determinant (I hate that I just used those two words together) would be an indication that the inverse does not exist.

    Like I said, not rigorous, but it's for this very reason that the methods of finding an inverse break down. (They all seem to contain a 1/det(A) term.)
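    A numerical sketch of the same point (assuming NumPy): det(A) * det(A^{-1}) = 1 whenever the inverse exists, and the inversion routine fails outright on a singular matrix.

        import numpy as np

        A = np.array([[2.0, 1.0],
                      [1.0, 3.0]])
        Ainv = np.linalg.inv(A)
        print(np.linalg.det(A) * np.linalg.det(Ainv))  # -> 1.0

        S = np.array([[1.0, 2.0],
                      [2.0, 4.0]])                     # singular: det = 0
        try:
            np.linalg.inv(S)
        except np.linalg.LinAlgError as err:
            print(err)                                 # "Singular matrix"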
  #15 jameswell (Forum Freshman)
    Quote Originally Posted by wallaby View Post
    Quote Originally Posted by jameswell View Post
    Quote Originally Posted by wallaby View Post
    Now if the determinant of 'A' is not equal to zero then the matrix will have a multiplicative inverse $A^{-1}$.
    How do you prove that? Could I do it easily with simultaneous equations or something of that kind? Otherwise, very satisfying explanation!
    This is one of those things that you'll see a proof of once and then just take as fact thereafter; as a result the details of a rigorous proof are a bit fuzzy, but I think the following can provide a reasonably compelling argument for why it's true.

    Given an n x n matrix A, some matrix $A^{-1}$ will be a multiplicative inverse of A if the identity $A A^{-1} = I$ holds. By the properties of determinants it follows that

    $\det(A)\,\det(A^{-1}) = \det(I) = 1$, and so $\det(A^{-1}) = 1/\det(A)$.

    In the event that $\det(A) = 0$ we will not be able to define a determinant for $A^{-1}$. If $A^{-1}$ exists then its elements should be finite in value and we should be able to calculate a finite determinant, so an indeterminate determinant (I hate that I just used those two words together) would be an indication that the inverse does not exist.

    Like I said, not rigorous, but it's for this very reason that the methods of finding an inverse break down. (They all seem to contain a 1/det(A) term.)
    Also like this explanation. I'm starting to read of many different methods of proof in this discussion; if I organize all these ideas I may get to something.
    The only problem I have here is that I'm pretty sure that determinants were "invented", if I may say, before it was worked out that det(A*B) = det(A)det(B).
    Maybe I can make myself a little clearer about where I'm trying to get to:
    With a 2x2 matrix, it's really easy to see whether the columns/rows are independent or not; you just look at proportionality. And so the concept of checking whether three rows or columns are independent must be some kind of "extension" of the idea of proportionality between three objects, and I'm amazed that I can't work it out.
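    For the record, the 2x2 case can be made explicit, and the cofactor expansion shows how the 3x3 determinant chains those 2x2 proportionality tests together (a sketch in the notation of post #3):

        % ad - bc = 0 exactly when the columns are proportional:
        \det\begin{pmatrix} a & b \\ d & e \end{pmatrix} = ae - bd = 0
        \iff (a, d) \ \text{and} \ (b, e) \ \text{are proportional}

        % the 3x3 determinant is a signed combination of such 2x2 tests:
        \det\begin{pmatrix} a & b & c \\ d & e & f \\ g & h & i \end{pmatrix}
        = a(ei - fh) - b(di - fg) + c(dh - eg)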
  #16 MeteorWayne (Comet Dust Collector Moderator, New Jersey, USA)
    Wouldn't this be better off in the Math forum?
  #17 jameswell (Forum Freshman)
    Quote Originally Posted by MeteorWayne View Post
    Wouldn't this be better off in the Math forum?
    Hi, I'm new on this forum; I don't actually remember exactly how I created this discussion, so it could very well be in an inappropriate category.
    If anyone can move it, don't hesitate to do so.
    Cheers
  #18 wallaby (Forum Professor, Australia)
    Quote Originally Posted by jameswell View Post
    Also like this explanation. I'm starting to read of many different methods of proof in this discussion; if I organize all these ideas I may get to something. The only problem I have here is that I'm pretty sure that determinants were "invented", if I may say, before it was worked out that det(A*B) = det(A)det(B).
    Maybe I can make myself a little clearer about where I'm trying to get to:
    With a 2x2 matrix, it's really easy to see whether the columns/rows are independent or not; you just look at proportionality. And so the concept of checking whether three rows or columns are independent must be some kind of "extension" of the idea of proportionality between three objects, and I'm amazed that I can't work it out.
    If Wikipedia is to be believed, the notion of a determinant has been around since ancient times, but it was studied more seriously from the end of the 16th century. Apparently the equality between the determinant of a product and the product of determinants was not formally proven until 1812, courtesy of Cauchy and Binet. So how would we have known, before 1812, that a non-zero determinant implies the existence of a solution to a system of linear equations? Well, I think that comes down to noticing that a system with a unique solution can be reduced to an upper triangular form, the determinant of which is evidently non-zero, while systems that cannot be brought to upper triangular form after sufficient row operations will have a determinant of zero (as will the original system, since row operations change the determinant only by non-zero factors). This much would have been within the grasp of early mathematicians, but personally I like to reap the benefits of their labour and go with the easier explanation. (Then again, applied maths is more my thing.)
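    A modern rendering of that viewpoint (a sketch, assuming NumPy and SciPy are available): factor the matrix into a row permutation and triangular parts, then read the determinant off the diagonal of the upper triangular factor.

        import numpy as np
        from scipy.linalg import lu

        A = np.array([[2.0, 1.0, 0.0],
                      [1.0, 3.0, 1.0],
                      [0.0, 1.0, 4.0]])

        P, L, U = lu(A)  # A = P @ L @ U, with U upper triangular
        # det(P) is +/-1 and det(L) is 1, so det(A) is the product of
        # U's diagonal entries up to a sign from the row swaps
        print(np.linalg.det(P) * np.prod(np.diag(U)))  # matches det(A)
        print(np.linalg.det(A))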
  #19 jameswell (Forum Freshman)
    Quote Originally Posted by wallaby View Post
    Quote Originally Posted by jameswell View Post
    Also like this explanation. I'm starting to read of many different methods of proof in this discussion; if I organize all these ideas I may get to something. The only problem I have here is that I'm pretty sure that determinants were "invented", if I may say, before it was worked out that det(A*B) = det(A)det(B).
    Maybe I can make myself a little clearer about where I'm trying to get to:
    With a 2x2 matrix, it's really easy to see whether the columns/rows are independent or not; you just look at proportionality. And so the concept of checking whether three rows or columns are independent must be some kind of "extension" of the idea of proportionality between three objects, and I'm amazed that I can't work it out.
    If Wikipedia is to be believed, the notion of a determinant has been around since ancient times, but it was studied more seriously from the end of the 16th century. Apparently the equality between the determinant of a product and the product of determinants was not formally proven until 1812, courtesy of Cauchy and Binet. So how would we have known, before 1812, that a non-zero determinant implies the existence of a solution to a system of linear equations? Well, I think that comes down to noticing that a system with a unique solution can be reduced to an upper triangular form, the determinant of which is evidently non-zero, while systems that cannot be brought to upper triangular form after sufficient row operations will have a determinant of zero (as will the original system, since row operations change the determinant only by non-zero factors). This much would have been within the grasp of early mathematicians, but personally I like to reap the benefits of their labour and go with the easier explanation. (Then again, applied maths is more my thing.)
    Of course! And I'm not trying to redo all their work, because progress moves forward. I'm just very curious about it, and it disturbs me somehow not to be able to understand it. So: you can't tell whether a system of equations can be reduced unless you know what the coefficients are, because to reduce it you have to multiply rows by those coefficients, and if some of them are equal to zero you can't any more.
    So I'm wondering how they worked out "a calculation that tells whether systems can or can't be reduced, without knowing what their coefficients are", one that would consequently work for any square system of equations.
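    That is essentially what row swaps buy you. A minimal sketch (assuming NumPy; det_by_elimination is a hypothetical helper, not a library routine) of Gaussian elimination with partial pivoting, which never needs to know in advance which coefficients are zero: an all-zero pivot column means the determinant is zero, and swaps merely flip its sign.

        import numpy as np

        def det_by_elimination(M):
            A = np.array(M, dtype=float)
            n = len(A)
            sign = 1.0
            for k in range(n):
                p = k + np.argmax(np.abs(A[k:, k]))  # best available pivot
                if np.isclose(A[p, k], 0.0):
                    return 0.0                       # no pivot: singular
                if p != k:
                    A[[k, p]] = A[[p, k]]            # swap rows...
                    sign = -sign                     # ...and flip the sign
                A[k+1:] -= np.outer(A[k+1:, k] / A[k, k], A[k])
            return sign * np.prod(np.diag(A))

        M = [[0, 1, 2], [1, 0, 3], [4, -3, 8]]       # note the zero corner
        print(det_by_elimination(M))                 # -> -2.0
        print(np.linalg.det(np.array(M, dtype=float)))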