
Thread: factorization of transposed and non-transposed matrices

  #1  factorization of transposed and non-transposed matrices
    Forum Sophomore (GenerationE)
    Join Date: Jan 2008 · Location: Texas · Posts: 158
    Say I have an n×n matrix B and two column vectors a and c. (B, a, and c can have either real or complex entries.)



    (Ba) * Transpose(Bc) = B Transpose(B) (a * Transpose(c))

    where * is the dot product (more precisely, the inner product).

    Are these equivalent? This is part of a much larger problem I'm trying to solve, and I'm stuck here because I'm not sure whether it can be written this way. It seems like you should somehow be able to factor the matrix B out of the equation, but I'm not sure.



  #2  Forum Sophomore (GenerationE)
    I should add that the left-hand side is just an inner product, and the transpose should probably be expressed as a Hermitian (conjugate-transpose) operator instead.



  #3  Forum Professor wallaby
    Join Date: Jul 2005 · Location: Australia · Posts: 1,521
    I don't think you can, not in general anyway. Recall that for any two matrices B and C of appropriate dimensions for multiplication, Transpose(BC) = Transpose(C) Transpose(B). Now, since a column vector is a matrix, the left-hand side can be expressed as

    (Ba) * (Bc) = Transpose(Ba) (Bc) = Transpose(a) Transpose(B) B c.

    Since Transpose(B) B is itself a matrix, and matrices do not commute under multiplication in general, you could only rearrange the equation with Transpose(B) B pulled out in front in the special case that it commutes with the remaining factors. Or maybe if a and c are eigenvectors of the matrix B, but I state this without certainty or proof.

    In summary, the answer is no for the general case, and I don't believe the answer changes based on the properties of B so much as on how a and c relate to B.
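    A quick numerical check (a sketch in Python/NumPy, not part of the original thread; B, a, and c here are random values made up for illustration) confirms the identity <Ba, Bc> = Transpose(a) Transpose(B) B c, while showing the result is not a fixed multiple of a · c, so B cannot simply be factored out:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
B = rng.standard_normal((n, n))   # arbitrary real n x n matrix
a = rng.standard_normal(n)
c = rng.standard_normal(n)

# <Ba, Bc> equals a^T (B^T B) c -- the transpose rule (BC)^T = C^T B^T at work
lhs = np.dot(B @ a, B @ c)
rhs = a @ (B.T @ B) @ c
assert np.isclose(lhs, rhs)

# but it is NOT equal to a . c in general, so B^T B cannot be
# "factored out" unless B^T B is a scalar multiple of the identity
print(lhs, np.dot(a, c))
```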

  #4  Forum Sophomore (GenerationE)
    You're right. I was looking over the problem again yesterday, and factorization is impossible because the transpose reverses the order of the matrix multiplication.

    I'm approaching the problem from a different angle but it's not going well.

    I'm supposed to prove that every inner product on a vector space D can be computed using the matrix A with entries

    a_ij = <b_j, b_i>

    where b_i and b_j are the ith and jth elements of the basis B of D.

    Any ideas? I've already written out two arbitrary vectors x, y as linear combinations of the elements of B.

    The hint given in the problem is to show that

    <x,y> = Transpose([y]_B) A [x]_B

    where [y]_B and [x]_B are the coordinate vectors of y and x respectively.



    Not asking for a solution just a hint.

  #5  Forum Sophomore (GenerationE)
    So I figured out that the solution was similar to where I started.

    The inner product of x and y can be represented as follows:

    <x,y> = <B[x]_B, B[y]_B> = Hermitian(B[y]_B) (B[x]_B) = Hermitian([y]_B) Hermitian(B) B [x]_B = Hermitian([y]_B) A [x]_B

    where A = Hermitian(B) B is the matrix composed of inner products between the basis vectors in B. Therefore A can be used to compute every inner product on the vector space.

    FUN!
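    The derivation above can be checked numerically. Here is a sketch in Python/NumPy (the complex basis and vectors are random values made up for illustration, not from the thread), where the Gram matrix A = Hermitian(B) B reproduces the standard inner product from coordinate vectors:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
# columns of B form a basis of C^n; a random complex matrix is
# invertible with probability 1, so its columns are independent
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# Gram matrix: A[i, j] = <b_j, b_i> = Hermitian(b_i) b_j
A = B.conj().T @ B

x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
y = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# coordinate vectors: x = B [x]_B, y = B [y]_B
xB = np.linalg.solve(B, x)
yB = np.linalg.solve(B, y)

# <x, y> = Hermitian([y]_B) A [x]_B, with the convention <u, v> = Hermitian(v) u
std = np.vdot(y, x)           # conj(y) . x, the standard inner product
via_A = yB.conj() @ A @ xB
assert np.isclose(std, via_A)
```

    Note that np.vdot conjugates its first argument, matching the Hermitian(v) u convention used in the derivation.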

  #6  Forum Professor wallaby
    I meant to reply days ago, sorry for the delay.

    It seems to me that your first statement is in fact what you're supposed to prove: that the change-of-basis transformation preserves the inner product, in other words that the basis transformation is an orthogonal (unitary) linear transformation. Assuming I've interpreted your notation correctly, for B denoting a basis for D, the matrix whose columns are the basis vectors expresses the basis for D in terms of the standard basis, i.e. it is a basis transformation. Hence, for x = B[x]_B and y = B[y]_B, you need to prove that you can use the matrix A, as you defined it, to directly compute the inner product in D.

    If it were me, I would start from the fact that the matrix representation of the standard inner product is the identity matrix I, so <u,v> = Hermitian(v) I u.

    What do you notice about the matrix Hermitian(B) B?

  #7  Forum Sophomore (GenerationE)
    Quote Originally Posted by wallaby View Post
    What do you notice about the matrix Hermitian(B) B?
    It is equal to the matrix A = (a_ij), which is what I was supposed to prove. I didn't compute my inner product correctly when I first attempted this problem, and it threw me off. My professor also showed me a way of approaching the problem by keeping all of my notation in sums over i = 1, ..., n.

    Lesson learned: compute your inner product correctly, and this problem can be solved in 5 minutes. Mess up without realizing it, and it will take more than a day. :S

  #8  Forum Professor wallaby
    Quote Originally Posted by GenerationE View Post
    lesson learned, compute your inner product correctly and this problem can be solved in 5 minutes. If you mess up and don't realize, it will take you more than a day to solve. :S
    Linear algebra problems are particularly painful in that respect; errors are rarely a simple lost minus sign or a forgotten constant. (At least for me.)
