# Thread: factorization of transposed and non-transposed matrices.

1. Say I have an n×n matrix B and two column vectors a and c. (The entries of B, a and c can be either real or complex.)

(Ba) * (Bc)^T = B B^T (a * c^T)

Where * is the dot product (more appropriately the inner product)

Are these equivalent? This is part of a much larger problem I'm trying to solve, and I'm stuck here because I'm not sure if it can be written this way. It seems like you should be able to somehow factor the matrix B out of the equation, but I'm not sure.
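The question above can be checked numerically. This is my own quick sketch (not from the thread), comparing the left-hand side against the proposed factored form for random B, a, c:

```python
# Numerical check: does (Ba)(Bc)^T equal B B^T (a c^T) for generic B, a, c?
import numpy as np

rng = np.random.default_rng(0)
n = 4
B = rng.standard_normal((n, n))
a = rng.standard_normal((n, 1))
c = rng.standard_normal((n, 1))

lhs = (B @ a) @ (B @ c).T       # (Ba)(Bc)^T
rhs = B @ B.T @ (a @ c.T)       # the proposed factored form B B^T (a c^T)
exact = B @ (a @ c.T) @ B.T     # what the LHS actually equals: B (a c^T) B^T

print(np.allclose(lhs, rhs))    # generally False
print(np.allclose(lhs, exact))  # True
```

The third line of the comparison shows what the factorization actually produces: the transpose puts one copy of B on each side of a c^T, so B cannot be pulled out to the front in general.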

2.

3. I should add that the left-hand side is just an inner product, and the transpose should probably be expressed as a Hermitian adjoint (conjugate transpose) instead.

4. I don't think you can, not in general anyway. Recall that for any two matrices B and C of appropriate dimension for multiplication, (BC)^T = C^T B^T. Now since a column vector is a matrix, the left-hand side can be expressed as (Ba)(Bc)^T = B(ac^T)B^T.

Since ac^T is itself a matrix and matrices do not, in general, commute under multiplication, it stands that you could only arrange the equation in the form B B^T (ac^T) in the special case that ac^T commutes with B^T. Or maybe if a and c are eigenvectors of the matrix B, but I state this without certainty or proof.

In summary, the answer is no for the general case, and I don't believe the answer changes based on the properties of B so much as on how a and c relate to B.
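The eigenvector speculation above can also be checked numerically. This is my own sketch: with a symmetric B and a = c an eigenvector of B, the matrix a c^T does commute with B^T and the factorization goes through in this special case.

```python
# Special case: B symmetric, a = c an eigenvector of B.
import numpy as np

B = np.array([[2.0, 1.0],
              [1.0, 2.0]])       # symmetric; (1, 1) is an eigenvector
a = np.array([[1.0], [1.0]])     # eigenvector of B with eigenvalue 3
c = a

lhs = (B @ a) @ (B @ c).T        # (Ba)(Bc)^T
rhs = B @ B.T @ (a @ c.T)        # B B^T (a c^T)
print(np.allclose(lhs, rhs))     # True in this special case
```

Here both sides reduce to 9·aa^T, since Ba = 3a, which is why the rearrangement works.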

5. You're right. I was looking over the problem again yesterday, and factorization would be impossible because the transpose reverses the order of the matrix multiplication.

I'm approaching the problem from a different angle but it's not going well.

I'm supposed to prove that every inner product on a vector space D can be computed using

a_ij = <b_j, b_i>

where b_i and b_j are the ith and jth elements of the basis B of D.

Any ideas? I've already written out two arbitrary vectors x and y as linear combinations of the elements of B.

The hint given in the problem is to show that

<x, y> = ([y]_B)^T A [x]_B

[y]_B and [x]_B are the coordinate matrices of y and x respectively.

Not asking for a solution just a hint.
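A concrete instance of the hint can be checked numerically. This is my own construction (the weight matrix M, basis P, and test vectors are made up for illustration): take a weighted inner product <x, y> = y^T M x on R^2, build A from a_ij = <b_j, b_i>, and verify that <x, y> = ([y]_B)^T A [x]_B.

```python
# Check the hint <x, y> = ([y]_B)^T A [x]_B on a non-standard inner product.
import numpy as np

M = np.array([[2.0, 0.0],
              [0.0, 3.0]])        # SPD weight matrix defining <x, y> = y^T M x
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])        # columns b1, b2 form the basis B

A = P.T @ M @ P                   # A[i, j] = <b_j, b_i> = b_i^T M b_j

x = np.array([[2.0], [-1.0]])
y = np.array([[0.5], [4.0]])
xB = np.linalg.solve(P, x)        # coordinate vector [x]_B
yB = np.linalg.solve(P, y)        # coordinate vector [y]_B

print((y.T @ M @ x).item())       # <x, y> directly  -> -10.0
print((yB.T @ A @ xB).item())     # via the matrix A -> -10.0
```

The two computations agree because ([y]_B)^T A [x]_B = y^T P^{-T} (P^T M P) P^{-1} x = y^T M x.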

6. So I figured out that the solution was similar to where I started.

the inner product of x and y can be represented as follows:

<x, y> = <B[x]_B, B[y]_B> = (B[y]_B)^H (B[x]_B) = ([y]_B)^H B^H B [x]_B = ([y]_B)^H A [x]_B

Where A is a matrix composed of inner products between the basis vectors in B. Therefore A can be used to compute all inner products in the vector space.
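The derivation above can be verified numerically for the standard complex inner product <u, v> = v^H u. This is my own sketch: with the basis vectors as the columns of a matrix B, the Gram matrix A = B^H B has entries a_ij = <b_j, b_i>, and the inner product of any two vectors can be computed from their coordinates.

```python
# Check <x, y> = ([y]_B)^H A [x]_B with A = B^H B for a random complex basis.
import numpy as np

rng = np.random.default_rng(1)
n = 3
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))   # basis as columns

A = B.conj().T @ B                    # Gram matrix: A[i, j] = b_i^H b_j = <b_j, b_i>

xB = rng.standard_normal((n, 1)) + 1j * rng.standard_normal((n, 1))  # [x]_B
yB = rng.standard_normal((n, 1)) + 1j * rng.standard_normal((n, 1))  # [y]_B
x, y = B @ xB, B @ yB                 # the vectors themselves

direct = (y.conj().T @ x).item()      # <x, y> computed directly
via_A = (yB.conj().T @ A @ xB).item() # <x, y> via the Gram matrix
print(np.isclose(direct, via_A))      # True
```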

FUN!

7. I meant to reply days ago, sorry for the delay.

It seems to me that your first statement is in fact what you're supposed to prove: that the change of basis transformation preserves the inner product, in other words that the basis transformation is an orthogonal linear transformation. So assuming I've interpreted your notation correctly, for B denoting a basis for D, the matrix B expresses the basis for D in terms of the standard basis, i.e. B is a basis transformation. Hence for x = B[x]_B and y = B[y]_B you need to prove that you can use the matrix A, as you defined it, to directly compute the inner product in D.

If it were me I would start with the fact that the matrix representation of the inner product in the standard basis is the identity matrix I, so <x, y> = ([y])^H I [x].

What do you notice about the matrix B^H B?

8. Originally Posted by wallaby
...

What do you notice about the matrix B^H B?
It is equal to the matrix (a_ij), which is what I was supposed to prove. I didn't compute my inner product correctly when I was first doing this problem and it threw me off. My professor also showed me a way of approaching the problem by keeping all of my notation in sums over i = 1, ..., n.

Lesson learned: compute your inner product correctly and this problem can be solved in 5 minutes. If you mess up and don't realize it, it will take you more than a day to solve. :S

9. Originally Posted by GenerationE
Lesson learned: compute your inner product correctly and this problem can be solved in 5 minutes. If you mess up and don't realize it, it will take you more than a day to solve. :S
Linear algebra problems are particularly painful in that respect; the errors are rarely a simple lost minus sign or a forgotten constant. (At least for me.)
