
Thread: Vector spaces

  1. #101  
    Moderator Guitarist
    Join Date
    Jun 2005
    Posts
    1,620
    Well, OK, maybe my notation doesn't bother you as much as it does me.

    Here's the thing: elements of the Cartesian product of sets, say A×B, are the ordered pairs (a,b), with a from A and b from B. In an inner product space, the inner product (v,w) is a scalar, yet the very same notation denotes an ordered pair, an element of V×V. But not all spaces have an inner product defined, so if, for one such space, I write (x,y) as an element of W×W, we can take it that (x,y) is not an inner product.

    I'm going to change notation for a bit, and write [v,w] when I want to specify that this is an inner product. Where it makes no difference, I will simply use (v,w), which may or may not be an inner product; it's irrelevant for the purpose at hand (and this, I think, will usually be the case).

    The other problem that may have escaped you is this. I'm writing f_v for the element of V* dual to the vector v in V. Suppose, though, I now say that f_v(v) = ||v|| for all v in V; then it isn't clear which v the subscript refers to unless, in this case, I insist that f_{v_1} is dual only to v_1, which is a very cumbersome way to proceed. But here I don't see an easy way out.

    We'll have to live with it, so let me emphasize: the "f" has no real meaning, and simply means "some functional" that takes V → R. As the spaces V and V* are isomorphic (provided only that V is finite-dimensional), this implies that for each v in V there is one and only one f_v in V*. Maybe you already made that leap; if so, good on ya.
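    If it helps to see that correspondence in components, here is a minimal numpy sketch (purely illustrative, and it assumes the standard inner product on R², so that f_v(w) = [v, w]):

        import numpy as np

        v = np.array([1.0, 2.0])

        # f_v is the functional dual to v: f_v(w) = [v, w], the standard inner product
        def f_v(w):
            return np.dot(v, w)

        print(f_v(np.array([3.0, 4.0])))  # 11.0 -- a scalar, as a map V -> R should produce

    A different v gives a different functional, which is the one-and-only-one f_v for each v.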

    And finally: the form f_v⊗f_w: V×W → R I wrote earlier was for illustration only, and not a good illustration at that - I urge you to erase it from your mind; it muddies the waters.

    Tensors are formed exclusively from the multilinear elements of the space V⊗...⊗V⊗V*⊗...⊗V* acting on V*×...×V*×V×...×V, i.e. multiple copies of some space and its dual.

  2. #102  
    Forum Professor wallaby
    Join Date
    Jul 2005
    Location
    Australia
    Posts
    1,521
    Quote Originally Posted by Guitarist
    We'll have to live with it, so let me emphasize: the "f" has no real meaning, and simply means "some functional" that takes V → R. As the spaces V and V* are isomorphic (provided only that V is finite-dimensional), this implies that for each v in V there is one and only one f_v in V*. Maybe you already made that leap; if so, good on ya.
    Thinking in terms of the inner product f_v(v) meant that I assumed that from the start; since the inner product isn't the only bilinear function of that sort, I guess it can be taken that all the other bilinear functions satisfy that condition, or I got lucky.

    Quote Originally Posted by Guitarist
    Tensors are formed exclusively from the multilinear elements of the space V⊗...⊗V⊗V*⊗...⊗V* acting on V*×...×V*×V×...×V, i.e. multiple copies of some space and its dual.
    Now that the other 'illustration' has been put out of mind, shall we be going into the above in more detail?

  3. #103  
    Moderator Guitarist
    Join Date
    Jun 2005
    Posts
    1,620
    OK, but first let's be clear what's going on here. The issue of inner products is totally irrelevant here; if it helps us to interpret f_v: V → R as an inner product, it will do no harm, as far as I can see. So let's stick to the simple case:

    The tensor product of all elements of V and all elements of V*, of which, say, v⊗f_v is one, acts on the Cartesian product of their target spaces V*×V. Say (f_u, u) is an element of V*×V. It is in the nature of the tensor product v⊗f_v: V*×V → R that this operation yields v⊗f_v(f_u, u) = v(f_u)f_v(u).

    We see straight away that, for n factors in our multilinear function, there are n arguments in its domain, one from each copy. This means that, in the present case, we have defined a rank 2 tensor, v⊗f_v.

    We also see straight away that v⊗u: V*×V* → R is also a rank 2 tensor. As these guys are clearly different, we'll call them type (1,1) and type (2,0) tensors, respectively, it being understood that the first slot counts the vector factors, the second the dual vector factors. Now, we could leave it here, but for two things: nobody else in the world seems willing to, and, with a small adjustment to notation, tensor algebra becomes, superficially at least, rather easy.
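    If a concrete rendition helps, here is a minimal numpy sketch of the action above (purely illustrative: arrays stand in for vectors, and covectors are represented by their components, which quietly assumes a fixed basis):

        import numpy as np

        v  = np.array([1.0, 2.0])   # v in V
        fv = np.array([3.0, 1.0])   # f_v in V*, as components
        fu = np.array([0.5, 2.0])   # f_u in V*
        u  = np.array([1.0, 1.0])   # u in V

        # the type (1,1) tensor v⊗f_v, stored as a rank-2 array
        T = np.outer(v, fv)

        # its action on (f_u, u) should be v(f_u) * f_v(u)
        lhs = np.einsum('ij,i,j->', T, fu, u)
        rhs = np.dot(v, fu) * np.dot(fv, u)
        print(np.isclose(lhs, rhs))  # True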

    So, let's at least make a start. We agreed to describe a vector as v = a_i e_i. Let's now follow the herd, and raise the index on a, the scalar component, keeping it lowered on e, the basis vector, thus: v = a^i e_i. But, since, for a given basis, all we really care about are the components a^i, let's drop all reference to the basis: v = a^i. But we also agreed that the choice of basis is arbitrary. As we don't want our vector v to change merely because of our promiscuity with bases, the a^i don't mean very much; they could be anything, depending on the basis.

    So, we may as well write v^i for our vector, it being understood that this means our vector is such that, for some choice of basis, there is an induced choice of scalar components such that v^i is fully determined by the ith component on the ith basis vector.

    And finally: I said a rank 1, type (1,0) tensor is a vector. It is (probably, lemme check) not true that all vectors are type (1,0) tensors. So, in the case that our vector v^i is a type (1,0) tensor, let's emphasize that fact and call it V^i. Yay! Our very first tensor; aren't we cool!

    Out of puff, more later, maybe.

  4. #104  
    Forum Professor wallaby
    Join Date
    Jul 2005
    Location
    Australia
    Posts
    1,521
    Quote Originally Posted by Guitarist
    The tensor product of all elements of V and all elements of V*, of which, say, v⊗f_v is one, acts on the Cartesian product of their target spaces V*×V. Say (f_u, u) is an element of V*×V. It is in the nature of the tensor product v⊗f_v: V*×V → R that this operation yields v⊗f_v(f_u, u) = v(f_u)f_v(u).
    Hmmm. The right hand side of the equation v⊗f_v(f_u, u) = v(f_u)f_v(u) brings matrix multiplication to mind.

    Quote Originally Posted by Guitarist
    And finally: I said a rank 1, type (1,0) tensor is a vector. It is (probably, lemme check) not true that all vectors are type (1,0) tensors. So, in the case that our vector v^i is a type (1,0) tensor, let's emphasize that fact and call it V^i. Yay! Our very first tensor; aren't we cool!

    Out of puff, more later, maybe.
    So, in the case of, say, V* → R: could we take our tensor V^i and use it to act on an element of V*, or must we insist on a Cartesian product between V* and some zero vector space, and acquire V^i(f_v) as a result? Which looks oddly familiar as a result.

  5. #105  
    Moderator Guitarist
    Join Date
    Jun 2005
    Posts
    1,620
    Quote Originally Posted by wallaby
    Hmmm. The right hand side of the equation v⊗f_v(f_u, u) = v(f_u)f_v(u) brings matrix multiplication to mind.
    Why sure, good. But you rather quickly, say in the case of type (n,m) tensors, get into hypermatrices, things you can't picture, let alone draw.

    So, in the case of, say, V* → R: could we take our tensor V^i and use it to act on an element of V*, or must we insist on a Cartesian product between V* and some zero vector space, and acquire V^i(f_v) as a result? Which looks oddly familiar as a result.
    I'll draw a discreet veil over "V* → R"; you know better than that. But you can think of a vector space as the Cartesian product of that space and a "zero space", whatever that might be, but why bother?

    We have f_v: V → R, u ↦ f_v(u) as a familiar result, as intended. What might be less familiar, because I rather skated over it, is the form v: V* → R, f_u ↦ v(f_u). Can you see why, or should I go through it again?

    I guess the point is, the above renditions are quite familiar, and the higher rank tensors are merely a generalization of this. I chose to argue from the higher rank case to the lower, as I thought it would be easier to follow. Others may disagree.
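    A one-line numpy check of that less familiar form (again purely illustrative, with everything in components): the number v(f_u) is the same as f_u(v), since once a basis is fixed both are just the pairing of two component arrays.

        import numpy as np

        v  = np.array([1.0, 2.0])
        fu = np.array([0.5, 2.0])

        # v acting on f_u, and f_u acting on v, yield the same scalar
        print(np.dot(v, fu), np.dot(fu, v))  # 4.5 4.5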

    Anyway, good work.

    EDIT: I think I may have given a slightly bum steer over the inner product. I don't think it's too serious, but we can talk about it if you want. The issue basically revolves around whether [ , ] in this simple form is bilinear or not. I now think this is only true over the real field. If we want full generality, I think we may have to insist that this is only true if [ , ] is an element of V*⊗V, that is to say, if an inner product exists, it is defined by some f_u(v). As I say, I don't think it's too serious.

  6. #106  
    Moderator Guitarist
    Join Date
    Jun 2005
    Posts
    1,620
    So, we now have some rather boring housework to do, so get your pinny on!

    First, note that, for the multilinear tensor product of functions on the Cartesian product of spaces, there are two possible conventions (I confess I've been muddling them slightly). Let's stick with this: each element of V acts as a function on V*, and each element of V* acts as a function on V, so I will write, for example, that V⊗V⊗V* acts on V*×V*×V, it being understood that elements of V are vectors and elements of V* are now called covectors.

    So, the above defines elements of V⊗V⊗V* as type (2,1) tensors, the first slot being reserved for vector components, the second for the covector components. Agreed?

    We also agreed to write our type (1,0) tensor = vector = function V* → R as V^i. By a similar (and not interesting) argument, I hope nobody will dispute the reasonableness of writing a type (0,1) tensor = covector = function V → R as V_j.

    Let's get daring, and consider some arbitrary tensors A^p and B^q. We know what it means to add vectors - we get another vector, right? So A^p + B^q = C^r, another vector. Similarly for covectors.

    This generalizes: A^{pq} + B^{rs} = C^{tu}, and so on. Note we are not changing the rank or type here.

    What about multiplication? S'easy, just remember how we defined our tensors! A^i B^j = C^{kl}. So, multiplying two rank n tensors gives a rank 2n tensor. Hmm. What about adding a type (1,0) tensor and a type (0,1) tensor? No way, and in general tensors of different types cannot be added. (The model here is matrices.)

    But we can multiply tensors of different types: A^{ij} B_k = C^{pq}_r (note I'm careful with my indices (= vector/covector components), as they are bound to change under these operations).
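    In numpy terms (my own sketch, with made-up components), multiplication is the outer product, so ranks add, while addition only makes sense for arrays of the same shape - which is the "same type only" rule above:

        import numpy as np

        A = np.array([1.0, 2.0])         # A^i, type (1,0), rank 1
        B = np.array([3.0, 4.0])         # B_k, type (0,1), rank 1

        C = np.einsum('i,k->ik', A, B)   # A^i B_k -> rank 2, type (1,1)
        D = np.einsum('ik,j->ikj', C, A) # rank 2 times rank 1 -> rank 3
        print(C.shape, D.shape)          # (2, 2) (2, 2, 2)

        # note: numpy would happily broadcast A + C, but that operation has no
        # tensorial meaning - addition is only defined for tensors of one type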

    Now here's something cool, but familiar, I hope. The operation A^k B_k = C^h_h, by the above. It only remains for me to remind you that if, say, A^k = v and B_k = f_u, you will have no difficulty seeing that this is ?? First one to tell me gets a sweetie (or a Toklas brownie, if they prefer!)

  7. #107  
    Forum Professor wallaby
    Join Date
    Jul 2005
    Location
    Australia
    Posts
    1,521
    Quote Originally Posted by Guitarist
    So, we now have some rather boring housework to do, so get your pinny on!

    First, note that, for the multilinear tensor product of functions on the Cartesian product of spaces, there are two possible conventions (I confess I've been muddling them slightly). Let's stick with this: each element of V acts as a function on V*, and each element of V* acts as a function on V, so I will write, for example, that V⊗V⊗V* acts on V*×V*×V, it being understood that elements of V are vectors and elements of V* are now called covectors.

    So, the above defines elements of V⊗V⊗V* as type (2,1) tensors, the first slot being reserved for vector components, the second for the covector components. Agreed?
    Agreed

    Quote Originally Posted by Guitarist
    But we can multiply tensors of different types: A^{ij} B_k = C^{pq}_r (note I'm careful with my indices (= vector/covector components), as they are bound to change under these operations).

    Now here's something cool, but familiar, I hope. The operation A^k B_k = C^h_h, by the above. It only remains for me to remind you that if, say, A^k = v and B_k = f_u, you will have no difficulty seeing that this is ?? First one to tell me gets a sweetie (or a Toklas brownie, if they prefer!)
    Hmm, well,
    A^k B_k = v f_u; however, we want another tensor here and not a scalar, hence we have a tensor product.
    v⊗f_u = C^h_h. In this case C^h_h would be a rank 2 tensor, as defined earlier. That is, in the case that A^k = v, B_k = f_u.

    That's what I make of it.

  8. #108  
    Moderator Guitarist
    Join Date
    Jun 2005
    Posts
    1,620
    Quote Originally Posted by wallaby
    Hmm, well,
    A^k B_k = v f_u; however, we want another tensor here and not a scalar, hence we have a tensor product.
    Aha! But these are not incompatible statements - a scalar is a type (0,0) tensor! Remember that scalars, vectors (and tensors) not only result from actions of various sorts, they also act. Maybe the question was a little unfair without more info, sorry. You had the answer, then ran away from it. Observe:

    A^k B_k = C^h_h = v⊗f_u: V*×V → R, (f_u, v) ↦ v⊗f_u(f_u, v). By bilinearity, this is v(f_u)f_u(v) = f_u(v)f_u(v); we know each factor here is a scalar, and the multiplication of two scalars is another scalar. As a type (0,0) tensor is defined to be a scalar, we conclude that C^h_h = C.
    This "cancellation law" is quite general - the same index appearing both raised and lowered cancels exactly after summation (remember we're still assuming the summation convention). It's referred to as tensor contraction and is, as far as I'm aware, unique to tensors. There's a subtlety here we can talk about, maybe should talk about, another time.

    So, no sweetie for you, but as it was a brave attempt, suck on this stone instead.

  9. #109  
    Forum Professor wallaby
    Join Date
    Jul 2005
    Location
    Australia
    Posts
    1,521
    Quote Originally Posted by Guitarist
    Aha! But these are not incompatible statements - a scalar is a type (0,0) tensor! Remember that scalars, vectors (and tensors) not only result from actions of various sorts, they also act. Maybe the question was a little unfair without more info, sorry.
    So then, could we define a vector space in terms of a field of rank (0,0) tensors? Well, actually, when I think about it, it would seem that if this were true it would be a special case of a tensor space. But that's if it's true and my mind is not running away.

    Quote Originally Posted by Guitarist
    You had the answer, then ran away from it. Observe:

    A^k B_k = C^h_h = v⊗f_u: V*×V → R, (f_u, v) ↦ v⊗f_u(f_u, v). By bilinearity, this is v(f_u)f_u(v) = f_u(v)f_u(v); we know each factor here is a scalar, and the multiplication of two scalars is another scalar. As a type (0,0) tensor is defined to be a scalar, we conclude that C^h_h = C.
    This "cancellation law" is quite general - the same index appearing both raised and lowered cancels exactly after summation (remember we're still assuming the summation convention). It's referred to as tensor contraction and is, as far as I'm aware, unique to tensors. There's a subtlety here we can talk about, maybe should talk about, another time.

    So, no sweetie for you, but as it was a brave attempt, suck on this stone instead.
    Having not known where that stone came from, I shall save it for later.

  10. #110  
    Moderator Guitarist
    Join Date
    Jun 2005
    Posts
    1,620
    Quote Originally Posted by wallaby
    So then, could we define a vector space in terms of a field of rank (0,0) tensors?
    Ah no. Don't confuse fields with spaces; they are not the same thing at all (though they can be related to each other via the notion of "bundles"). But you could have a special case of a scalar field that was a field of type (0,0) tensors, I suppose. But, just as I am reasonably confident that the definition of a rank 1 tensor as a vector is a one-way street - that is, not all vectors are type (1,0) tensors - so I think the same is true for rank 0 tensors and scalars. I haven't thought about it deeply very recently, but I suspect all will come good when (if) we get to tensor transformations.
    Having not known where that stone came from, I shall save it for later.
    Good; it has the same effect on your teeth as sweets, but rather more rapid.

  11. #111  
    Forum Professor wallaby
    Join Date
    Jul 2005
    Location
    Australia
    Posts
    1,521
    Quote Originally Posted by Guitarist
    Ah no. Don't confuse fields with spaces; they are not the same thing at all (though they can be related to each other via the notion of "bundles"). But you could have a special case of a scalar field that was a field of type (0,0) tensors, I suppose. But, just as I am reasonably confident that the definition of a rank 1 tensor as a vector is a one-way street - that is, not all vectors are type (1,0) tensors - so I think the same is true for rank 0 tensors and scalars. I haven't thought about it deeply very recently, but I suspect all will come good when (if) we get to tensor transformations.
    Can't wait.

  12. #112  
    Moderator Guitarist
    Join Date
    Jun 2005
    Posts
    1,620
    Quote Originally Posted by wallaby
    Can't wait.
    I'm afraid you're going to have to (wee, is that English?)

    Anyway, I'll tell you why you have to wait. First, the transformation of tensors requires an understanding (at the very least) of partial derivatives, do you know what this means?

    Second, tensor transformations are really only relevant when you think about manifolds. I suspect that river_rat is headed in that direction, and he is far more knowledgeable than I on this.

    But finally - and please don't take offense (as you seem like a very nice and very smart guy) - I cannot convince myself that you have fully understood the concept of a tensor space; don't just rush forward without that understanding, as you seem to have done a bit of that recently.

    As an exercise for yourself (not me) try to put into words, no symbols, what your understanding is of a tensor space. I assure you, it ain't easy, but give it a try.

  13. #113  
    Forum Professor wallaby
    Join Date
    Jul 2005
    Location
    Australia
    Posts
    1,521
    Quote Originally Posted by Guitarist
    I'm afraid you're going to have to (wee, is that English?)
    It's about as English as "woo".

    Quote Originally Posted by Guitarist
    Anyway, I'll tell you why you have to wait. First, the transformation of tensors requires an understanding (at the very least) of partial derivatives, do you know what this means?
    I can't say I know it to the degree that one might consider to be the product of an education on the subject. However, gas laws and the volumes of various shapes do seem to be common, and understandable, examples that I have seen here and there.

    I guess if you asked me to define it in words I'd have to say:
    "the derivative of a multivariate function with respect to one variable."
    But I don't know how mixed partials work, and I've never dealt with partial differential equations; I dare say I could flip through the calculus book beside me and find out.

    Quote Originally Posted by Guitarist
    Second, tensor transformations are really only relevant when you think about manifolds. I suspect that river_rat is headed in that direction, and he is far more knowledgeable than I on this.
    Aren't manifolds more of a topology thing?
    I'd have to say I know absolutely nothing of topology and couldn't follow the topology primer thread.

    Quote Originally Posted by Guitarist
    But finally - and please don't take offense (as you seem like a very nice and very smart guy) - I cannot convince myself that you have fully understood the concept of a tensor space; don't just rush forward without that understanding, as you seem to have done a bit of that recently.

    As an exercise for yourself (not me) try to put into words, no symbols, what your understanding is of a tensor space. I assure you, it ain't easy, but give it a try.
    Upon reflection I'd have to agree with you.

    I'll think about tensor spaces, or tensors in general, for a while.

  14. #114  
    Forum Professor river_rat
    Join Date
    Jun 2006
    Location
    South Africa
    Posts
    1,510
    Quote Originally Posted by wallaby
    Aren't manifolds more of a topology thing?
    I'd have to say I know absolutely nothing of topology and couldn't follow the topology primer thread.
    They are more of a geometry thing actually, Wallaby - and don't worry, I am planning to take a very pedestrian route into manifolds and will leave most of the hairy technical stuff out if you want.

    As soon as I've finished Urysohn's lemma with Guitarist I will start a manifold thread.
    As is often the case with technical subjects we are presented with an unfortunate choice: an explanation that is accurate but incomprehensible, or comprehensible but wrong.

  15. #115  
    Forum Professor wallaby
    Join Date
    Jul 2005
    Location
    Australia
    Posts
    1,521
    Quote Originally Posted by river_rat
    They are more of a geometry thing actually, Wallaby - and don't worry, I am planning to take a very pedestrian route into manifolds and will leave most of the hairy technical stuff out if you want.

    As soon as I've finished Urysohn's lemma with Guitarist I will start a manifold thread.
    Geometry's good; if we can sacrifice most of the hairy technical stuff and still keep the moral of the story then I'm all for it.

  16. #116  
    Moderator Guitarist
    Join Date
    Jun 2005
    Posts
    1,620
    Quote Originally Posted by wallaby
    I guess if you asked me to define it in words I'd have to say:
    "the derivative of a multivariate function with respect to one variable."
    Well, one variable at a time, but yes, that's it. I have no intention of giving a calculus lesson, but I just want to make you aware of a couple of things. Suppose you have a function dependent on two variables simultaneously, say f(x,y) = z. First hold y constant (and you know what that means for differentiation, right?), and differentiate with respect to x. One writes ∂z/∂x. Now hold x constant and differentiate with respect to y; you have ∂z/∂y.

    First thing to say is that, in spite of their slightly alarming appearance, these guys are handled in exactly the same way as any other differentiation.

    Second is, note that, for each variable, you have a separate partial derivative; sometimes you know how to fit these guys together (add them, multiply them, etc.), and only rarely do you not.

    Third is that you often see ∂/∂x(f) and ∂/∂y(f) as notation; it means the same thing, of course.

    Finally, observe this. Suppose we have an arbitrary vector transformation f on Euclidean 3-space, with Cartesian coordinates x^i, i = 1, 2, 3. If we want to describe the increments of this transformation, derivatives would be nice, and it shouldn't be too hard to see that we want the partial derivatives ∂f/∂x^i - in increment form, df = Σ_i (∂f/∂x^i) dx^i.

    So that's what we will want the partials for. Anyway, enough of that.
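    If you want to check the bookkeeping, here is a tiny sympy sketch (my example function, chosen arbitrarily, nothing special about it):

        import sympy as sp

        x, y = sp.symbols('x y')
        z = x**2 * y + y**3        # f(x, y) = z

        dz_dx = sp.diff(z, x)      # hold y constant: 2*x*y
        dz_dy = sp.diff(z, y)      # hold x constant: x**2 + 3*y**2
        print(dz_dx, dz_dy)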

    An obvious question is this: given a vector space of tensors, how do we find a basis for this space? On the face of it, the answer is simple. Consider the space of type (1,1) tensors, these being bilinear objects of the form v⊗f_u. Then we know that for each v in V we have basis vectors e_i such that v = a^i e_i, and that for each f_u in V* we have f_u = b_j ε^j.

    Then, by the construction of the space of type (1,1) tensors, we will have a basis given by the e_i⊗ε^j, with scalar coefficients a^i and b_j, respectively.
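    In components (a numpy sketch of my own): with the standard basis e_i and dual basis ε^j, the products e_i⊗ε^j are just the "matrix units", and any type (1,1) tensor decomposes over them.

        import numpy as np

        n = 2
        e   = np.eye(n)   # rows e[i] play the role of the basis vectors e_i
        eps = np.eye(n)   # rows eps[j] play the role of the dual basis covectors ε^j

        T = np.array([[1.0, 2.0],
                      [3.0, 4.0]])  # an arbitrary type (1,1) tensor, in components

        # T = sum over i, j of T[i, j] * e_i⊗ε^j
        rebuilt = sum(T[i, j] * np.outer(e[i], eps[j]) for i in range(n) for j in range(n))
        print(np.allclose(T, rebuilt))  # True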

    Annoyingly, perhaps, when dealing with basis transformations, it will turn out to be more profitable to return to coordinates. But later for that.

    Hey! How about Ireland vs Pakistan in the World Cup? Much celebration chez nous, as my wife is Irish, with oceans of Guinness consumed!

    I'll think about tensor spaces, or tensors in general, for a while.
    How's that going? Any questions?

  17. #117  
    Forum Professor wallaby
    Join Date
    Jul 2005
    Location
    Australia
    Posts
    1,521
    Quote Originally Posted by Guitarist
    How's that going? Any questions?
    In a bit of a rush, but I'll add this first and come back to the first bit.
    Well, it's given me something to do when I'm bored.
    But as far as results go... well, I'm not entirely sure.
    If I've come up with anything so far, this is most likely it:

    Tensor Space: a collection of objects called tensors formed out of the tensor product of two vector spaces.
    However, I would like to have thought of a definition not requiring the use of tensor-related concepts to whatever extent possible, and I would have liked to have thought of something more [can't pick the word] definitive or correct?

    Off to school to think some more.

    EDIT: OK, well, now that I'm home I guess I should elaborate on my line of thinking so we can see where it may be wrong.

    Earlier we defined the tensor product of two vectors as being a second rank tensor. I'm wondering if you can apply the idea generally to the tensor product of two vector spaces in order to obtain a tensor space.

    Then there's the question of what about rank 0 tensors? Might they be the tensor product of two fields, or can we do a little better than that?
    I'm sure that by this definition, if it's right of course, we can say that the tensor space is constructed over the vector spaces of which it is a product.

    Anywho, I think there's enough there to see what I think of tensor spaces.

    The partials seem pretty simple, so onto this lot.
    Quote Originally Posted by Guitarist
    Finally, observe this. Suppose we have an arbitrary vector transformation f on Euclidean 3-space, with Cartesian coordinates x^i, i = 1, 2, 3. If we want to describe the increments of this transformation, derivatives would be nice, and it shouldn't be too hard to see that we want the partial derivatives ∂f/∂x^i - in increment form, df = Σ_i (∂f/∂x^i) dx^i.

    So that's what we will want the partials for. Anyway, enough of that.
    The function f(x^i), I take it, would describe one of the Cartesian variables in terms of the others.

    Quote Originally Posted by Guitarist
    An obvious question is this: given a vector space of tensors, how do we find a basis for this space? On the face of it, the answer is simple. Consider the space of type (1,1) tensors, these being bilinear objects of the form v⊗f_u. Then we know that for each v in V we have basis vectors e_i such that v = a^i e_i, and that for each f_u in V* we have f_u = b_j ε^j.

    Then, by the construction of the space of type (1,1) tensors, we will have a basis given by the e_i⊗ε^j, with scalar coefficients a^i and b_j, respectively.
    That makes sense. Although, while on the topic of a basis: I remember, back when we had just started and I looked into the Kronecker delta a little more, I found the Wikipedia article on it amongst other articles on tensors. I remember that as soon as I saw the word tensor I stopped looking any further. So is there some connection between the Kronecker delta and tensors?

    Quote Originally Posted by Guitarist
    Annoyingly, perhaps, when dealing with basis transformations, it will turn out to be more profitable to return to coordinates. But later for that.

    Hey! How about Ireland vs Pakistan in the World Cup? Much celebration chez nous, as my wife is Irish, with oceans of Guinness consumed!
    Missed that; trust assignments to isolate you from the news. Enjoy the celebrations.

  18. #118  
    Moderator Guitarist
    Join Date
    Jun 2005
    Posts
    1,620
    Quote Originally Posted by wallaby
    Tensor Space: a collection of objects called tensors formed out of the tensor product of two vector spaces.
    However, I would like to have thought of a definition not requiring the use of tensor-related concepts to whatever extent possible, and I would have liked to have thought of something more [can't pick the word] definitive or correct?
    Your definition of a tensor space is correct, good. As to your other comment, are you objecting to the use of the term "tensor product" in the definition? Like you think it's sorta self-fulfilling? Fine, call it the outer product if you want. Incidentally, when I said earlier that tensors can be multiplied, I should have said explicitly that this actually means using the tensor product, thus: (v⊗f_u)⊗(u⊗f_v) = v⊗u⊗f_u⊗f_v: V*×V*×V×V → R

    Earlier we defined the tensor product of two vectors as being a second rank tensor. I'm wondering if you can apply the idea generally to the tensor product of two vector spaces in order to obtain a tensor space.
    I don't understand your point; that's exactly what we did! Look: if I write f: X → Y, these being sets, I really mean take each x in X and find a y in Y s.t. f(x) = y. I work element-wise to find Y from X. Or: the Cartesian product of sets A×B takes each a in A and each b in B and pairs them off, one by one, through the whole set. Likewise, v⊗f_u refers to each v in V and each f_u in V*, which, taken in all pairwise combinations, make up the space V⊗V*, the tensor space.
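    The same pairwise picture in numpy (my sketch; all the vectors and covectors here are made up purely for illustration): the product of two type (1,1) tensors is a rank-4 array, and its value on four arguments is the product of the four pairings.

        import numpy as np

        v, u   = np.array([1.0, 2.0]), np.array([0.0, 1.0])
        fu, fv = np.array([2.0, 1.0]), np.array([1.0, 1.0])

        # (v⊗f_u)⊗(u⊗f_v) = v⊗u⊗f_u⊗f_v, a rank-4 tensor
        T = np.einsum('i,j,k,l->ijkl', v, u, fu, fv)

        # evaluate on (f_a, f_b, a, b) in V*×V*×V×V
        fa, fb = np.array([1.0, 0.0]), np.array([0.0, 1.0])
        a,  b  = np.array([1.0, 1.0]), np.array([2.0, 0.0])

        lhs = np.einsum('ijkl,i,j,k,l->', T, fa, fb, a, b)
        rhs = np.dot(v, fa) * np.dot(u, fb) * np.dot(fu, a) * np.dot(fv, b)
        print(np.isclose(lhs, rhs))  # True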

    Then there's the question of what about rank 0 tensors?
    OK, I didn't introduce these very well. A vector space is a set |V| of abstract objects together with an associated field K. The spaces V and V*, being duals, must associate with the same field, just the one. If |V| and |V*| are empty sets, we are simply left with the field. Or you can just take as a definition that a type (0,0) tensor is a scalar.
    I'm sure that by this definition, if it's right of course, we can say that the tensor space is constructed over the vector spaces of which it is a product.
    Hmm, I think I'd be more inclined to say it is over V, rather than over V and V*. Not sure, though; lemme check.

    So is there some connection between the Kronecker delta and tensors?
    Yes - it is a tensor! (of a rather special sort).
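    For concreteness, a little numpy sketch (purely illustrative): as a type (1,1) tensor the Kronecker delta is just the identity array - it sends every vector to itself, and its contraction δ^i_i gives the dimension of the space.

        import numpy as np

        n = 3
        delta = np.eye(n)                      # δ^i_j

        v = np.array([1.0, 2.0, 3.0])
        print(np.einsum('ij,j->i', delta, v))  # δ^i_j v^j = v^i -> [1. 2. 3.]
        print(np.einsum('ii->', delta))        # contraction δ^i_i = n -> 3.0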

  19. #119  
    Forum Professor wallaby
    Join Date
    Jul 2005
    Location
    Australia
    Posts
    1,521
    Quote Originally Posted by Guitarist
    Your definition of a tensor space is correct, good. As to your other comment, are you objecting to the use of the term "tensor product" in the definition? Like you think it's sorta self-fulfilling? Fine, call it the outer product if you want. Incidentally, when I said earlier that tensors can be multiplied, I should have said explicitly that this actually means using the tensor product, thus: (v⊗f_u)⊗(u⊗f_v) = v⊗u⊗f_u⊗f_v: V*×V*×V×V → R
    Yes, it was a rather irrelevant and superficial statement to make, and yet for some reason I put it in anyway. Tensor product or "Boot", it really doesn't matter when we use V⊗V*.

    Quote Originally Posted by Guitarist
    OK, I didn't introduce these very well. A vector space is a set |V| of abstract objects together with an associated field K. The spaces V and V*, being duals, must associate with the same field, just the one. If |V| and |V*| are empty sets, we are simply left with the field. Or you can just take as a definition that a type (0,0) tensor is a scalar.
    I see now.

    Quote Originally Posted by Guitarist
    Yes - it is a tensor! (of a rather special sort).
    Interesting.
