
Thread: Electromagnetic 4-force

  1. #1 Electromagnetic 4-force 
    Forum Sophomore
    Join Date
    Jul 2007
    Location
    South Africa
    Posts
    196
    Does anyone know of more 4-vector formulas with the cross product in them?

    I ask because I succeeded in writing the electromagnetic 4-force in a simpler form using my 4D vector product. This also reduces Maxwell's equations to two (not via the tensor formulation).


    It also matters what isn't there - Tao Te Ching interpreted.

  3. #2  
    Forum Junior c186282's Avatar
    Join Date
    Dec 2008
    Posts
    208
    Can you please show us what you mean by an n-dimensional cross product?
    What is:



  4. #3  
    Forum Sophomore
    Join Date
    Jul 2007
    Location
    South Africa
    Posts
    196
    There are three equivalent formulations: as a 3xn determinant; as a formula together with an extendable number triangle; and as the determinant expansion (like at Wikipedia) together with a table of cross products of unit vectors in nD.

    In 4D the result would be (using e1, e2, e3, e4 as unit vectors):







    where the 2x3 determinants reduce like:



    and the logic of the reduction is consistent and can be explained as follows: explicitly delete columns so that two remain (use a symbol like [] to denote the deleted column(s)), then perform column transpositions so that all deleted columns end up rightmost. Count the number of transpositions required and multiply the term by (-1) if it is odd. Take all combinations of deleted columns (each creating a 2x2 determinant term).
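
    As a rough sketch in code (one way to read the rule above: keeping columns i < j requires i + j - 1 swaps to push the deleted columns rightmost, so the term picks up a factor (-1)^(i + j - 1); the function name and the examples are mine):

    Code:
        from itertools import combinations

        def det_2xp(m):
            """Reduce a 2 x p array by the rule above: for every kept column
            pair i < j form the 2x2 determinant, with the sign fixed by the
            parity of the column swaps, i.e. a factor of (-1)**(i + j - 1)."""
            p = len(m[0])
            total = 0
            for i, j in combinations(range(p), 2):
                sign = (-1) ** (i + j - 1)
                total += sign * (m[0][i] * m[1][j] - m[0][j] * m[1][i])
            return total

        print(det_2xp([[1, 2], [3, 4]]))         # -2: the square 2x2 case gives the ordinary determinant
        print(det_2xp([[1, 0, 2], [0, 1, 3]]))   # -4: the sum over the three column pairs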
    It also matters what isn't there - Tao Te Ching interpreted.

  5. #4  
    Forum Radioactive Isotope MagiMaster's Avatar
    Join Date
    Jul 2006
    Posts
    3,440
    Your 2x3 determinant is rather arbitrary and completely inconsistent with both your 3x4 determinant and the general definition of determinants.

  6. #5  
    Forum Isotope
    Join Date
    Feb 2009
    Location
    Transient
    Posts
    2,914
    I don't really understand what you're doing, but my understanding of determinants and matrices is rather limited. Could you explain how you came to this example?
    Wise men speak because they have something to say; Fools, because they have to say something.
    -Plato


  7. #6  
    Forum Junior c186282's Avatar
    Join Date
    Dec 2008
    Posts
    208

    Therefore I can think of its rows or columns as vectors.

    If I have m vectors in an n-dimensional space where m > n (I have more vectors than there are dimensions in my space), then the collection of vectors must be linearly dependent. That means that I can take n linearly independent vectors from the set of m and write the others as linear combinations of those n. Therefore your non-square matrices must have a determinant of zero.
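
    A quick numerical illustration of that dependence (a minimal numpy sketch; the vectors are arbitrary, not taken from this thread):

    Code:
        import numpy as np

        # Five random vectors in R^3: more vectors than dimensions,
        # so they must be linearly dependent.
        vectors = np.random.default_rng(0).normal(size=(5, 3))

        print(np.linalg.matrix_rank(vectors))   # at most 3, never 5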

  8. #7  
    . DrRocket's Avatar
    Join Date
    Aug 2008
    Posts
    5,486
    Quote Originally Posted by c186282

    Therefore I can think of its rows or columns as vectors.

    If I have m vectors in an n-dimensional space where m > n (I have more vectors than there are dimensions in my space), then the collection of vectors must be linearly dependent. That means that I can take n linearly independent vectors from the set of m and write the others as linear combinations of those n. Therefore your non-square matrices must have a determinant of zero.
    Sorry, but the determinant is not even defined for matrices that are not square. They don't have a determinant at all.

  9. #8  
    Forum Isotope
    Join Date
    Feb 2009
    Location
    Transient
    Posts
    2,914
    Can you write the matrix out like



    with (h, i, j, k) being the unit vectors?

    Or is this just another example of "creative algebra" (the term I was taught in high school trig when we did something really weird)?
    Wise men speak because they have something to say; Fools, because they have to say something.
    -Plato


  10. #9  
    Forum Radioactive Isotope MagiMaster's Avatar
    Join Date
    Jul 2006
    Posts
    3,440
    Well, you might be able to make sense of that determinant if you can clearly define what a product of two unit vectors (like hi) is.

  11. #10  
    Forum Junior c186282's Avatar
    Join Date
    Dec 2008
    Posts
    208
    Quote Originally Posted by DrRocket
    Sorry, but the determinant is not even defined for matrices that are not square. They don't have a determinant at all.
    I should have been more clear, for that was my point.

  12. #11  
    Forum Junior c186282's Avatar
    Join Date
    Dec 2008
    Posts
    208
    Quote Originally Posted by Arcane_Mathamatition
    Can you write the matrix out like



    with (h, i, j, k) being the unit vectors?

    Or is this just another example of "creative algebra" (the term I was taught in high school trig when we did something really weird)?
    In principle there is nothing wrong with creative algebra; after all, that is where the great ideas come from.

    Once you expand your smaller matrices you will have terms that are the product of two unit vectors, like hi. What will your rules be here?

    There is something called a quaternion.

  13. #12  
    Moderator Moderator
    Join Date
    Jun 2005
    Posts
    1,620
    I cannot see where quaternions come into this. Anyway, by a known property of determinants, if any 2 rows (or columns) are equal, the determinant vanishes, so it's not obvious what you have achieved.

  14. #13  
    Forum Sophomore
    Join Date
    Jul 2007
    Location
    South Africa
    Posts
    196
    Let me first state that every non-square matrix has two determinants: one developed by row (DR) and the other by column (DC).

    I take the questions in order:

    1) The development by row is exactly like the square case until the 2xp stage. One can say the method is exactly the same, except that in the square case there is no non-squareness to deal with at the 2x2 stage. In the square case the logic also changes at the 2x2 stage (since you no longer delete rows and columns). It isn't arbitrary, as both the properties (see item 3) and the usefulness are maintained. In fact the logic reduces to the square case if the determinant is square (of course).

    2) I came to it via the cross product of two vectors in 4 and 5 dimensions (determined using the anti-commutativity and orthogonality properties). It is amazing that it turned out to be writable as a non-square determinant DR.

    3) The det(A) = det(A^T) property generalises as DR(A) = DC(A^T) and vice versa. DR (rows, n<m) and DC (columns, m<n) work as duals. Linear dependence is no problem for an nxm matrix if n<m and your dimension is m. The fact that linearly dependent rows imply DR = 0 does hold for n<m. You need separate considerations of the properties for n<m and m<n: you just consider row vectors if n<m and column vectors if m<n.

    4) It is a new definition.

    5) Yes, if you define hi = h, hj = h, hk = h, ih = i, ij = i, ik = i, etc., where the product is the ordinary number product. But in higher dimensions the same thing gets awkward (already in 5D): you would need to define products like hij and hji. The definition may have required a creative step, but it proves useful and satisfies the properties (suitably restated). Your creative steps are not necessarily wrong. One can work out whether such products are extendable, and by what logic, if you insist on square determinants, but then you miss the generalisation to arbitrary non-square determinants, and I have shown they are useful in physics.
    It also matters what isn't there - Tao Te Ching interpreted.

  15. #14  
    Forum Radioactive Isotope MagiMaster's Avatar
    Join Date
    Jul 2006
    Posts
    3,440
    A nitpick, but the delete-rows-and-columns method does still work on a 2x2 matrix. You're left with two 1x1 matrices, each of which is just a scalar.

  16. #15  
    Forum Isotope
    Join Date
    Feb 2009
    Location
    Transient
    Posts
    2,914
    After getting a big enough bug up my butt to go look all of these up on the wiki, I have come to the understanding that the cross product only works because the unit vectors are orthogonal to each other in such a way that the cross product of any combination, i x j, j x k, and so forth, equals the remaining unit vector. I don't think you can cross two 4-dimensional vectors, as there will be an infinite number of orthogonal vectors that the result could equal... I think you need three 4-d vectors to cross... but this is just me speculating.
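
    For what it's worth, there is a standard construction along exactly those lines: crossing three vectors in 4D (in general, n-1 vectors in nD) gives the vector of signed 3x3 minors, which is orthogonal to all three inputs. A minimal numpy sketch (the function name and the example vectors are mine):

    Code:
        import numpy as np

        def cross_nd(*vectors):
            """Cross product of n-1 vectors in R^n: the i-th component is a signed
            (n-1)x(n-1) minor of the matrix whose rows are the input vectors."""
            m = np.array(vectors, dtype=float)        # shape (n-1, n)
            k, n = m.shape
            if k != n - 1:
                raise ValueError("need exactly n-1 vectors from R^n")
            return np.array([(-1) ** (n - 1 + i) * np.linalg.det(np.delete(m, i, axis=1))
                             for i in range(n)])

        u, v, w = np.eye(4)[:3]                       # e1, e2, e3 in R^4
        x = cross_nd(u, v, w)
        print(x)                                      # [0. 0. 0. 1.]
        print([float(x @ a) for a in (u, v, w)])      # all zeros: orthogonal to u, v and w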


    Okay, that said, let me disregard it and be creative. First, I need some 4-dimensional definitions:



    where jk, hk, hi, ik, hj, and ij are second-order unit vectors, defined so that the first "component" unit vector is positive and the second "component" is negative.



    It will reduce to an alternating (+, -, +) sum of 3x3 determinants, and so on and so forth through all of the 3x3 matrices... I am probably missing some points though, so I will think on this over the next couple of days and report back.
    Wise men speak because they have something to say; Fools, because they have to say something.
    -Plato


  17. #16  
    Forum Radioactive Isotope MagiMaster's Avatar
    Join Date
    Jul 2006
    Posts
    3,440
    If you're going by that definition, then .

  18. #17  
    Forum Isotope
    Join Date
    Feb 2009
    Location
    Transient
    Posts
    2,914
    ummmm.... yeah, basically. Like I said, I'm probably missing some stuff
    Wise men speak because they have something to say; Fools, because they have to say something.
    -Plato


  19. #18  
    Forum Sophomore
    Join Date
    Jul 2007
    Location
    South Africa
    Posts
    196
    Looks like you are deriving the 4D version another way.

    1) Yes it is still deleting rows and columns, but my logic extends the concept - like any other generalisation. You need a leap to "make sense" of it.

    You also have:

    (u × v)|nD = u|nD × v|nD

    if u and v are two mD vectors and n ≤ m, which doesn't hold for the 7D octonion vector product.

    2) Yes there would be an infinity of them, but my product gives you a special member of them - namely the one you would find on demanding the product is that of a Lie Algebra (it satisfies the Jacobi condition).

    The unit vectors are considered orthogonal at the outset.

    I found it necessary to define cross products of unit vectors as a sum of the rest, like:









    continuing like this.

    Conceptually, the reduction of cross products of unit vectors is then a map from "second-order unit vectors" to a sum of unit vectors. You need the determinant expansion to use these definitions, though:




    If this logic of yours using two rows of unit vectors is equivalent, then the non-square determinant deserves serious consideration, because it is more general.
    It also matters what isn't there - Tao Te Ching interpreted.

  20. #19  
    Forum Isotope
    Join Date
    Feb 2009
    Location
    Transient
    Posts
    2,914
    Just to clarify, you don't get a vector that way. You get a plane in 4 dimensions, a volume in 5D, a hyper-volume (a 4-d object) in 6D, etc., etc. (with only 2 vectors).

    I don't understand how you would get anything like a vector this way.
    Wise men speak because they have something to say; Fools, because they have to say something.
    -Plato


  21. #20  
    Forum Sophomore
    Join Date
    Jul 2007
    Location
    South Africa
    Posts
    196
    The symbols say you get a vector, no matter in what dimension. You were two steps away from it yourself, with the definition of those cross products of unit vectors. Somehow you at least saw that every unit vector not on the left-hand side must occur on the right-hand side. I just don't know if you had the determinant expansion in mind.

    It looks like there is a conceptual block on a more abstract level.
    It also matters what isn't there - Tao Te Ching interpreted.

  22. #21  
    Forum Isotope
    Join Date
    Feb 2009
    Location
    Transient
    Posts
    2,914
    No, it's not that, but think about it: there is a WHOLE plane that is orthogonal to two vectors in 4D... there really is, and my method should have produced a plane, not a vector. Those two vectors lie within a plane, and any plane has only one vector "direction" orthogonal to it in 3D, but if you go to 4D, there is an entire plane that will satisfy the orthogonality that the cross product produces. Take the two unit vectors h and i, for example: it is clear that the vector j is orthogonal to both, correct? But so is the vector k, and so is any vector in the j-k plane (taken from a coordinate system of (h, i, j, k)). There is a plane orthogonal to two vectors, because there is a plane's worth of directions orthogonal to a plane in 4D. I don't see how any specific ONE vector in that plane is special.
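
    Here's that count numerically (a minimal numpy sketch using the standard basis; nothing here depends on the thread's product): the directions orthogonal to two independent vectors in 4D form a 2-dimensional space, i.e. a whole plane.

    Code:
        import numpy as np

        h = np.array([1., 0., 0., 0.])
        i = np.array([0., 1., 0., 0.])

        # Everything orthogonal to both h and i is the null space of this 2x4 matrix.
        _, s, vt = np.linalg.svd(np.vstack([h, i]))
        rank = int(np.sum(s > 1e-12))
        orthogonal_basis = vt[rank:]          # rows spanning the orthogonal complement

        print(orthogonal_basis.shape[0])      # 2: a whole plane of orthogonal directions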
    Wise men speak because they have something to say; Fools, because they have to say something.
    -Plato


  23. #22  
    Forum Radioactive Isotope MagiMaster's Avatar
    Join Date
    Jul 2006
    Posts
    3,440
    Well, he mentioned applying the Jacobi equation to that plane to single out a direction, but I can't really see how that'd work.

    Edit: Nevermind, not the Jacobi equation (or the Jacobi condition), but the Jacobi identity from Lie algebras. I can't tell just by looking if that'd work out or not. I'd have to sit down and work it out.

  24. #23  
    Forum Sophomore
    Join Date
    Jul 2007
    Location
    South Africa
    Posts
    196
    That determinant expansion is not my own; it was already worked out. If you take it at face value, it tells you that you can get one "second-order vector" in any dimension.

    A second-order vector is defined by the wedge product of two vectors (which is why I put it in quotes).

    All you need then is to see whether you can reduce those cross products of unit vectors by extending the rather basic logic of the 3D case, then work the logic through any existing properties and related concepts already worked out. It is the compatibility with a consistent system that matters.

    Look closely and you'll see that i < j implies the anti-commutative terms don't occur.

    2) You just show that the vector product definition satisfies the Jacobi identity.
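
    For instance, a numerical check along these lines (a generic sketch: the familiar 3D cross product stands in for the product being tested, since the 4D definition isn't written out in this thread):

    Code:
        import numpy as np

        def jacobiator(prod, x, y, z):
            """x*(y*z) + y*(z*x) + z*(x*y); it vanishes identically for a Lie bracket."""
            return prod(x, prod(y, z)) + prod(y, prod(z, x)) + prod(z, prod(x, y))

        rng = np.random.default_rng(0)
        for _ in range(5):
            x, y, z = rng.normal(size=(3, 3))
            assert np.allclose(jacobiator(np.cross, x, y, z), 0)   # the 3D cross product passes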
    It also matters what isn't there - Tao Te Ching interpreted.

  25. #24  
    Forum Isotope
    Join Date
    Feb 2009
    Location
    Transient
    Posts
    2,914
    There still doesn't seem to me to be anything special about it though. I'm sorry, but it just doesn't make sense to me to single out one direction off of an entire plane.
    Wise men speak because they have something to say; Fools, because they have to say something.
    -Plato


  26. #25  
    Forum Sophomore
    Join Date
    Jul 2007
    Location
    South Africa
    Posts
    196
    It is special because it reduces Maxwell's Equations to two.

    Isn't it more desirable to have systems of order with a distinguished element of an infinite set (like 0 and 1 for numbers, ideals, units in groups, and the like)?

    I'm not singling it out; the system is telling you there is one.
    It also matters what isn't there - Tao Te Ching interpreted.

  27. #26  
    Forum Isotope
    Join Date
    Feb 2009
    Location
    Transient
    Posts
    2,914
    But the system is saying this based on how you manipulated the definitions, as I manipulated them too. In my defense, I was trying to get the equation of the plane, not a vector.
    Wise men speak because they have something to say; Fools, because they have to say something.
    -Plato


  28. #27  
    Moderator Moderator
    Join Date
    Jun 2005
    Posts
    1,620
    Quote Originally Posted by talanum1
    It is special because it reduces Maxwell's Equations to two.
    I also can reduce Maxwell's equations to two, using differential forms, the exterior derivative and the Hodge operator. So what's new?
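
    (For reference, written in that language the two equations are

    dF = 0 and d(*F) = *J,

    with F the field-strength 2-form, J the current 1-form and * the Hodge star; with no sources the second becomes d(*F) = 0.)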

  29. #28  
    Forum Sophomore
    Join Date
    Jul 2007
    Location
    South Africa
    Posts
    196
    1) Isn't the definition of the 3D vector product a stretching of the definition of determinants? There are other examples of such stretches: complex numbers, etc. It stretches, but it fits into a consistent system in several ways.

    2) Non-square determinants and the nD vector product of two vectors are new. If other formulas of nD vectors with the cross product in them occur (as in string theory/supergravity/supersymmetry), this would apply. It is not a limited definition at all. This also allows the curl operation to extend to nD.
    It also matters what isn't there - Tao Te Ching interpreted.

  30. #29  
    Forum Isotope
    Join Date
    Feb 2009
    Location
    Transient
    Posts
    2,914
    Please excuse my ignorance, but I didn't understand any of that post.
    Wise men speak because they have something to say; Fools, because they have to say something.
    -Plato


  31. #30  
    Forum Sophomore
    Join Date
    Jul 2007
    Location
    South Africa
    Posts
    196
    It is already a stretch to put unit vectors in place of numbers inside the determinant.

    The square determinant is a product-combinatorial device. So is the non-square case.
    It also matters what isn't there - Tao Te Ching interpreted.

  32. #31  
    Forum Sophomore
    Join Date
    Jul 2007
    Location
    South Africa
    Posts
    196
    Can someone please suggest where I can apply for a grant (or a bursary for further study)? This would enable me to continue working like this.
    It also matters what isn't there - Tao Te Ching interpreted.

  33. #32  
    Forum Bachelors Degree
    Join Date
    Mar 2009
    Posts
    421
    I haven't read all the posts, so maybe this has already been covered.

    First off: There already is a well-known multiplication operation for 4-d vectors. It's called quaternionic multiplication. If you've defined a multiplication for 4-d vectors that's distributive and has multiplicative inverses, you've probably just rediscovered this well-known structure.
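
    For concreteness, here is the standard Hamilton product on 4-tuples (w, x, y, z) (a minimal sketch, not anyone's code from this thread):

    Code:
        def quat_mul(p, q):
            """Hamilton product of quaternions p = (w, x, y, z) and q likewise."""
            pw, px, py, pz = p
            qw, qx, qy, qz = q
            return (pw*qw - px*qx - py*qy - pz*qz,
                    pw*qx + px*qw + py*qz - pz*qy,
                    pw*qy - px*qz + py*qw + pz*qx,
                    pw*qz + px*qy - py*qx + pz*qw)

        i, j = (0, 1, 0, 0), (0, 0, 1, 0)
        print(quat_mul(i, j))   # (0, 0, 0, 1), i.e. i*j = k
        print(quat_mul(i, i))   # (-1, 0, 0, 0), i.e. i*i = -1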

    Second: Maxwell's equations in a vacuum can be written in one equation,

    ΔF = 0,

    where F is the electromagnetic field (E, B) written as a 2-form, and Δ is the Hodge Laplacian on forms.

    So, though I don't know for sure what you've been working on, it sounds a lot like work that's already been done. I know that doesn't come as welcome news, but if someone has beaten you to the punch, better to know sooner than later.

    Incidentally, these days it's very hard to come up with any new contributions to math/physics at the classical level. To have a chance of discovering something new, you need to study things at the quantum level. Roughly speaking, this means that instead of looking at some classical object, like a vector bundle on a space, you look at the moduli space of all vector bundles on that space satisfying some properties. Or, instead of looking at differential operators like the Laplacian, you look at infinite-dimensional analogues of those operators on the loop space. Or, instead of studying the representation theory of Lie groups, you study representations of affine Lie algebras (i.e., infinite-dimensional extensions of Lie algebras).

    I don't claim that any of those roads is easy. But that is the uncharted territory in math/physics, and that is the domain where you have a chance of doing something new.

  34. #33  
    Forum Sophomore
    Join Date
    Jul 2007
    Location
    South Africa
    Posts
    196
    You would have seen that my definition is valid for all dimensions larger than two, not just 4D.

    Interesting equation, though; it must be newish.
    It also matters what isn't there - Tao Te Ching interpreted.

  35. #34  
    Forum Bachelors Degree
    Join Date
    Mar 2009
    Posts
    421
    Quote Originally Posted by talanum1
    You would have seen that my definition is valid for all dimensions larger than two, not just 4D.

    Interesting equation, though; it must be newish.
    Regarding the equation I gave, that is not new at all--my guess is it's been around since the 1930s or 40s.

    As for your multiplicative structure, you should check to make sure you're not just redefining something like Clifford multiplication.

  36. #35  
    Forum Sophomore
    Join Date
    Jul 2007
    Location
    South Africa
    Posts
    196
    Doesn't that equation require a gauge condition?

    Looks like I did something very hard, then. That equation comes from more complex ideas. It seems like taking duals of bivectors is more complex, because it is one logic level up from vectors (you need the specific dual to enter the logic). Something must be valid in a broader sense for both vectors and one-forms if they are duals, and I did not see this made explicit.

    It isn't the Clifford product, because the cross product transforms bivectors into ordinary vectors, and the Clifford product has e_i e_i = -1 while e_i × e_i = 0 for the vector product.

    My definition would allow advances in possible higher-dimensional formulas of the future. No one has yet come up with a symbolisable reason to reject my definition, and a single terminological extension can hardly require new methods foreign to the logical background of the original terminology.

    The simpler form of the electromagnetic 4-force does not come at the expense of a simple logical background.

    Can't quantisation be done from the other direction (start with natural numbers or lattices)?
    It also matters what isn't there - Tao Te Ching interpreted.

  37. #36  
    Forum Bachelors Degree
    Join Date
    Mar 2009
    Posts
    421
    Having taken a closer look at the definition you gave, it strikes me as highly unlikely that this definition is useful for anything. For example, it's not at all clear to me that this product is even distributive.

    In any event, simply defining a product is not really that interesting unless you can use it to prove something new.

  38. #37  
    Forum Sophomore
    Join Date
    Jul 2007
    Location
    South Africa
    Posts
    196
    I can do a lot with it. The non-square determinant can be used to get left or right inverses for non-square matrices (you know they do not have two-sided inverses), aside from the physical application already stated. I haven't seen a lot written about non-square matrices classically. Maxwell's equations become simpler in any formulation in free space (no sources).

    It satisfies bilinearity, and therefore distributivity, and this is easy to prove. The paper on its properties is available (currently being considered for publication; ask for a copy).
    It also matters what isn't there - Tao Te Ching interpreted.

  39. #38  
    Forum Bachelors Degree
    Join Date
    Mar 2009
    Posts
    421
    It's literally impossible for a non-square matrix to have both a left and right inverse. For example, let A be a pxq matrix, where p > q. Suppose B is a right inverse.

    Since the rank of A is at most q < p, there must be a vector v in R^p which is not in the image of A. But if B is a right inverse, A(Bv) = (AB)v = v, which is a contradiction.

    Otherwise, suppose that A is a pxq matrix with p < q. Suppose that C is a left inverse. Since rank(A) <= p < q, there must be a non-zero vector w in the null-space of A. But if C is a left inverse, CAw = w, contradicting that Aw = 0.

    So non-square matrices cannot have both left and right inverses.
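
    A quick numerical illustration of the one-sided situation (a generic numpy sketch with an arbitrary full-rank 3x2 matrix, not one from this thread): the pseudoinverse supplies a left inverse, while no right inverse can exist.

    Code:
        import numpy as np

        A = np.array([[1., 0.],
                      [0., 1.],
                      [1., 1.]])              # 3x2, full column rank (p > q)
        L = np.linalg.pinv(A)                 # Moore-Penrose pseudoinverse, 2x3

        print(np.allclose(L @ A, np.eye(2)))  # True: a left inverse exists
        print(np.allclose(A @ L, np.eye(3)))  # False: a right inverse cannot exist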

  40. #39  
    . DrRocket's Avatar
    Join Date
    Aug 2008
    Posts
    5,486
    Quote Originally Posted by salsaonline
    It's literally impossible for a non-square matrix to have both a left and right inverse. For example, let A be a pxq matrix, where p > q. Suppose B is a right inverse.

    Since the rank of A is at most q < p, there must be a vector v in R^p which is not in the image of A. But if B is a right inverse, A(Bv) = (AB)v = v, which is a contradiction.

    Otherwise, suppose that A is a pxq matrix with p < q. Suppose that C is a left inverse. Since rank(A) <= p < q, there must be a non-zero vector w in the null-space of A. But if C is a left inverse, CAw = w, contradicting that Aw = 0.

    So non-square matrices cannot have both left and right inverses.
    Alternately, and the gist of this is implicit in your proof, one can view A as a linear transformation and prove the theorem without direct reference to coordinates or dimension. (I am certain that you know this, but it might be instructive to the wider audience.) If A has a left inverse then A is injective (one-to-one), and if A has a right inverse then A is surjective (onto). So if A has both a left inverse and a right inverse then A is bijective, the left and right inverses are the same, and the matrix representation must therefore be square.

  41. #40  
    Forum Bachelors Degree
    Join Date
    Mar 2009
    Posts
    421
    Non-mathematicians don't typically think of things in terms of linear maps, as natural as that point of view is. So I was trying to confine my explanation to beginning linear algebra terminology.

    Besides, the underlying reasons for the conclusions you're making are the reasons I gave.

  42. #41  
    Forum Sophomore
    Join Date
    Jul 2007
    Location
    South Africa
    Posts
    196
    I said left OR right inverses, not both on the same matrix, i.e.:

    A^(-1) A = I or B B^(-1) = I.

    You can also get a right inverse on A or a left inverse on B (except when singular), but not by this method, and the right and left inverses are not the same. Whether you can get a left or a right inverse by this method depends on whether the matrix has more columns or more rows.

    This isn't an abuse of the term "inverse", since the specification is stated.
    It also matters what isn't there - Tao Te Ching interpreted.

  43. #42  
    Forum Sophomore
    Join Date
    Jul 2007
    Location
    South Africa
    Posts
    196
    The articles (two of them) have been posted at the following addresses. Those who are curious are invited to read them and comment.

    My m-dimensional, m >= 3, vector product does not conform to the "neat" psychology of 2^n algebras (quaternions, octonions) - decide for yourself.

    Another more detailed version with all of the properties is available.

    Main paper:

    http://www.scribd.com/doc/18024743/4-Force

    Supplement paper:

    http://www.scribd.com/doc/18024749/V...duct-Reduction
    It also matters what isn't there - Tao Te Ching interpreted.

  44. #43  
    Forum Sophomore
    Join Date
    Jul 2007
    Location
    South Africa
    Posts
    196
    For those who are still confused: one may say that the logic of the square case is equivalent to that of the non-square case, i.e. that "after development to the 2-rows (or 2-columns) stage, whatever is left reduces to a sum of all possible 2x2 subdeterminants (for every term) while respecting the + - + sign structure". In the square case there is of course just one such subdeterminant for each term, while in the non-square case there is more than one.

    One sees that the row/column swapping after row/column deletion falls into two cases (for example, the 2x4 case below, whose columns carry the signs + - + -):

    +-+-

    keeping a +- or -+ pair of columns gives a +1 multiplier;

    keeping a ++ or -- pair gives a -1 multiplier.
    It also matters what isn't there - Tao Te Ching interpreted.

  45. #44  
    Forum Sophomore
    Join Date
    Jul 2007
    Location
    South Africa
    Posts
    196
    salsaonline makes it seem like the differential-forms way is simpler. It isn't, because you need another equation, d(*F) = *J, for when the sources are not zero. Moreover, you need more ideas too: differential forms, the exterior derivative, Hodge star duality, the metric tensor, components of p-forms, the electromagnetic potential, the Pfaff topological dimension; and these need tensors and some of their manipulation rules (summation over repeated indices, anti-symmetrisation, the Levi-Civita tensor, index gymnastics) to prove that the differential forms are equivalent to Maxwell's equations. See:

    http://indexguy.wordpress.com/2007/0...rential-forms/

    My way only uses vectors, the cross product, the replacement operator and the del operator.
    It also matters what isn't there - Tao Te Ching interpreted.
