what is the antiparticle of the photon?
It is its own antiparticle, so to speak, as it has neutral charge.
It's an interesting question. It clearly isn't just a matter of whether or not the particle has charge, as Q has suggested, because it is well known that there is an antineutrino and an antineutron, neither of which is charged. Perhaps the fact that the photon is a boson has something to do with it.
http://en.wikipedia.org/wiki/Antiparticle
Originally Posted by Wikipedia
Originally Posted by JaneBennet
Wikipedia should be in the sci-fi search engine index. That is just totally ridiculous compared to everything I ever learned, used, and/or built.
Sincerely,
William McCormick
Exactly what have you built that has anything to do with the properties and/or existence of neutrons?
Just about anything any human being has built, from spears to spacecraft, will have contained neutrons and indirectly depended on their properties (of which existence is the most fundamental). But if Bill has been building anything to do with antimatter, then, given his daring scientific ideas, pardon me if I express a certain amount of concern.
Originally Posted by MagiMaster
Take cover,
Leszek.
Some of your countrymen have just finished building a house in my area.
Originally Posted by Leszek Luchowski
I heard they did a good job, but I am sure this had nothing to do with their knowledge, or lack of knowledge, about the existence and properties of neutrons.
So am I. And your point is ... ?
Originally Posted by Halliday
When replying to the previous post, MagiMaster was not saying that physical objects do not contain neutrons. He was asking how the previous poster could decide, partly on the basis of having built certain objects, that the Wiki article was nonsense.
Originally Posted by MagiMaster
I see. Actually, I had tried to be witty, but severe hypocaffeinemia thwarted my attempts. A thousand pardons.
Originally Posted by Halliday
Cheers, L.
Things I build use the neutral nature of matter and ambient radiation. That was what was originally called the neutron. Not a particle or anything like that.
Originally Posted by MagiMaster
The neutron was actually a German thing describing the ether, and the way that ambient radiation could either add an abundance of electrons to, say, a bomb, or take away an abundance of electrons, like during the creation of an X-ray.
http://www.rockwelder.com/Electricit...page/Howto.htm
Check out the bottom of page 47 and the top of 48.
Now granted, none of the science writers that have survived are what I would call top 90 percentile writers or scientists. They have a lot of preconceived notions and wrong ideas that often conflict with themselves.
However, there are some bits and pieces of history there that you can view, and understand how science became so screwed up.
The neutron was a misunderstood German concept that many of our science writers and poor scientists, like Chadwick, Einstein, Fermi and others, could not grasp.
Our great scientists totally understood the ether and the neutral effect of the ether, or what they called the ether (ambient radiation).
This was when American and German scientists were still trading the most outrageous science back and forth, during the start of World War Two. But only good scientists were trading information.
Evil scientists were trying to end the world, with paranoia and secret cover-ups and secret experiments on both sides.
Sincerely,
William McCormick
No need to apologise. I can only ask for pardon for having missed your flashing wit!
Originally Posted by Leszek Luchowski
Cheers.
This sounds like someone else we all know. :P
Originally Posted by William McCormick
Originally Posted by MagiMaster
And yet you have never highlighted one conflicting point?
Because there are none in my all electron universe.
Sincerely,
William McCormick
William, science is built on math. The Uncertainty Principle, for example, has an English translation of something like "the velocity and position of a particle cannot be known simultaneously." However, that's only a translation. The real Uncertainty Principle is Δx · Δp ≥ ħ/2. Your ideas have no math behind them because you don't believe in math. Your theories are just imprecise words. If we haven't pointed out any specific contradictions in your ideas, it's because your ideas are too vague to mean anything, much less anything particularly contradictory.
Originally Posted by MagiMaster
Take a look at your first statement there. That science is built on math.
Science works without an accountant. The universe continues along just fine.
I will, on the other hand, having been a mathematician and honors math student, admit that math is probably one of the most important tools the scientist has.
However, it is rarely used to uncover new and sensational finds. The sensation, if you go back over the years, was nothing more than poor scientists bringing out scientific evidence with a lot of publicity. And often a lot of error.
The math is basically a tallying, an accounting of experimental accuracy and inaccuracy, so that other scientists can either verify or find fault or differences in materials used in other countries.
But all these crazy black hole formulas are like little kids running around with a ray gun and making believe they are on the moon.
You do not have to believe in math. Good math is rarely complicated, and proves itself. There is little arguing, and even less misunderstanding, while using good math.
We currently do not even use the proper order of mathematics. So I highly doubt you know whether or not I like or use math, because you probably have never seen it.
Sincerely,
William McCormick
William, I'm a mathematician. I know what math is and how it's used. I know you don't like or use math because you've said so yourself.
Originally Posted by William McCormick
You are right that math is the most important tool in science though.
I love math, use it in real life all the time. I write macros for CADD using math.
Originally Posted by MagiMaster
I just hate that the order of mathematics is not addition, subtraction, multiplication and division, as I learned it. I learned it both ways, actually.
This symbol "÷" did not mean a fraction when I learned math. It was the in-line division symbol: it meant take everything to the left of it and divide it by everything to the right of it. The "/" symbol meant fraction.
The capital "X" also meant to take everything to the left of it and multiply it by everything to the right of it, so that you were free to do all your addition and subtraction first, and then the large "X" would multiply everything on the right by everything on the left. The small "x" was used instead of parentheses if you wanted to just multiply two numbers; it was the same as parentheses around two numbers being multiplied.
But that appears to be gone forever so I don't want to argue it.
The guys on the CADD forum were wondering about my spacing system, so I made a movie of it in action.
It is pretty cool; that spacing program will also set objects that overlay each other, like louvers in condenser-air through-the-window ducts on large package AC units in large buildings. You just use a negative spacing input.
http://www.rockwelder.com/Flash/Spac...acingMacro.htm
There is a lot of cool math going on there. A super time saver.
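For what it's worth, the arithmetic behind that kind of spacing feature can be sketched in a few lines. This is only a guess at the idea being described (evenly placing copies of an object, with a negative spacing allowed so the copies overlap like louvers), not the actual macro:

# Sketch of an even-spacing calculation (an illustration only, not the actual CADD macro).
# Place "count" copies of an object of a given length along a line, separated by "spacing".
# A negative spacing makes the copies overlap, louver-style.
def spaced_positions(start, length, spacing, count):
    step = length + spacing               # distance from one copy's left edge to the next
    return [start + i * step for i in range(count)]

print(spaced_positions(0.0, 10.0, 2.0, 4))    # [0.0, 12.0, 24.0, 36.0]  (2-unit gaps)
print(spaced_positions(0.0, 10.0, -3.0, 4))   # [0.0, 7.0, 14.0, 21.0]   (overlapping copies)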
Sincerely,
William McCormick
How you do math or how I do math isn't math, it's only a way of doing math. The difference may not be obvious, but it is important.
Math is one of the few things, because it is not really complicated like the Universe, that you can hone to the finest possible system of totalling things to an exact outcome.
Originally Posted by MagiMaster
That means that, as you say, it matters not who does the math, but who does the best, fastest math. The math that has the best, most understandable language, that allows you to input massive amounts of data in a streamlined manner and have a neat, orderly trail to check many times, is the math.
We do not have that now. That is why we are not on the moon.
Sincerely,
William McCormick
William, your way of writing an expression is exactly the same as mine; it expresses the same calculation. It's just that since more people use my way than your way, my way is more readable. Not more correct, just more readable.
I am saying that at one time, some years ago in my grade school days, math was still taught with a capital "X" meaning take everything to the left of it in the whole formula and multiply it by everything to the right of it, up to a division symbol "÷".
Originally Posted by MagiMaster
The little or lower-case "x" meant take the two integers or variables on either side of the lower-case "x" and multiply them by each other, much like a fraction but multiplying instead. If you do a lot of measuring and creating in CADD, my system is a dream.
You can intermix fractions and decimals with no parentheses, and total up small, actually measured increments using the "+" addition symbol, used to concatenate objects in a formula. We originally learned you do not break the chain unless you ran into the "/" or the "x" symbol.
The current system is nothing but error for high-speed entry of information for processing, often because you have to enter the formula in reverse in order to enter the parentheses in their right order. In other words, you have to work out the end of the formula before you can total up what is going to have to be placed in the senseless parentheses. You need to know what it is you want to do with the added-up values at the end of the formula.
This is a redundant, nightmare-like problem if you really enter a lot of measurements and perform calculations on them. Nothing but parentheses.
Sincerely,
William McCormick
You know, if you took the time to learn it, I'm sure you'd love Polish notation.
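For anyone curious, here is a minimal sketch of why that suggestion comes up: Polish-style notation does away with parentheses entirely. This illustration uses reverse Polish (postfix) notation, a close cousin of the prefix Polish notation mentioned above, evaluated with a simple stack in Python; it is only an example, not anything from the posts themselves.

# Minimal reverse-Polish (postfix) evaluator - parentheses are never needed:
# (2 + 3) * (4 + 5) is simply written "2 3 + 4 5 + *".
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul, "/": operator.truediv}

def eval_rpn(expression):
    stack = []
    for token in expression.split():
        if token in OPS:
            b = stack.pop()
            a = stack.pop()
            stack.append(OPS[token](a, b))
        else:
            stack.append(float(token))
    return stack[0]

print(eval_rpn("2 3 + 4 5 + *"))   # 45.0, same as (2 + 3) * (4 + 5)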
Originally Posted by MagiMaster
No, I saw it. It just looked like a waste of time. The system I am talking about was built to burn.
Sincerely,
William McCormick