i've grown confused of late, happens when i think too hard.
i was wondering if i could get a simple definition of Entropy.
Originally Posted by wallaby:
i've grown confused of late, happens when i think too hard.
i was wondering if i could get a simple definition of Entropy.

The problem is that oversimplification leads to misconceptions.
Entropy is derived from counting and probability.
Consider taking a bucket of a hundred dice and rolling them all at once. There is only one way they could all add up to 600, but there are many, many ways that you could get a total of 350. There are a total of 501 different sums that could result - all the numbers from 100 to 600. When you add up the number of ways you could get each of these sums, you get a total of 6^100 = 6.5x10^77 possible outcomes across all the numbers from 100 to 600. Most of these 6.5x10^77 ways things could happen are for the sums close to 350. So we say that a result close to 350 is a high entropy state and a result close to 600 or to 100 is a low entropy state. In fact there is a formula for the entropy of each sum, which is essentially the logarithm of the number of ways the dice can add up to that sum.
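The counting above is easy to check by brute force. Here is a sketch (not from the thread, just an illustration) that counts, with exact integer arithmetic, how many ways 100 dice can add up to each possible sum, then takes the logarithm of a multiplicity as the "entropy" of that sum:

```python
import math

NUM_DICE = 100

# ways[s] = number of ways the dice rolled so far can total s
ways = {0: 1}
for _ in range(NUM_DICE):
    new_ways = {}
    for total, count in ways.items():
        for face in range(1, 7):
            new_ways[total + face] = new_ways.get(total + face, 0) + count
    ways = new_ways

print(ways[600])                   # exactly 1 way: all dice show six
print(ways[350] > 10**70)          # astronomically many ways to reach 350
print(sum(ways.values()) == 6**NUM_DICE)   # total outcomes: 6^100 ~ 6.5x10^77
print(math.log(ways[350]))         # "entropy" of the 350 state: ln(multiplicity)
```

As expected, the single largest multiplicity sits at 350, and the extremes 100 and 600 each have exactly one way to occur.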
Now why does this have anything to do with physics? It is because most of the things we see are the result of adding up a bunch of random events. For example, if those 100 dice are black dots on white cubes and we look down on the result from a great height, instead of seeing the dots on the dice we would only see a grayish blur. The 600 result would be the darkest gray and the 100 result would be the lightest gray.
If you put the dice on a large aluminum tray, all with the six side up, and have a group of helpers shake the tray while you watch from a great height, you will see the dark gray change to the middle gray, but you will never see the middle gray go back to either dark gray or light gray. This is the second law of thermodynamics: entropy increases, going from states of low entropy like the dark gray to the high entropy state of middle gray.
It is all about probability. The dark and light states are just very unlikely. But this is nothing compared to how unlikely it would be if you had 6.022x10^23 dice instead of only 100 dice! Then we could be very confident indeed that the middle gray would never become dark gray or light gray.
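The tray experiment can be simulated directly. This sketch (an illustration under the thread's assumptions, not a definitive model) starts from the all-sixes state and "shakes" by rerolling all 100 dice many times; the sum immediately jumps into the crowded region near 350 and never wanders back anywhere near 600 or 100:

```python
import random

random.seed(1)          # fixed seed so the run is reproducible
NUM_DICE = 100

total = 6 * NUM_DICE    # start in the low-entropy state: every die shows 6
sums = []
for shake in range(10_000):
    total = sum(random.randint(1, 6) for _ in range(NUM_DICE))
    sums.append(total)

# After 10,000 shakes the sum stays in a narrow band around 350;
# the extremes 100 and 600 are never seen again.
print(min(sums), max(sums))
```

With 6x10^23 dice instead of 100, the band around the mean would be relatively far narrower still, which is why macroscopic systems never visibly run backwards.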
People like to equate entropy with "disorder", whatever that means, but I think this is very misleading. One could just as well equate it with a measure of "mediocrity".
Thanks for the clarification.
mitchel's definition was really good.
Generally, entropy is defined as disorder.
The formula is S = k ln W
- S => entropy
- W => number of states
- k => Boltzmann's constant
ln 1 = 0, so if you only have one state the entropy = 0.
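That formula is easy to evaluate numerically. A minimal sketch (the function name is illustrative, not a standard API), using the exact SI value of Boltzmann's constant:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant in J/K (exact since the 2019 SI redefinition)

def boltzmann_entropy(num_states):
    """S = k * ln(W), entropy in joules per kelvin."""
    return k_B * math.log(num_states)

print(boltzmann_entropy(1))        # 0.0 -- a single state carries zero entropy
print(boltzmann_entropy(6**100))   # tiny but positive: ~100 dice, any outcome allowed
```

Note that even 6^100 microstates give an entropy on the order of 10^-21 J/K, which shows how enormous W must be for everyday entropy values.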
ya wanna know why i was drawn to this post? entropy.
the first computer game i played was crash bandicoot. cool name.. and characters.
i played 3 - warped. it was about time travel. and the creator of the machine.. was none other than... DR. N. Tropy!
i now see where his name comes from! numbers!
sorry for interrupting with pointless drivel...