
Originally Posted by yohann
This is a very fundamental question, but most statistics books don't give a concrete reason to justify the use of the standard deviation; they just say it is better, or that squaring the deviations avoids the problem of negative deviations cancelling positive ones. I don't think that's a reasonable explanation, because you could just take absolute values instead.
Average deviation: the sum of the absolute deviations of all terms from the mean, divided by the number of terms.
Standard deviation: the square root of [the sum of the squared deviations of all terms from the mean, divided by the number of terms].
It is not intuitive to me why the standard deviation is better than the average deviation; can anyone give an explanation? And would it be better to take the cube of the deviations and then take the cube root?
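For concreteness, here is a minimal Python sketch of the three measures the question compares (the function names are just illustrative, not from any library). It also shows why the cube-root idea doesn't fix the sign problem: cubing preserves sign, so positive and negative deviations can still cancel.

```python
import math

def average_deviation(xs):
    """Mean absolute deviation: average of |x - mean|."""
    m = sum(xs) / len(xs)
    return sum(abs(x - m) for x in xs) / len(xs)

def standard_deviation(xs):
    """Population standard deviation: square root of the mean squared deviation."""
    m = sum(xs) / len(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))

def cube_root_deviation(xs):
    """The cube-root analogue from the question: cube root of the mean cubed
    deviation. Because cubing preserves sign, the terms can cancel, and the
    result can even be zero or negative for symmetric data."""
    m = sum(xs) / len(xs)
    mean_cubed = sum((x - m) ** 3 for x in xs) / len(xs)
    return math.copysign(abs(mean_cubed) ** (1 / 3), mean_cubed)

data = [2, 4, 4, 4, 5, 5, 7, 9]  # mean is 5
print(average_deviation(data))    # 1.5
print(standard_deviation(data))   # 2.0
print(cube_root_deviation(data))  # about 1.74 here, but 0 for e.g. [4, 5, 6]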