
Thread: Why standard deviation is better than average deviation

  #1 Why standard deviation is better than average deviation
    yohann (New Member)
    It is a very fundamental question, but most statistics books don't give concrete reasons to justify the use of the standard deviation; they just say it is better, or that squaring the deviations avoids the problem of negative deviations cancelling positive ones. I don't think that is a reasonable explanation, because you could just take absolute values instead.

    Average deviation: the sum of the absolute deviations of all terms from the mean, divided by the number of terms.

    Standard deviation: the square root of [the sum of the squared deviations of all terms from the mean, divided by the number of terms].

    It is not intuitive to me that the standard deviation is better than the average deviation; can anyone give an explanation? And would it be better to take the cube of the deviations and then take the cube root?
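
    To pin down the two definitions in symbols (just a restatement of the verbal definitions above, writing x̄ for the mean of the n terms):

    ```latex
    % Average (mean absolute) deviation and standard deviation of the
    % terms x_1, ..., x_n with mean \bar{x}.
    \mathrm{AD} = \frac{1}{n}\sum_{i=1}^{n}\bigl|x_i - \bar{x}\bigr|,
    \qquad
    \mathrm{SD} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\bigl(x_i - \bar{x}\bigr)^2}
    ```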


  #2 Re: Why standard deviation is better than average deviation
    DrRocket
    Quote Originally Posted by yohann
    It is a very fundamental question, but most statistics books don't give concrete reasons to justify the use of the standard deviation; they just say it is better, or that squaring the deviations avoids the problem of negative deviations cancelling positive ones. I don't think that is a reasonable explanation, because you could just take absolute values instead.

    Average deviation: the sum of the absolute deviations of all terms from the mean, divided by the number of terms.

    Standard deviation: the square root of [the sum of the squared deviations of all terms from the mean, divided by the number of terms].

    It is not intuitive to me that the standard deviation is better than the average deviation; can anyone give an explanation? And would it be better to take the cube of the deviations and then take the cube root?
    What is "better"?

    One is the first absolute moment about the mean. The other is the square root of the second moment about the mean.

    The Gaussian distribution is completely determined by the mean and the standard deviation.

    The variance (square of the standard deviation) of a sum of independent random variables is the sum of the variances.
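
    In symbols, for independent random variables X and Y (a standard identity, stated here for concreteness):

    ```latex
    % Variance is additive over independent random variables,
    % so standard deviations combine in quadrature.
    \operatorname{Var}(X + Y) = \operatorname{Var}(X) + \operatorname{Var}(Y),
    \qquad
    \sigma_{X+Y} = \sqrt{\sigma_X^{2} + \sigma_Y^{2}}
    ```

    There is no comparably simple rule for the mean absolute deviation of a sum.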

    One uses what is appropriate and what makes logical sense. It is not like picking your favorite color.



  #3
    mathman (Forum Professor)
    It is a lot easier to compute the standard deviation than the average absolute deviation.

  #4
    DrRocket
    Quote Originally Posted by mathman
    It is a lot easier to compute the standard deviation than the average absolute deviation.

    Why do you say that, other than that there is a pre-programmed button on many calculators and statistics packages?

  #5
    mathman (Forum Professor)
    Quote Originally Posted by DrRocket
    Quote Originally Posted by mathman
    It is a lot easier to compute the standard deviation than the average absolute deviation.

    Why do you say that, other than that there is a pre-programmed button on many calculators and statistics packages?
    Try doing it by hand, particularly with a continuous distribution - you will see soon enough.

  #6
    DrRocket
    Quote Originally Posted by mathman
    Quote Originally Posted by DrRocket
    Quote Originally Posted by mathman
    It is a lot easier to compute the standard deviation than the average absolute deviation.

    Why do you say that, other than that there is a pre-programmed button on many calculators and statistics packages?
    Try doing it by hand, particularly with a continuous distribution - you will see soon enough.
    Try doing the variance by hand and you won't like that either.
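
    For a concrete continuous example, both can be worked by hand for a standard normal variable X with density φ(x) = e^{-x²/2}/√(2π); a quick sketch:

    ```latex
    % Mean absolute deviation: the substitution u = x^2/2 does the integral.
    E|X| = \int_{-\infty}^{\infty} |x|\,\varphi(x)\,dx
         = \frac{2}{\sqrt{2\pi}}\int_{0}^{\infty} x\,e^{-x^2/2}\,dx
         = \sqrt{\frac{2}{\pi}} \approx 0.798
    % Variance: integration by parts gives the second moment.
    E[X^2] = \int_{-\infty}^{\infty} x^{2}\,\varphi(x)\,dx = 1
    ```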

  #7
    mathman (Forum Professor)
    When dealing with experimental data, the first and second moments of the data can be readily calculated. From these, unbiased estimates of the mean and variance can be obtained simply. It is much harder to get such an estimate for the average absolute deviation.
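
    For instance, with the running sums Σxᵢ and Σxᵢ² accumulated from the data:

    ```latex
    % Sample mean and the unbiased sample variance, both obtainable
    % from the first two power sums of the data alone.
    \bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i,
    \qquad
    s^{2} = \frac{1}{n-1}\sum_{i=1}^{n}\bigl(x_i - \bar{x}\bigr)^{2}
          = \frac{1}{n-1}\left(\sum_{i=1}^{n} x_i^{2} - n\bar{x}^{2}\right)
    ```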

  #8
    howorka (New Member)
    A very fundamental and non-obvious question.

    See the paper "Revisiting a 90-year-old debate: the advantages of the mean deviation" by Stephen Gorard at http://www.leeds.ac.uk/educol/documents/00003759.htm

  #9
    smokey (Suspended)
    Quote Originally Posted by howorka
    A very fundamental and non-obvious question.

    See the paper "Revisiting a 90-year-old debate: the advantages of the mean deviation" by Stephen Gorard at http://www.leeds.ac.uk/educol/documents/00003759.htm
    Thanks for that, interesting reading. Although I am not really a statistician, I can see no reason to use the RMS.
    Some of the reasons given for the SD seem a bit 'vague' to me: easier to work with?
    You might as well say 7+2=10, because it's a nice round number to work with.

    On a side note, apparently Fisher could not find a link between smoking and cancer in the statistics; you would have to question his statistical abilities there somewhat!

    Mind you, he was being paid by a tobacco company, and he also worked in eugenics.
    I definitely prefer Eddington.

  #10
    smokey (Suspended)
    " According to Fisher, it was in meeting the last criteria that SD proves superior. When drawing repeated large samples from a normally distributed population, the standard deviation5 of their individual mean deviations is 14% higher than the standard deviations of their individual standard deviations (Stigler 1973). Thus, the SD of such a sample is a more consistent estimate of the SD for a population, and is considered better than its plausible alternatives as a way of estimating the standard deviation in a population using measurements from a sample (Hinton 1995, p.50). That is the main reason why SD has subsequently been preferred, and why much of subsequent statistical theory is based on it."

    This to me is just plain wrong; what it shows me is that the SD is wrong and produces the wrong result.
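
    Reading the quoted 14% as the usual relative-efficiency comparison between two estimators of sigma, it is easy to check numerically; a rough simulation sketch (assuming numpy is available, with arbitrary sample sizes and seed):

    ```python
    import numpy as np

    # Rough check of the quoted Fisher/Stigler comparison: for repeated normal
    # samples, compare the sampling variability of two estimators of sigma,
    # one based on the standard deviation and one based on the mean deviation.
    rng = np.random.default_rng(0)
    n, reps, sigma = 100, 20_000, 1.0

    samples = rng.normal(0.0, sigma, size=(reps, n))
    xbar = samples.mean(axis=1, keepdims=True)

    sd_hat = samples.std(axis=1, ddof=1)          # sample standard deviation
    md_hat = np.abs(samples - xbar).mean(axis=1)  # sample mean deviation
    md_scaled = md_hat * np.sqrt(np.pi / 2.0)     # rescaled so it also estimates sigma

    # If the quoted claim holds in this sense, the variance of the
    # mean-deviation-based estimator comes out roughly 14% larger
    # (ratio near 1.14, i.e. relative efficiency about 0.88).
    print("variance ratio (MD-based / SD-based):", md_scaled.var() / sd_hat.var())
    ```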

  #11
    smokey (Suspended)
    Well, I have found someone who agrees with me, lol.

    http://www.trade2win.com/boards/trad...al-2-a-78.html

    "I was reading wikipedia. Now the Sharpe Ratio mother ****er is based, among other things, on standard deviation. But the mother ****ing standard deviation is based on the variance, and the mother ****ing variance is none other than this crap:"

    Not the words I would have used, but I have to agree.

  #12
    DrRocket
    Quote Originally Posted by howorka
    A very fundamental and non-obvious question.

    See the paper "Revisiting a 90-year-old debate: the advantages of the mean deviation" by Stephen Gorard at http://www.leeds.ac.uk/educol/documents/00003759.htm
    I got about a half page into that paper before a violent visceral response. What trash. Barf bag needed.

    The standard deviation is not intrinsically a "measure of spread" above and beyond being the square root of the second central moment.

    There are probability distributions for which the standard deviation does not exist. There are also probability distributions for which the mean does not exist. People writing this stuff ought to learn a little bit of the theory of probability before they pick up a pen.

    It is not that the standard deviation is "better" than the first moment, or that the first moment is "better" than the second moment. One needs to understand the problem and then use the appropriate tools. If all you have is a hammer, everything looks like a nail.

    That said, the standard, aka "normal", aka Gaussian density is very useful. It is useful for a number of reasons. The primary reason is that it is approximately correct on theoretical grounds in some situations as a result of the central limit theorem. The central limit theorem says that the (suitably normalized) sum of independent, identically distributed random variables with finite variance tends, in the limit as the number of summands becomes large, to be normally distributed, in a probabilistic sense. As a practical matter this means that a sum of even a modest number of such random variables can in many cases be adequately represented by a normal distribution. It is also the case that the sum of any number of independent normally distributed random variables is normally distributed.

    Now the density corresponding to a sum of independent random variables is the convolution product of the densities of those variables. Convolutions are relatively difficult to calculate -- except for Gaussian densities. If the densities are Gaussian the convolution is trivial to calculate.

    The convolution of Gaussian densities is trivial to calculate because 1) the result is Gaussian 2) Gaussian densities are characterized by just 2 parameters -- the mean and the variance (the variance is the square of the standard deviation) 3) the mean of a sum of random variables is the sum of the means and 4) the variance of a sum of independent random variables is the sum of the variances. So to calculate the density of the sum of a bunch of independent normally distributed random variables you just add the means, add the variances and stick those parameters into the formula for a Gaussian density. Anybody can do that.
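
    Written out, the fact being used is:

    ```latex
    % Sum of independent Gaussians: add the means, add the variances.
    X \sim N(\mu_1, \sigma_1^{2}),\;\; Y \sim N(\mu_2, \sigma_2^{2}) \text{ independent}
    \;\Longrightarrow\;
    X + Y \sim N\bigl(\mu_1 + \mu_2,\ \sigma_1^{2} + \sigma_2^{2}\bigr),
    \qquad
    f_{X+Y}(t) = \frac{1}{\sqrt{2\pi(\sigma_1^{2}+\sigma_2^{2})}}
                 \exp\!\left(-\frac{(t-\mu_1-\mu_2)^{2}}{2(\sigma_1^{2}+\sigma_2^{2})}\right)
    ```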

    So, since the normal distribution is so useful it turns out that the variance is also useful.

    Using the central limit theorem you can also approximately calculate the density for a sum of independent random variables even if they are not normally distributed. You make the approximation that the sum is normally distributed. Then you just calculate the means and variances, add them up and stick them in the formula for a Gaussian density. Again anybody can do that.
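
    As a toy illustration of that recipe (a sketch only; the choice of twelve Uniform(0,1) summands, the sample size and the threshold below are arbitrary):

    ```python
    import numpy as np
    from math import erf, sqrt

    # Approximate the distribution of a sum of 12 independent Uniform(0,1)
    # variables by a normal with the summed mean and summed variance,
    # then compare one tail probability against direct simulation.
    k = 12
    mean_sum = k * 0.5          # each Uniform(0,1) has mean 1/2
    var_sum = k * (1.0 / 12.0)  # ... and variance 1/12

    def normal_cdf(x, mu, var):
        """CDF of N(mu, var), written with the error function."""
        return 0.5 * (1.0 + erf((x - mu) / sqrt(2.0 * var)))

    rng = np.random.default_rng(1)
    sums = rng.uniform(0.0, 1.0, size=(200_000, k)).sum(axis=1)

    threshold = 7.5
    print("simulated     P(S > 7.5):", (sums > threshold).mean())
    print("normal approx P(S > 7.5):", 1.0 - normal_cdf(threshold, mean_sum, var_sum))
    ```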

    There are many other density functions for which the variance is very descriptive, so in practice the variance is a widely used parameter.

  #13
    smokey (Suspended)
    I would have thought that, as squaring the values is meaningless, anything based upon that is also meaningless.

    Think about that for a moment, but no more than one moment.

    All the flowery language in the world cannot hide the smell of manure.

  #14
    DrRocket
    Quote Originally Posted by smokey
    I would have thought that, as squaring the values is meaningless, anything based upon that is also meaningless.

    Think about that for a moment, but no more than one moment.

    All the flowery language in the world cannot hide the smell of manure.
    You are smelling your upper lip.

  #15
    smokey (Suspended)
    Your breath more like.

  #16
    DrRocket
    Quote Originally Posted by smokey
    Your breath more like.
    Nope. You would not have survived such a close encounter.
