The mean is the average of a given set of data. Let us consider the example below.
These eight data points have a mean (average) of 5:
2, 4, 4, 4, 5, 5, 7, 9
Mean = (2 + 4 + 4 + 4 + 5 + 5 + 7 + 9) / 8 = 40 / 8 = 5
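A minimal Python sketch of this step (the variable names here are only illustrative):

```python
# The eight data points from the example above
data = [2, 4, 4, 4, 5, 5, 7, 9]

# Mean = sum of the values divided by the number of values
mean = sum(data) / len(data)   # 40 / 8
print(mean)                    # 5.0
```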
Variance is the average of the squared differences between each data point and the mean.
Variance for the above example: first, calculate the deviation of each data point from the mean and square the result:
(2 - 5)^2 = 9, (4 - 5)^2 = 1, (4 - 5)^2 = 1, (4 - 5)^2 = 1,
(5 - 5)^2 = 0, (5 - 5)^2 = 0, (7 - 5)^2 = 4, (9 - 5)^2 = 16
Variance = (9 + 1 + 1 + 1 + 0 + 0 + 4 + 16) / 8 = 32 / 8 = 4
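The same calculation as a short Python sketch (this computes the population variance, dividing by n as in the arithmetic above; a sample variance would divide by n - 1):

```python
# The same example data and its mean
data = [2, 4, 4, 4, 5, 5, 7, 9]
mean = sum(data) / len(data)                     # 5.0

# Square each deviation from the mean
squared_diffs = [(x - mean) ** 2 for x in data]  # 9, 1, 1, 1, 0, 0, 4, 16

# Population variance: average of the squared deviations
variance = sum(squared_diffs) / len(data)        # 32 / 8
print(variance)                                  # 4.0
```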
Standard Deviation is the square root of the variance. It is a measure of the extent to which data varies from the mean.
Standard Deviation (for the above data) = √4 = 2
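A minimal sketch that ties the steps together in Python; statistics.pstdev (the standard library's population standard deviation) is used only as a cross-check:

```python
import math
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]
mean = sum(data) / len(data)
variance = sum((x - mean) ** 2 for x in data) / len(data)

# Standard deviation: square root of the variance
std_dev = math.sqrt(variance)
print(std_dev)                    # 2.0

# Cross-check with the standard library's population standard deviation
print(statistics.pstdev(data))    # 2.0
```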
Why did mathematicians choose to square the differences and then take the square root to measure deviation, rather than simply taking the differences of the values?
One reason is that the sum of the differences from the mean is always 0, by the definition of the mean. The sum of absolute differences could be an option, but with absolute differences it is difficult to prove many nice theorems. [Source: MIT Video Lecture at 1:19]
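A quick numerical check of the first point, reusing the example data from above:

```python
data = [2, 4, 4, 4, 5, 5, 7, 9]
mean = sum(data) / len(data)

# Plain deviations from the mean cancel out to zero ...
print(sum(x - mean for x in data))         # 0.0
# ... while absolute deviations do not
print(sum(abs(x - mean) for x in data))    # 12.0
```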
Some Interesting Facts:
1) The value of the standard deviation is 0 if all entries in the input are the same.
2) If we add (or subtract) a number, say 7, to every value in the input set, the mean increases (or decreases) by 7, but the standard deviation does not change.
3) If we multiply every value in the input set by a number, say 7, both the mean and the standard deviation are multiplied by 7. But if we multiply every value by a negative number, say -7, the mean is multiplied by -7 while the standard deviation is still multiplied by 7. (These properties are checked numerically in the sketch below.)
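A short Python sketch that verifies the three facts on the example data (the shift and scale factor 7 are taken from the facts above):

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]

# Fact 1: identical entries give a standard deviation of 0
print(statistics.pstdev([7, 7, 7, 7]))                              # 0.0

# Fact 2: adding 7 to every value shifts the mean but not the standard deviation
shifted = [x + 7 for x in data]
print(statistics.mean(shifted), statistics.pstdev(shifted))         # 12 2.0

# Fact 3: multiplying by 7 scales both; multiplying by -7 negates the mean,
# but the standard deviation is still multiplied by 7
scaled = [x * 7 for x in data]
neg_scaled = [x * -7 for x in data]
print(statistics.mean(scaled), statistics.pstdev(scaled))           # 35 14.0
print(statistics.mean(neg_scaled), statistics.pstdev(neg_scaled))   # -35 14.0
```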
The following questions have been asked in previous years' GATE exams:
http://quiz.geeksforgeeks.org/gate-gate-cs-2012-question-64/