Root Mean Square Error
Accuracy is expressed as the Root Mean Square Error (RMSE), which is a measure of the distance from the true position within which about 67% of points would be expected to lie. The maximum expected geometric error on an OS map is about three times the RMSE.
Definition
RMSE is the square root of the mean of the squared errors between observed and true values. It is a measure of the magnitude of a set of numbers: it gives a sense of their typical size without regard to sign.
Example
Consider this set of numbers: -2, 5, -8, 9, -4
We could compute the average, but this does not tell us much, because the negative values cancel the positive values, leaving an average of zero. What we want is the size of the numbers without regard to sign. The easiest way to do this is to erase the signs and compute the average of the new set: 2, 5, 8, 9, 4
Average = 5.6
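As a quick check of the arithmetic above, here is a minimal Python sketch that computes the average of the unsigned (absolute) values:

```python
# Average of the absolute values of the example set
values = [-2, 5, -8, 9, -4]
mean_abs = sum(abs(v) for v in values) / len(values)
print(mean_abs)  # 5.6
```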
But that’s not how statisticians do it. For reasons of mathematical convenience, they chose a different approach: instead of erasing the signs, they square every number (which makes them all positive), average the squares, and then take the square root. Like this:
- Square all the values
- Take the average of the squares
- Take the square root of the average
So, for -2, 5, -8, 9, -4, the squares are 4, 25, 64, 81, 16; their average is 190 ÷ 5 = 38; and the RMSE is √38 ≈ 6.16.
The RMSE is always at least as large as the average of the unsigned values (the two are equal only when every value has the same magnitude), and in practice it is usually just a little larger.
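The three steps above can be sketched in a few lines of Python; this also verifies the claim that the RMSE is never smaller than the average of the unsigned values:

```python
import math

values = [-2, 5, -8, 9, -4]

# Square all the values, take the average of the squares,
# then take the square root of the average
rmse = math.sqrt(sum(v * v for v in values) / len(values))

# Average of the unsigned (absolute) values, for comparison
mean_abs = sum(abs(v) for v in values) / len(values)

print(round(rmse, 2))  # 6.16
print(rmse >= mean_abs)  # True
```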