John claims he has found an easier way of finding the average (mean) of a set of numbers when you only need an estimate. It’s easier because you work with smaller numbers. To illustrate his method, let’s use these numbers: 31, 25, 35, 18, and 14.
John first takes the smallest number in the set and subtracts it from the other numbers in the set.
(31-14)=17, (25-14)=11, (35-14)=21, (18-14)=4, (14-14)=0.
Then he uses the “standard” procedure to average those numbers.
(17 + 11 + 21 + 4 + 0)/5 = 10.6
He then adds the smallest number from the original set to this average (mean).
14 + 10.6 = 24.6
This result, 24.6, is the average (mean) of the original set of numbers.
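The steps above can be checked with a short Python sketch (the variable names here are my own, not John's):

```python
# John's method: subtract the smallest number, average the differences,
# then add the smallest number back.
nums = [31, 25, 35, 18, 14]

smallest = min(nums)                          # 14
shifted = [x - smallest for x in nums]        # [17, 11, 21, 4, 0]
johns_mean = smallest + sum(shifted) / len(shifted)

# The "standard" procedure for comparison.
standard_mean = sum(nums) / len(nums)

print(johns_mean)     # 24.6
print(standard_mean)  # 24.6
```

Both procedures give 24.6 for this set.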
Does John’s way of finding the average (mean) of a set of numbers always work? Why or why not? Would his method still work if you repeated the first step with the smallest number in the new set (4), found the average (mean) of that new set of numbers, and then added both 14 and 4 to it? Why or why not?
Would John’s method work if he chose any other number in the data set to subtract from the other numbers? Explain.
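One way to experiment with these questions is to shift by an arbitrary chosen number rather than the smallest. This sketch uses a hypothetical helper, shifted_mean, which is my own name, not part of John's description:

```python
# Hypothetical generalization of John's method: shift every number by a
# chosen value c, average the differences, then add c back.
def shifted_mean(nums, c):
    shifted = [x - c for x in nums]
    return c + sum(shifted) / len(shifted)

nums = [31, 25, 35, 18, 14]

# Try each number in the set as the value to subtract.
for c in nums:
    print(c, shifted_mean(nums, c))
```

Comparing the printed results with the standard average (mean) of the set may help you answer the question.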