When audiences hear the word “average” to mean “typical,” a literally accurate claim can nonetheless mislead. In 2004, President George W. Bush told the public that the “average tax cut” would be $1,586 when, according to the nonpartisan Tax Policy Center, half of Americans would get $470 or less. There was nothing mathematically wrong with the Bush number. In unSpun, Brooks Jackson and Kathleen Hall Jamieson explain:
“Imagine a small town of a thousand persons, including one super rich resident we’ll call Gil Bates. Everybody in town is getting a tax cut this year: $100 for everybody but Mr. Bates, who is getting a whopping cut of $1,000,000. What’s the average? Divide the sum of all the tax cuts ($1,099,900) by the total number of residents (1,000) and the average works out to [a mean of] nearly $1,100 per resident. But that’s not the typical cut.”
To unmask the problems inherent in the ambiguity of the word “average,” ask, “Is the figure in question the ‘mean’ or the ‘median’?” Unlike the “mean,” which divides the total by the number of cases, the “median” is the midpoint, with half of the values above it and half below. The median is a better indicator of typicality than the mean.
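The gap between the two measures can be checked directly. A minimal sketch in Python, using the dollar amounts from the Gil Bates example above (999 residents receiving $100 and one resident receiving $1,000,000):

```python
import statistics

# 999 residents each get a $100 tax cut; Gil Bates gets $1,000,000.
tax_cuts = [100] * 999 + [1_000_000]

mean_cut = statistics.mean(tax_cuts)      # total cuts / number of residents
median_cut = statistics.median(tax_cuts)  # midpoint: half above, half below

print(f"mean:   ${mean_cut:,.2f}")    # skewed upward by the one huge cut
print(f"median: ${median_cut:,.2f}")  # the cut a typical resident sees
```

One extreme value pulls the mean to nearly $1,100, while the median stays at $100, the amount 999 of the 1,000 residents actually receive.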