Every year at some point I end up teaching about mean versus median as measures of central tendency. The mean has the advantage of representing all the data points, but that same advantage can be its weakness if there are outliers in the data. I typically make up some off-the-cuff example where I pretend I’m an employer and four or five of the kids in my class, usually one of the rows, are my employees, and list their salaries as $10,000/year, $10,000/year, $10,000/year, $10,000/year, and $120,000/year (I often jokingly suggest that this last employee is my niece or nephew). I then point out that I can honestly say to prospective employees that they should come work for me because there is a lot of potential for salary growth at my company: the mean salary is $32,000/year. This is technically true, but not a fair representation of the salary situation at my company—the median salary of $10,000/year would be a far better representation of what a new hire could expect to be paid by me.
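If you want to check that arithmetic yourself, here's a quick sketch using Python's built-in `statistics` module with the made-up classroom salaries:

```python
from statistics import mean, median

# The hypothetical salaries from the classroom example:
# four employees at $10,000/year and one outlier at $120,000/year
salaries = [10_000, 10_000, 10_000, 10_000, 120_000]

print(mean(salaries))    # the mean is pulled up by the single outlier
print(median(salaries))  # the median reflects what a typical employee earns
```

The mean comes out to $32,000 while the median stays at $10,000, which is exactly the gap the example is meant to dramatize.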
Well, it illustrates the point, but it’s an extreme and far-fetched example, right?
The average federal tax cut under the AHCA (read: our collective kickback for dismantling healthcare, such as it was) is $600, and yet more than 80% of taxpayers will get less than this. How is this possible? There must be some truly dramatic outlier to make this—oh, there it is, on the right.