There was a show on TV late last night, but I don’t know what the show was or, for that matter, what channel it was on. Two men and two women had been trying to lose weight in some boys-vs.-girls competition, so they added up the weight loss for each pair, divided by that pair’s previous total weight, and compared. The guys had lost more weight, but the gals had lost a higher percentage, so they won.
Comparing percentages instead of absolute amounts made sense, but I’m not sure what the fairest way to make that comparison is. Suppose one team had a 100-pound and a 200-pound person, and the other team had two 150-pound people. If each person lost 5% of their body weight, then the total for each team would be 15 pounds, or 5%.
But what if someone lost 1 pound? Coming from the 100-pound person, that pound seems more significant than it would coming from the 200-pound person, yet the two count the same because they’re on the same team. Likewise, suppose the 200-pound person lost an extra 10 pounds while one of the 150-pound people lost an extra 9. The first team would win, even though that second individual lost a higher personal percentage, because it’s the team total that gets divided by the team’s weight, and both teams weigh 300 pounds.
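To make the arithmetic concrete, here’s a quick sketch of that scenario (the function and names are just mine, not anything from the show):

```python
def team_pct(weights, losses):
    """Team score: total pounds lost divided by total starting weight."""
    return sum(losses) / sum(weights)

# Team 1: a 100-pound and a 200-pound person; the 200-pounder loses 10 pounds.
# Team 2: two 150-pound people; one of them loses 9 pounds.
team1 = team_pct([100, 200], [0, 10])  # 10/300, about 3.33%
team2 = team_pct([150, 150], [9, 0])   # 9/300, exactly 3%

# Team 1 wins on the team percentage...
assert team1 > team2
# ...even though the 9-pound loser shed a higher individual percentage:
assert 9 / 150 > 10 / 200  # 6% vs. 5%
```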
I was thinking that it might make more sense to compute each person’s percentage and then average them. The problem is that it might be easier for a 200-pound person to lose 5% of their weight than for a 100-pound person, so even that is imperfect; the ease and health of weight loss don’t follow a linear scale. But more importantly, I suspect, it might just be one step too confusing for a late-night show.
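Averaging the individual percentages would actually flip the outcome of the earlier 10-pound-vs.-9-pound scenario. A sketch (again, the names are mine):

```python
def avg_pct(weights, losses):
    """Average each person's own percentage lost."""
    return sum(l / w for w, l in zip(weights, losses)) / len(weights)

# Same teams as before: the 200-pounder loses 10 pounds,
# one of the 150-pounders loses 9 pounds.
team1 = avg_pct([100, 200], [0, 10])  # (0% + 5%) / 2 = 2.5%
team2 = avg_pct([150, 150], [9, 0])   # (6% + 0%) / 2 = 3%

# Under averaging, the second team now wins:
assert team2 > team1
```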