Here are the rankings:
1. TrueBlue, JetBlue Airways, 4.34
2. Rapid Rewards, Southwest Airlines, 3.92
3. Mileage Plan, Alaska Airlines, 3.81
4. MileagePlus, United Airlines, 3.74
5. AAdvantage, American Airlines, 3.71
6. Elevate, Virgin America, 3.57
7. SkyMiles, Delta Air Lines, 3.46
8. Hawaiian Miles, Hawaiian Airlines, 2.81
9. EarlyReturns, Frontier Airlines, 2.23
10. Free Spirit, Spirit Airlines, 1.44
But this ranking is based on a severely flawed methodology that has very little to do with frequent flyer programs, and it’s coupled with even worse advice. Here’s their methodology.
What They Say About Their Survey is Just Wrong
Here’s how the survey is described:
To help you weigh your options, U.S. News evaluated 10 leading frequent flier programs using an unbiased methodology that takes into account each program’s average flight prices, earning ratios and daily flight volume (among other features). Our ranking and detailed program profiles are intended to help you choose the program that’s best for you.
They call it an unbiased survey, but it has little to do with the quality of frequent flyer programs, as I’ll explain below.
Most importantly, though, the bias comes from the criteria themselves.
It appears the survey designers do not even understand their subject. For instance, they offer this advice:
If you prefer to fly first class, you should consider a program that awards benefits based on dollars spent rather than miles flown.
That might be true on the earning side: if you prefer to buy paid first class tickets, you may earn more in a program that rewards you for spending more (though even that isn’t always the case). But it is decidedly not true that if you prefer to redeem for first class you will be better off earning points in a program that prices awards based on the cost of a paid ticket.
That’s a distinction they don’t make, so they likely contribute to confusion rather than understanding.
Sixty Percent of the US News Scoring Is About the Airline and Not the Frequent Flyer Program
10% of the score is the number of daily flights an airline has. Only domestic flights are counted, so airlines that operate primarily domestically are advantaged here.
10% of the score is the geographic coverage of the airline (of course, what’s relevant for frequent flyer miles is the worldwide reach of the ability to earn and burn miles, but this ranking does not include airline partnerships or alliances in the mix at all). The number of airports served is the metric — not even metropolitan regions. Serving Washington Dulles, National, and Baltimore counts as three times as good as serving just one of those. Serving little-used Essential Air Service communities is as good as serving New York, Chicago, and Los Angeles.
20% of the score is flight frequency available for award booking. Airlines get a score based on the number of flights, on their most-served routes, that you can book with points. But since most airlines let you use (some number of) points for any seat on any flight, this comes down to rewarding airlines that fly a lot between the cities they serve. (Only United does not offer all customers access to the last available seat for some number of miles — a ‘privilege’ they reserve for elites and co-brand credit card holders.)
20% of the score is the quality of the airline. It doesn’t matter that I can use my Alaska Airlines miles to fly Delta or Cathay Pacific or Emirates. What matters here is how good an airline Alaska itself is to fly. That, of course, is entirely an airline quality rating and not a frequent flyer program rating, which is what the study purports to offer.
The 40% of the Score That’s Actually About the Frequent Flyer Program Tells Us Very Little
30% of the score is supposed to be ‘ease of earning a free roundtrip flight’. But it isn’t. They take the average price of domestic coach awards on just 5 routes for each airline, searching the same travel dates for each. That means they took no account of seasonality or day-of-travel effects.
What’s more, that tells you nothing about how easy it is to earn a free roundtrip, since it looks only at the number of points required and not the ease of earning those points. Different airlines, especially the ‘revenue-based’ ones, use entirely different point scales.
10% of the score is ‘additional benefits’ described as follows:
The Additional Benefits score comprises the number of ways members can earn and use points (other than for booking flights), whether or not points expire and additional perks that come from membership.
Not all airlines were given an additional benefits score.
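To see how this weighting plays out, the methodology above amounts to a simple weighted sum. Here is a minimal sketch: the weights come from the criteria described in this piece, but the airline sub-scores are hypothetical numbers I’ve made up for illustration, not actual US News data.

```python
# Weights as described in the US News methodology.
# Note that the first four criteria (60% of the weight) measure
# the airline, not the loyalty program itself.
WEIGHTS = {
    "daily_flights": 0.10,         # airline attribute
    "geographic_coverage": 0.10,   # airline attribute, ignores partners
    "award_flight_frequency": 0.20,
    "airline_quality": 0.20,       # airline attribute
    "ease_of_earning_award": 0.30,
    "additional_benefits": 0.10,
}

def composite_score(subscores):
    """Weighted sum of sub-scores (each assumed on a 1-5 scale)."""
    return sum(WEIGHTS[k] * subscores[k] for k in WEIGHTS)

# Hypothetical airline: strong network, weak actual program.
example = {
    "daily_flights": 4.5,
    "geographic_coverage": 4.0,
    "award_flight_frequency": 4.2,
    "airline_quality": 3.8,
    "ease_of_earning_award": 2.0,
    "additional_benefits": 2.5,
}

print(round(composite_score(example), 2))  # → 3.3
```

The point of the sketch: an airline with a mediocre program can still rank well, because network and quality sub-scores dominate the total.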
The Survey Doesn’t Tell Us What it Tells Us it Tells Us
The important thing is to understand what a survey does and does not show. There’s no one answer to which program is best; it depends very much on the needs of the consumer and what they subjectively value, although there are broad generalizations.
While the Freddie Awards offer one window into the programs that are best, I try to be as clear as possible about what they show and do not show — and that they don’t offer one final answer for all cases. They are merely an attempt to amalgamate the preferences and opinions of members writ large: what program members in general express that they value most in a given year.
The US News survey tells us, based on very limited data, what its editors think is important in a US domestic airline product and they then say they’re ranking frequent flyer programs. They aren’t.
What’s more, they aren’t accounting for how easy it is to earn points, the scale of those points, airline alliances and partnerships, international premium cabin awards, or the myriad other things that actually factor into what is important in a frequent flyer program.
Unfortunately there will be those who pick up the ranking and believe it means more than it does.