Rankings are not the best way to choose your MBA
WHICH is the best MBA course in the world? According to the FT’s rankings, it’s the one at London Business School, followed by those at Wharton, Harvard and Stanford. Argument over? Not quite. If you had instead chosen the Economist’s list, you would be under the impression that Booth School of Business at the University of Chicago is number one, followed by Tuck and Haas.
The harder you look at MBA rankings, the less they make sense. Despite its top ranking in the Economist, Chicago Booth is just 10th in the FT. LBS tops the FT rankings, but is a lowly 19th in the Economist. INSEAD is 23rd in the Economist but 5th in the FT. Haas, third in the Economist, is 28th in the FT. Only Harvard, Booth, Stanford and Wharton make it into both top 10s.
Look at the BusinessWeek list and the confusion deepens. It ranks US and non-US schools separately, leaving us to wonder whether Harvard, second on the US list, is better than Queen’s in Kingston, Canada, which takes the same place on the non-US list.
And then there’s the Forbes ranking, in which Stanford comes out as the top US school while INSEAD, IMD and IE are its top MBAs outside America. Indeed, INSEAD is the top non-US MBA for both Forbes and BusinessWeek – but only ninth in the Economist. Six schools can justifiably claim to be Europe’s best.
Also interesting are schools’ moves up and down the rankings. Spanish school IESE dropped from first in the Economist in 2009 to fifth a year later. In the same year, Chinese school Ceibs dropped 14 places in the FT, from 8th to 22nd, while Boston College went up 42 places. Belgian school Vlerick Leuven placed 10th in the Economist in 2008, only to drop 37 places the next year.
For most people in Europe, the FT’s ranking is the one taken most seriously – and the release of the 2011 list on 31 January is one of the biggest events of the MBA year. But how seriously should we take it? How useful are its numbers? The first criterion it uses is graduates’ salary. But as an indicator of quality, that is a blunt tool to say the least. Salary depends on the sectors graduates go into, not on the quality of their course – those who go into financial services earn more than entrepreneurs, who may well take a pay cut now in order to earn more later. Those graduating in parts of the world heavy with financial services surely have more earning power than others. It’s no shock that London and East Coast schools rank so highly.
Salary increase is the second major measure. Young MBAs who were earning little, or who come straight from an undergraduate degree, will show a greater increase than more experienced people. But who would you rather have on your course? Many MBAs want to change careers, and might well be willing to take a pay cut to get a foothold in a new industry.
Thirdly, the FT looks at whether graduates were employed three months after graduation. This also depends on sectors, location, and level of experience; high-fliers might not walk into the first job they are offered. Also, it tells you nothing about long-term success.
METHODOLOGY AND MADNESS
To understand the lists, you have to understand the methodologies they use (see box below). Two things leap out at you. Firstly, the data is based on information given by students and schools – in the Economist’s case, 80 per cent comes from schools – perhaps not the most objective of sources. As one recent MBA says, students have a “vested interest” in being nice about their schools; wouldn’t you rather have an MBA from the fourth-best school than the 14th?
Secondly, there is the time-lag. Several lists use data from graduates across a number of years to help flatten out spikes and troughs, and they also wait until the graduates have been out of school for three years. The FT’s 2011 list will have information from those who graduated in 2005-7, while the current Forbes list has information from MBAs who graduated in 2004. How relevant is that to those taking an MBA in 2012?
Looking at those who graduated longer ago also flatters the “earnings” column – the figures are less impressive once you realise that some of the data comes from people who graduated six years ago. And again, this number tells you nothing about long-term performance.
So how seriously should we take the lists? Pat Harker, dean of Wharton, said in 2004 that “there is a very strong consensus among parties (alumni, faculty and staff of other institutions) I’ve consulted, that the ranking methodologies are severely flawed.” And that wasn’t sour grapes – Wharton regularly gets top 10 rankings.
Cass, the London business school, takes the line that “business school rankings are one important indicator of quality” but are “composed of an extremely large amount of disparate information and should be read carefully by prospective students.” Ross Geraghty, whose TopMBA website has a ranking based entirely on recruiters’ perceptions of MBAs, calls rankings a “necessary evil”.
Officially, schools always say that rankings should be taken with a pinch of salt. But they know that a high ranking looks great in marketing material, and they are not shy about using it in their brochures for years to come. One person who works in the MBA industry says that deans can be guaranteed to “crack open the champagne” if their ranking goes up, and to bang heads together when it goes down.
TAKING IT SERIOUSLY
Another insider says that “how well you do in the lists is entirely a function of how seriously you take them”. He says that the biggest, richest schools have people whose sole job is to get students to respond positively to the rankings questionnaires. Smaller schools just don’t have the manpower for that.
There is also a good deal of schmoozing of graduates. “They will invite graduates to dinners and ply them with booze,” says an MBA graduate from a London school. “They say: ‘You are going to be sent some forms, and this is how you should fill them in.’” Others mutter about “massaging” of figures. The Economist looks at the gender balance of courses, giving a better score to those with more women. Would some schools aim to accept more women, then? “Absolutely,” says one person who works for a top business school. He insists that a school would never recruit sub-standard students – that would damage its reputation – but among otherwise equal candidates, some do more for the numbers than others.
The reasons schools do this are not altogether mercenary, of course. A high ranking means good students, which improves the quality of the education. Highly ranked schools, in any list, are good.
But it’s an open question what rankings actually teach us. An industry insider says that a few years ago, when one particular school topped a list, “everybody thought it was a joke”. In other words, people already know which the best schools are without the rankings. Which makes you wonder: if we all know the answer, what’s the point of asking the question?
METHODOLOGIES
Schools and graduates provide the data for the Financial Times list, but to take part students have to have graduated three years previously. So the 2011 list will include data from graduates of the classes of 2005-7.
The three main pieces of data are 1) alumni salaries and career development; 2) the diversity and international reach of the business school and its MBA programme; and 3) the research capabilities of each school. These are broken down into 20 criteria, eight from graduate information and 12 from the schools. Exactly how this all gets brewed together to produce a ranking is unclear, but a whopping 40 per cent of the weighting for the final ranking comes from the salaries. On the other hand, this is probably a good reflection of MBA students’ priorities.
The Economist list is the oddest. Alumni ratings make up 20 per cent of the total ranking, while 80 per cent is based on data provided by schools. The scores are weighted 35 per cent for “career opportunities”; 35 per cent for “personal development/educational experience”; 20 per cent “earning potential”; and 10 per cent “networking potential”. Why they are weighted this way is unclear, but there is a tool on the website that allows you to change the weightings, which shows that small tweaks can have huge effects.
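To see just how much the weighting matters, here is a minimal sketch in Python. The two schools and all their scores are invented for illustration; only the first set of weights matches the Economist’s published 35/35/20/10 split. A five-point shift between two categories is enough to flip the order, even though the underlying data never changes.

```python
# A toy weighted ranking. All school scores are invented for
# illustration; only the first weight set matches the Economist's
# published 35/35/20/10 split.

def weighted_score(scores, weights):
    # Combine per-category scores (0-100) into a single total.
    return sum(scores[cat] * w for cat, w in weights.items())

schools = {
    "School A": {"career": 90, "development": 60, "earnings": 85, "network": 80},
    "School B": {"career": 75, "development": 85, "earnings": 70, "network": 70},
}

economist = {"career": 0.35, "development": 0.35, "earnings": 0.20, "network": 0.10}
tweaked   = {"career": 0.30, "development": 0.40, "earnings": 0.20, "network": 0.10}

for name, weights in [("published", economist), ("tweaked", tweaked)]:
    order = sorted(schools, key=lambda s: weighted_score(schools[s], weights),
                   reverse=True)
    print(name, "->", ", ".join(order))
# published -> School A, School B
# tweaked   -> School B, School A
```

The same logic applies to the FT’s 40 per cent salary weighting: change the weight and the table changes with it.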
The simplest MBA ranking list is the one published by Forbes, which looks at ROI – the increase in salary reported by MBA graduates. No information is provided by schools. On the minus side, it’s very US-centric.
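As a rough sketch of what such a measure involves: the figures below are invented, and netting off tuition and a year of forgone salary is our simplifying assumption, not Forbes’s published formula.

```python
# Back-of-envelope MBA "return on investment". Every figure is
# hypothetical, and subtracting tuition plus one year of forgone
# salary is an assumption, not Forbes's published formula.

pre_mba_salary  = 60_000   # annual salary before the course (invented)
post_mba_salary = 95_000   # annual salary after graduation (invented)
tuition         = 80_000   # total cost of the course (invented)
years           = 5        # window over which the gain is measured

gain = (post_mba_salary - pre_mba_salary) * years
net  = gain - tuition - pre_mba_salary   # forgone earnings for a one-year MBA
print(f"Five-year net gain: ${net:,}")   # Five-year net gain: $35,000
```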
BusinessWeek uses data from both graduates and the recruiters who have employed them. It recently introduced an “intellectual capital measurement”, which simply counts how many papers faculty have published in 20 leading journals; this makes up 10 per cent of the score. BusinessWeek’s approach has produced stability, at least in the number-one spot – in 12 lists over 22 years, only Wharton, Booth and Kellogg have topped it. It’s very American, though, and has only 18 schools in its non-US list.