
Are MBA rankings losing their credibility?

MBA rankings, according to John A. Byrne, P&Q founder, are a fact of life. He says, “They [rankings] have changed the way schools are managed, the way applicants choose programs, the way companies recruit MBA students, the way schools hire and retain faculty, how alumni feel about their alma mater, and how much donors give to schools.”

MBA rankings, which Byrne helped pioneer at Businessweek, have also increased awareness of the benefits of the MBA degree.

Also read: How important are MBA rankings?
 


Any loss in the credibility of MBA rankings would be a big blow to MBA applicants, students, schools, and employers. And credibility certainly appears to have taken a hit, thanks to the actions of some schools and publications.

As for schools, some of them appear to be fudging their numbers: applications received, admit rate and selectivity, average GMAT score of admitted applicants, reputational rank, and graduation rate. At least a few schools cook up figures to burnish their image through higher rankings, hoping few outsiders will notice. Fake data produce distorted rankings, and distorted rankings lose credibility.
 

Techniques schools employ to game MBA rankings

One criterion for ranking is selectivity. To assess a school’s selectivity, publications calculate the admit rate by simply dividing the number of students admitted by the number of applicants. But some schools inflate the number of applicants in order to artificially lower the admit rate and thereby appear more selective, according to an article in the Chicago Tribune.
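To see the arithmetic, here is a minimal Python sketch with purely hypothetical numbers (the counts and the padding factor are invented for illustration):

```python
# Hypothetical illustration: padding the applicant count lowers the reported
# admit rate, which makes the school look more selective. All numbers invented.

def admit_rate(admitted: int, applicants: int) -> float:
    """Admit rate = number admitted / number of applicants."""
    return admitted / applicants

admitted = 500
genuine_applicants = 2_000   # completed applications actually received
padded_applicants = 5_000    # casual inquiries also counted as "applications"

print(f"Honest admit rate: {admit_rate(admitted, genuine_applicants):.0%}")  # 25%
print(f"Padded admit rate: {admit_rate(admitted, padded_applicants):.0%}")   # 10%
```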

Schools don’t stop at padding the applicant count. Some use a two-part application, where the first part is nothing more than a casual expression of interest and the second follows the usual format, with grades, test scores, essays, and recommendation letters. Such schools are known to count the “first part” applications, which far more students submit, rather than the completed second-part applications, in order to jack up their applicant numbers.

Another trick some schools use is to “waitlist” applicants who are unlikely to enroll, for example because their scores are well above the school’s average and they stand a good chance at better schools, and then press them for an immediate decision. These applicants naturally hesitate, since they expect to be accepted later at a stronger school. The school then moves them to the rejected list so as to appear more selective.

Some schools use delayed admissions, admitting in the following spring candidates who were not accepted in the fall, so as to report a lower number of admitted students.

Some schools may also manipulate their reported average test scores. They may ignore the low verbal scores of international applicants in particular and count only those candidates’ high quant scores, or leave out the scores of “special admit” applicants when those scores are low.

Some schools declare themselves ‘test-optional’ yet still announce their average test scores, which are naturally high, since only students with high scores submit them. Even reputed news journals have been known to include these scores among their ranking criteria.

Reputational rankings may also be fixed. Top administrators and academics who assess programs at other schools may strike a deal with their counterparts: rate my program favorably, and I will do the same for yours.

Graduation rates can also be gamed. A school can restrict the count to students who do well in their final assessments, so that the reported graduation rate is 100 percent, or nearly so.

Alumni donations, an indicator of whether a school was a happy experience for its students, also see some doctoring. Schools exclude alumni who have not donated for some time from the base over which the “giving rate” is calculated, thereby bolstering the figure.

All these manipulations have hurt the credibility of MBA rankings. But why do schools tamper with data? Perhaps because once many schools falsify their numbers, even schools with no intention of doing so eventually give in, not wanting to be left behind. Publications share the blame for taking schools’ reported figures at face value instead of independently verifying them.
 

Methodology loopholes

As for publications, the methodology some of them adopt has also eroded the credibility of rankings. For example, some publications draw part of their data from surveys of students and graduates, and it is not difficult for schools to warn respondents that what they say will affect their school, the reputation of their own degrees, and thereby their career prospects. Naturally, students and alumni then have only good things to say about their schools and programs. Again, skewed information ruins credibility.

Publications’ audits of school-reported data do little to improve credibility, says P&Q’s Byrne, and publications don’t vouch for the correctness of the school-reported data on which they base their rankings. “Rankings put far too much emphasis on job placement and compensation and far too little on the MBA experience and the business learning that goes on in a program,” Byrne adds.

Can a school get a place in the rankings even though it is not a business school? Apparently, yes. The MBA Journal reports an instance where a top financial newspaper included in its MBA ranking a finance school in Frankfurt that was not a business school and had no international accreditation (from AACSB or EQUIS). The newspaper, which had broken its own rules, defended itself by saying the school was included for its master’s program in finance.

In another instance, the financial newspaper included a top business school in India that had no international accreditation. Why? The newspaper responded that accreditation through American or European agencies was only for American or European schools.

The volatility of a publication’s rankings can also hurt its credibility. For example, in the 2016 ranking of a renowned weekly magazine, about 30 schools rose or fell at least 20 places compared with the previous year. Even the top 25 programs, whose positions are more or less stable in most publications’ rankings, moved by an average of four places. A top school in Spain lost 30 places, and a highly regarded school in Ahmedabad fell 32.

Big surprises also deflate credibility. In over a decade of the well-respected magazine’s ranking history, Harvard, Stanford, and Wharton have never once topped its list. In one list, Harvard’s faculty managed only rank 56 and its student quality rank 13; in the same list, the magazine ranked an Australian school seventh in faculty quality and first in student quality. This Australian school also outranked Columbia, Wharton, MIT Sloan, and other top b-schools.

Two world-class schools refused to participate in the rankings because they believed the methodology lacked credibility, but the magazine included them anyway, on the basis of publicly available data and older surveys.
 

Lack of consistency

Dissimilarities between rankings also affect credibility. If one ranking puts School A in the top five while another leaves the same school out of its top ten or fifteen, the credibility of both rankings, and of their publishers, suffers.

A blogger asks a pertinent question: if rankings are reliable, why aren’t they consistent with one another? And by how much is a school ranked first on a list superior to one ranked second or third on the same list? According to the blogger, a financial newspaper known for its MBA rankings has admitted that there may be only a relatively small difference between a school ranked 20 and another ranked 40. What, then, justifies the wide gap in their positions?

One reason schools are ranked differently by different publications is that each publication uses its own criteria or assigns its own weight to each criterion.
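As an illustration (all school names, scores, and weights below are invented), a short Python sketch shows how the same data can yield different orderings under different weighting schemes:

```python
# Hypothetical illustration: identical school data ranked under two different
# weighting schemes produce different orderings. All scores are invented.

schools = {
    # school: (salary_score, research_score, satisfaction_score), each 0-100
    "School A": (95, 60, 70),
    "School B": (75, 90, 80),
    "School C": (80, 70, 95),
}

def rank(weights):
    """Order schools by the weighted sum of their criterion scores."""
    totals = {
        name: sum(w * s for w, s in zip(weights, scores))
        for name, scores in schools.items()
    }
    return sorted(totals, key=totals.get, reverse=True)

# Publication 1 weights salary heavily; publication 2 favors satisfaction.
print(rank((0.6, 0.2, 0.2)))  # ['School A', 'School C', 'School B']
print(rank((0.2, 0.2, 0.6)))  # ['School C', 'School B', 'School A']
```

With nothing but a change in weights, School A moves from first to last, which is one structural reason rankings disagree.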
 

Just one tool among many

As Byrne says, rankings may not be reliable, but they matter. No MBA applicant can afford to ignore them or make a decision based solely on them. At best, rankings could be one of several factors for decision-making.

Rankings help not only applicants but also schools and employers. A school with a high ranking can attract good students and faculty. Many top employers recruit MBAs only from the top-rated schools. Credible rankings can also help lower-ranked schools identify weaknesses in their programs, improve their offerings and facilities, and compete better with world-class schools.

Students would be well-advised not to go by a school’s rank on a single measure. Every school has some area where it performs well and earns a great rank, but that shows only that the school excels on that one criterion. The trick is to see how the school ranks within its larger peer group.

Also read:
List of MBA Rankings
 




7 thoughts on “Are MBA rankings losing their credibility?”

    • I am an Indian, and I understand your feelings. Yes, many of us are in those not-so-good institutes, but I am also sure many of us are in some of the best universities.

  1. I’m currently in my final year of B.Com and want to join an MBA program next year. Which top colleges accept applicants without work experience?

  2. Sir, I have passed BSc Chemistry with 60%. Now I am confused between an MSc and an MBA: though I have an interest in chemistry, it is not that strong, and I won’t pursue a PhD. Please suggest whether I should go for an MBA, as I think it will help me get a job quicker. And please tell me if there is any good career after an MSc in Chemistry if I am not going for a PhD.

  3. Hi, my name is Bipasha. I am currently pursuing a B.Tech in Computer Science Engineering, and I want to do my master’s in the US. But I don’t know which one to go for, an MBA or an MS; I’m totally confused. What should I do? Should I wait a year after graduation to get some work experience, or should I take the GRE or GMAT right after graduating? Please help.

  4. Hi, I am Vivek. I completed my B.Com (Corporate) in May 2017 and am pursuing my postgraduate MBA (Finance). What should I do next to study for an MS abroad? Please give me some of your suggestions. Thank you.

  5. @Oracle: I don’t think it would be fair to link this practice to any nationality.

    @Rehaman: Almost all good schools insist on work experience. Here’s why: https://www.mbacrystalball.com/blog/2011/08/21/work-experience-why-is-it-important-for-international-mba-applications/

    @Piyali: Since Chemistry still interests you, an MSc may be a better choice. An MBA won’t necessarily get you a job faster.

    @Bipasha: We’ve tackled the question of MBA vs MS after engineering here: https://www.mbacrystalball.com/blog/2015/06/01/mba-or-ms-after-btech-be-engineering/

    @Vivek: It may be better to focus on making the most out of the MBA degree you’ll have soon, rather than going for another degree.

