MBA Program Rankings

See also: List of United States business school rankings, Australian MBA Star Ratings, and Rankings of business schools in South Africa

As MBA programs proliferated over time, differences in the quality of schools, faculty, and course offerings became evident. To establish criteria for assessing quality across MBA programs, a variety of publications began compiling program information and publishing rankings, using methods of varying validity. The Gourman Report, which ran from 1967 until 1997, did not disclose its criteria or ranking methods, and its reports were criticized for presenting statistically impossible data, such as no ties among schools, narrow gaps in scores with no variation in gap widths, and ranks for nonexistent departments. In 1977 The Carter Report published rankings of MBA programs based on the number of academic articles published by faculty. Also in 1977, the Ladd & Lipset Survey based its rankings on opinion surveys of business school faculty, and MBA Magazine ranked schools on votes cast by business school deans.

More recently, well-known publications such as U.S. News & World Report, Businessweek, the Financial Times, The Economist, The Wall Street Journal, and Forbes have published rankings of selected MBA programs. A school's rank often varies significantly across publications because each publication uses a different methodology. The U.S. News & World Report ranking incorporates responses from deans, program directors, and senior faculty about the academic quality of their programs, as well as the opinions of hiring professionals. The ranking is calculated through a weighted formula of quality assessment (40%), placement success (35%), and student selectivity (25%). The Businessweek rankings are similarly based on student surveys, a survey of corporate recruiters, and an intellectual capital rating. The Financial Times incorporates criteria including survey responses from alumni who graduated three years before the ranking and information supplied by business schools; salary and employment statistics are weighted heavily. Rankings compiled by the Economist Intelligence Unit and published in The Economist result from surveys administered to business schools (80%) and to students and recent graduates (20%); ranking criteria include GMAT scores, employment and salary statistics, class options, and student body demographics. Although The Wall Street Journal stopped ranking full-time MBA programs in 2007, its rankings were based on skills and behaviors thought to contribute to career success, such as social skills, teamwork orientation, ethics, and analytic and problem-solving abilities. In contrast to the rankings above, the Forbes MBA ranking considers only the return on investment five years after graduation: alumni are asked about their salary, the tuition fees and other direct costs of their MBA program, and the opportunity costs involved, and from this data a final "5-year gain" is calculated that determines a program's position in the ranking.
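Several of these rankings are explicit weighted formulas, so a small sketch can make the arithmetic concrete. The weights below are the U.S. News weights quoted above, and the "5-year gain" mirrors the Forbes description; everything else (the 0-100 component scores, the assumption of two years of forgone pre-MBA salary as the opportunity cost, and the sample figures) is illustrative rather than taken from any publication's actual methodology.

    # Illustrative sketch only: the weights are the U.S. News figures quoted
    # above; the 0-100 component scores and the Forbes-style cost assumptions
    # are hypothetical.
    WEIGHTS = {"quality_assessment": 0.40, "placement_success": 0.35, "selectivity": 0.25}

    def weighted_score(components):
        """Weighted sum of component scores already normalized to 0-100."""
        return sum(WEIGHTS[name] * value for name, value in components.items())

    def five_year_gain(post_mba_salaries, pre_mba_salary, tuition_and_fees):
        """Forbes-style gain: five years of post-MBA pay minus tuition, fees,
        and an assumed opportunity cost of two years of forgone pre-MBA pay."""
        return sum(post_mba_salaries) - tuition_and_fees - 2 * pre_mba_salary

    # A hypothetical school scoring 85, 78, and 90 on the three components:
    print(weighted_score({"quality_assessment": 85, "placement_success": 78, "selectivity": 90}))
    # 0.40*85 + 0.35*78 + 0.25*90 = 83.8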

An often overlooked differentiator among MBA rankings is the weight attributed to each participating group's answers. At first glance, for instance, the Financial Times Global MBA Ranking appears to place more emphasis on the opinions of schools' representatives than on those of alumni: schools provide data for 11 of 20 criteria, whereas alumni contribute to only 8. The alumni answers, however, carry a weight of 59 percent, while the schools' answers carry only 31 percent, so the ranking in fact relies heavily on alumni opinion. In contrast, the Economist MBA ranking relies primarily on data provided by business schools, and the Bloomberg Businessweek MBA ranking gives equal emphasis to the opinions of alumni and corporate recruiters.

Other rankings base their methodologies on attributes other than standardized test scores, graduate salaries, and recruiter opinions. The Beyond Grey Pinstripes ranking, published by the Aspen Institute, is based on the integration of social and environmental stewardship into university curricula and faculty research. Rankings are calculated from the amount of sustainability coursework made available to students (20%), the amount of student exposure to relevant material (25%), the amount of coursework focused on stewardship by for-profit corporations (30%), and relevant faculty research (25%). The 2011 survey and ranking included data from 150 universities. The QS Global 200 Business Schools Report compiles regional rankings of business schools around the world; ranks are calculated using a two-year moving average of points assigned by employers who hire MBA graduates. Since 2005, the UT-Dallas Top 100 Business School Research Rankings has ranked business schools on the research their faculty publish, much like The Carter Report of the past.
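The two-year moving average used in the QS report amounts to simple smoothing: each year's published points are averaged with the previous year's. The sketch below uses hypothetical point values and, as an assumption, treats the first year as its own baseline since no prior year exists.

    # Two-year moving average of employer-assigned points (hypothetical data).
    def two_year_moving_average(points_by_year):
        """Average each year's points with the previous year's; the first year
        (no predecessor) is assumed to stand on its own."""
        return {
            year: (points + points_by_year.get(year - 1, points)) / 2
            for year, points in sorted(points_by_year.items())
        }

    print(two_year_moving_average({2010: 72.0, 2011: 80.0, 2012: 76.0}))
    # {2010: 72.0, 2011: 76.0, 2012: 78.0}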

The ranking of MBA programs has been discussed in articles and on academic Web sites. Critics of ranking methodologies maintain that any published rankings should be viewed with caution for the following reasons:

  • Rankings limit the population size to a small number of MBA programs and ignore the majority of schools, many with excellent offerings.
  • The ranking methods may be subject to biases and statistically flawed methodologies (especially for methods relying on subjective interviews of hiring managers).
  • The same list of well-known schools appears in each ranking with some variation in ranks, so a school ranked as number 1 in one list may be number 17 in another list.
  • Rankings tend to concentrate on the school itself, but some schools offer MBA programs of different qualities (e.g. a school may use highly reputable faculty to teach a daytime program, and use adjunct faculty in its evening program).
  • A high rank in a national publication tends to become a self-fulfilling prophecy.
  • Some leading business schools, including Harvard, INSEAD, Wharton, and Sloan, provide only limited cooperation with certain ranking publications because they perceive the rankings to be misused.

One study found that objectively ranking MBA programs by a combination of graduates' starting salaries and average student GMAT scores can reasonably reproduce the top-20 lists of the national publications. The study concluded that a truly objective ranking would be individualized to the needs of each prospective student. National publications have recognized the value of ranking against different criteria and now offer lists ranked in different ways: by salary, by students' GMAT scores, by selectivity, and so forth. Other publications have produced “rankings of the rankings”, which aggregate and summarize the findings of multiple independent rankings. While useful, these meta-rankings have yet to address the criticisms that rankings are not tailored to individual needs, draw on an incomplete population of schools, may fail to distinguish among the different MBA program types a school offers, and rely on subjective interviews.
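The study's exact formula is not reproduced here, so the sketch below shows just one plausible way to combine the two measures: standardize starting salary and average GMAT as z-scores, average them, and sort. The school names and figures are invented for illustration.

    # One plausible "objective" combination of salary and GMAT (illustrative
    # data; the cited study's actual formula may differ).
    from statistics import mean, pstdev

    schools = {
        "School A": {"salary": 145000, "gmat": 728},
        "School B": {"salary": 120000, "gmat": 690},
        "School C": {"salary": 132000, "gmat": 705},
    }

    def zscores(values):
        mu, sigma = mean(values), pstdev(values)
        return [(v - mu) / sigma for v in values]

    names = list(schools)
    salary_z = zscores([schools[n]["salary"] for n in names])
    gmat_z = zscores([schools[n]["gmat"] for n in names])
    combined = {n: (s + g) / 2 for n, s, g in zip(names, salary_z, gmat_z)}

    for rank, (name, score) in enumerate(sorted(combined.items(), key=lambda kv: -kv[1]), 1):
        print(rank, name, round(score, 2))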
