This in-depth look at U.S. college rankings offers a fresh perspective on the high school student’s college search and a wealth of resources to help find the “right” school.
BY FRANCESCA KELLY
Just as your high school student begins their senior year, guess what pops up on the newsstand? That’s right: it’s the U.S. News & World Report annual special issue of America’s top colleges.
Started 30 years ago, this list of ranked colleges has become a huge phenomenon among high school seniors, their parents, alumni, and the colleges and universities themselves. Although newer lists now exist, published by Washington Monthly and others, the U.S. News rankings are still the most popular.
But how helpful are they? Let’s take them apart to see how they work. Then we’ll look at some alternative lists of U.S. colleges that may be more useful in finding the right school.
U.S. News began ranking colleges back in 1983, based on a simple questionnaire sent to college presidents asking which colleges they considered “the best.” In 1987, the publication became a standalone, annual issue of the magazine, and colleges began to take notice and demand more objective methodology. U.S. News then expanded its opinion survey to include deans and administrators, and added criteria such as SAT scores of applicants and the colleges’ retention rates.
Over the years, the magazine’s editors have met regularly with college officials, guidance counselors and others in an effort to respond to criticism, revise their methodology and expand their market.
Since the “Best Colleges” list was first published, it has become enormously successful, and U.S. News has expanded rankings to include high schools, graduate schools and other institutions, as well as a new “Best Global Universities” list. Its education webpage receives 30 million visits per month.
U.S. News offers a list of about 1,800 colleges and universities, which constitute roughly half of the total number of higher learning institutions in the United States.
These are divided into four categories: national universities, national liberal arts colleges, regional universities and regional colleges.
The following factors go into determining a college’s score, and hence, its ranking. Each factor’s weight is given as a percentage of the score.
Academic Reputation (22.5 percent). This is based on peer assessment, with surveys collecting data from college administrators and faculty, as well as high school guidance counselors.
Retention (22.5 percent). Eighty percent of this factor is based on the six-year graduation rate, and 20 percent on the freshman retention rate.
Faculty Resources (20 percent). One of the most complicated factors in determining rank, this comprises several components: average class size and faculty salaries, as well as student-faculty ratio, highest degree in field, etc.
Student Selectivity (12.5 percent). Also using multifaceted methodology, student selectivity incorporates SAT and ACT scores for an entering freshman class (65 percent), as well as class rank, with a higher standard for national than for regional entities. The acceptance rate is also a factor in selectivity.
Financial Resources (10 percent). This is not about how much money a college has, but how much it spends on each student for instruction, research and student services. Spiffy dorms and Olympic-sized swimming pools don’t factor into this measurement.
Graduation Rate Performance (7.5 percent). This is a relatively new factor, only in its second year. What this specifically measures is a class’s actual rate of graduation compared to what was predicted for that class six years earlier. Students’ test scores and financial aid are factored into the equation, since these have an effect on the timeliness of graduation.
Alumni Giving Rate (5 percent). This is considered an indication of alumni satisfaction.
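The weighting scheme above amounts to a simple weighted sum. The sketch below illustrates it in Python; the weights come from the article, but the factor scores are hypothetical normalized values (0–100) for an imaginary college, not real U.S. News data.

```python
# Weights as described in the article (they sum to 100 percent).
WEIGHTS = {
    "academic_reputation": 0.225,
    "retention": 0.225,
    "faculty_resources": 0.20,
    "student_selectivity": 0.125,
    "financial_resources": 0.10,
    "graduation_rate_performance": 0.075,
    "alumni_giving": 0.05,
}

def composite_score(scores):
    """Weighted sum of a college's normalized factor scores."""
    return sum(WEIGHTS[factor] * value for factor, value in scores.items())

# The retention factor is itself a blend: 80 percent six-year
# graduation rate, 20 percent freshman retention rate.
def retention_factor(six_year_grad_rate, freshman_retention):
    return 0.8 * six_year_grad_rate + 0.2 * freshman_retention

# Hypothetical scores for an imaginary college.
example = {
    "academic_reputation": 78,
    "retention": retention_factor(85, 93),  # blends to about 86.6
    "faculty_resources": 70,
    "student_selectivity": 81,
    "financial_resources": 65,
    "graduation_rate_performance": 72,
    "alumni_giving": 40,
}

print(round(composite_score(example), 2))
```

Note how heavily reputation and retention dominate: together they account for 45 percent of the score before any measure of teaching or outcomes enters the picture.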
Colleges that choose not to take part in the rankings may still end up on the list: U.S. News footnotes them as “non-responders,” but gathers data on them from other sources, including the American Association of University Professors, the National Collegiate Athletic Association, the Council for Aid to Education and the U.S. Department of Education’s National Center for Education Statistics.
The success of the U.S. News rankings has spawned other ranking indexes from other publications, news entities and college-related organizations. And, of course, college guidebooks like Fiske, Peterson, Princeton Review and others that have been around for decades are now online, as well. Here is a selection of some of those resources. Varying widely in methodology and focus, they are listed alphabetically.
Fiske is available as a printed guidebook and also as a useful college search website where you can search colleges by different categories and do a self-survey to help narrow down choices.
Forbes has ranked colleges using a methodology that is based more on outcomes than on applicant qualifications. Calling the U.S. News rankings “abstract” and “wasteful,” Forbes centers its list on return on investment, with student satisfaction and post-graduate success among the biggest factors.
Kiplinger focuses its attention on “best value” institutions, divided into private universities, private liberal arts colleges and public universities. There are charts detailing average amount of debt after graduation by school, for example.
Money Magazine recently introduced rankings that measure which schools give you the most bang for your tuition buck, focusing on quality of education, affordability and outcomes.
The New York Times’ Upshot section ventured into alternative college rankings earlier this year, focusing on colleges that enroll students who are economically diverse.
The education portion of Niche’s website (formerly College Prowler) offers rankings that are based on student assessments and cover a variety of factors.
Peterson’s has been providing college search information for a long time, and its website offers practical college search tools, such as colleges listed by geography and major.
Princeton Review has a list for everything: best campus food, best professors, etc. Both their books and their website are student-oriented.
Washington Monthly came up with alternative rankings a few years ago, touting a list that “asks not what colleges can do for you, but what colleges are doing for the country.” Washington Monthly’s website states: “We rate schools based on their contribution to the public good in three broad categories: Social Mobility (recruiting and graduating low-income students), Research (producing cutting-edge scholarship and Ph.D.s) and Service (encouraging students to give something back to their country).” This year, they also included a list of worst colleges.
Wintergreen Orchard House, one of the main compilers of statistics for institutions of higher learning, is a destination for data-heads and guidance counselors who want a complete library of college data and statistics.
In late October, U.S. News released a new ranking index of the top 500 universities worldwide. Many of the criteria used in the methodology remain subjective, such as “global reputation,” and some of the U.S.-centric factors simply do not work when ranking schools in other countries, often because data such as selectivity are not measured by foreign universities.
U.S. News relied heavily on Thomson Reuters’ Academic Reputation Survey, which measures such factors as the number of doctorates awarded, the number of publications from faculty, etc.
Interestingly, while Princeton often gets the sought-after number one spot on the U.S. rankings list, Harvard came out on top in this index, followed by three more U.S. institutions: Massachusetts Institute of Technology, Berkeley and Stanford. Oxford and Cambridge are also in the top 10, as well as Caltech, UCLA, the University of Chicago and Columbia University.
Because research and publications are heavily weighted, small American liberal arts colleges don’t stand much of a chance of getting ranked here.
Perusing this list may be of value to the Foreign Service dependent who wants to expand his or her educational opportunities beyond the United States. It also spotlights those American universities that may have a better reputation worldwide.
Although the media are making a fuss over the new U.S. News global rankings, London-based Times Higher Education has also been ranking global universities for years. Seven out of the top 10 schools on their list are American universities.
Sound familiar? Its reliance on Thomson Reuters for data means that the new U.S. News global list is more or less identical to the THE list. Other, lesser-known lists of global universities can be found online, as well.
So you live overseas, and you’ve got to narrow down your choices for college without a whole lot of knowledge. Wouldn’t college ranking indexes be a good place to start?
The answer is a very qualified yes, as long as you understand that rankings are only a small part of a much bigger picture.
Mona Molarsky, an education and arts writer who also counsels students as the online “College Strategist,” explains: “College rankings are mostly used by people who aren’t very familiar with the educational landscape in the United States. If you consult these rankings with the understanding that the numbers are really just crude, ballpark estimates, you can get a general idea of a school’s reputation.”
Molarsky admits that using the rankings as a basis for comparison between schools might encourage a student to “dig further,” but cautions against taking the rank of any particular college seriously: “Should you base your college decision on the fact that U.S. News ranked Williams College #1 among national liberal arts colleges this year, while they ranked Haverford College #8? Absolutely not.”
Many experts agree that rankings or “top college” lists are probably not a good way to make a college decision, and some believe they are, in fact, harmful. In a recent article in Forbes (yes, the same magazine that publishes its own college rankings), writer Andrew Kelly explains that colleges can manipulate their standing in the rankings by raising tuition and rejecting more applicants, thus making them more selective.
He adds: “As long as we continue to define ‘the best colleges’ as those that enroll the best students—as opposed to those that teach their students the most or deliver the best return on investment—rankings competition will do little to expand educational opportunity.”
Colleges can manipulate rankings in many ways—some ethical, some not. For example, if a college wait-lists applicants whom they would ordinarily accept but are not sure will attend, those students will not count as “accepted students” unless they decide to enroll.
As a result, the “percentage of accepted students who enroll” statistic, also known as yield, which is used by many indexes, stays high for that college. Every college wants to be considered its students’ top choice, after all.
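The wait-list maneuver described above is easy to see in the arithmetic. The numbers below are invented purely for illustration; they do not describe any real college.

```python
# Yield is the percentage of accepted students who enroll.
def yield_rate(enrolled, accepted):
    return enrolled / accepted

# Straightforward admissions (invented numbers): the college
# accepts 1,000 applicants, of whom 300 enroll.
honest = yield_rate(enrolled=300, accepted=1000)

# Same applicant pool, but 400 students the college doubts will
# attend are wait-listed instead of accepted outright, leaving 600
# outright accepts. Suppose 20 wait-listed students later enroll;
# only then do they count as "accepted."
managed = yield_rate(enrolled=280 + 20, accepted=600 + 20)

print(f"{honest:.0%} vs {managed:.0%}")
```

The same 300 students show up on campus either way, but the reported yield jumps from 30 percent to roughly 48 percent simply by reclassifying likely decliners.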
Other ways of manipulating statistics over the years have included offering incentives to admitted students to retake SATs to get a higher score; not admitting students with lower scores until later in the year after data is submitted; and, of course, encouraging as many students as possible to apply, even if they have no hope of being admitted, simply so the school can reject more of them, upping its selectivity.
Some schools have been found to conveniently “leave out” SAT and other admission test scores of their international applicants, as non-native English speakers tend to do poorly on these tests. Other schools have reported as an applicant anyone who had completed even part of their application, even if that student never actually applied.
When colleges have been discovered to have deliberately falsified data, as Claremont McKenna did a few years ago, they have been “punished” by being left off the list for a year. In the latest U.S. News “Best Colleges” list, Claremont McKenna is back, with a coveted number-eight ranking among national liberal arts colleges.
Even though U.S. News and other ranking indexes rely on independent data services to a certain extent, most of the data they receive is from the colleges themselves. Flagler College in Florida is the latest college among a growing list to have admitted to inflating data such as SAT scores for the U.S. News rankings.
As mentioned earlier, some colleges have chosen not to take part in ranking indexes. Reed College is perhaps the most notable, yet U.S. News still ranks it #77 among national liberal arts colleges, based on data gathered elsewhere—a rank some experts feel is meaningless. (Reed provides its own data on its website.)
But most colleges do take part in at least some ranking indexes, devoting time and resources to filling out surveys and questionnaires from data-compiling agencies. As cumbersome as participating is, opting out may hurt a college or university’s standing in the rankings, or even disqualify it altogether.
In fact, not answering just one question can keep a school from getting a rank. Kristin McKinley, associate director of research administration at Lawrence University, a small liberal arts college on the banks of the Fox River in leafy Appleton, Wisconsin, explains: “For a school to be ranked in U.S. News Best Colleges (2015 edition), there was a single question determining eligibility: Does your institution make use of SAT, ACT or SAT Subject Test scores in admission decisions for first-time, first-year, degree-seeking applicants? A school that answered ‘no’ was listed as ‘Unranked.’”
Because methodology varies among ranking entities, some colleges fare better with one index than with another. For example, among the data collected by many ranking indexes, graduation rate is perhaps the most common factor and tends to be weighted the highest.
“Yet even this figure varies based on type and calculation,” says McKinley. “At our institution, we focus on a six-year rate, given we have a double-degree program and many of our students have more than one major.”
In other words, if a ranking index uses a four-year rather than a six-year graduation rate, it would tend to work against a school that offers double majors or combined degree programs.
“College rankings are poor guides with regard to the one thing that should really matter: Will this particular student find this school to be an optimal learning environment? No ranking can answer that question,” argues George Leef, director of research for the John William Pope Center for Higher Education.
Decrying the idea of “elite” schools that appear to offer a better education than schools low on the ranking list, Leef points out that many students learn more and better at small colleges whose professors are more dedicated to teaching than to big-name research.
Indeed, certain components of a successful college experience are never going to appear in ranking indexes: mentoring opportunities, for example, or whether the college is a “commuter school” that empties out on weekends, or whether it is in or near a city with a rich cultural environment.
Factors that are especially important for Foreign Service kids, such as how close the college is to stateside relatives, the cost of overseas airfare and how many international students there are on campus, don’t show up in a ranking. Yet these are vital issues; they require more research than just looking at a number on a list.
According to Northwestern University Associate Provost for University Enrollment Michael Mills, ranking indexes can be useful “if they measure meaningful aspects of the undergraduate experience, and are used in conjunction with all the sources of information about individual colleges.”
Determining which experiences are meaningful is up to the individual, but Mills posits that they may include “small class sizes, academic credentials of entering freshmen (learning from peers) and success rates (retention and graduation rates).”
One way the rankings can be helpful is to allow students to compare their transcripts and admissions test scores with those of the “typical” student at a certain university. That will give a clearer idea of their chances of admission.
Students can also use the lists as a jumping-off point, and then find the specific indexes, using the sites listed above and others, to assess factors like geography, size or specific programs in certain majors.
Reading guidebooks and using websites such as About.com’s college search section allow a student to delve deeper than simply perusing single lists of college rankings.
For better or for worse, in some circles there is still importance attached to prestige. Going to a “name” school, or at least one that most people have heard of, may open certain doors, and students need to consider that.
Yes, we all know there are plenty of wonderful colleges out there where students get a fabulous education. But if prestige is important to a student, then the rankings do show what college administrators regard as the most elite institutions.
Yet prestige isn’t everything. Loren Pope, author of Colleges That Change Lives, and Washington Post education columnist Jay Mathews, author of Harvard Schmarvard, have argued along with others that a college’s name is not enough to guarantee a good education, or at least, the right education for every individual.
Pope’s very popular Colleges That Change Lives inspired the nonprofit organization of the same name. CTCL is dedicated to the advancement and support of a student-centered college search process. Founded in 1998, it hosts information sessions nationwide and coordinates outreach efforts with high school counselors and college counseling agencies to educate families on the importance of understanding an individual student’s needs and how they “fit” with the mission and identity of a particular college community.
The CTCL website provides valuable information, news and resources on current issues in higher education, as well as common misperceptions about the college search process.
Says one Foreign Service parent whose child went to a Virginia public university, “I went to a ‘name’ school, basically hated it, lived on bagels and ramen, worked 20-plus hours a week the entire time, and came out of it in debt.”
And parent Victoria Hess, whose son Andrew attended the University of Wyoming (ranked #161 on the U.S. News list of national universities), says, “To graduate, he had to pass a rigorous national engineering exam. He would have had to pass the same exam at Johns Hopkins (ranked #12), where he also was accepted, but which he rejected due to cost. And at Wyoming, Andrew found a mentor—someone who really cared about him.”
In a 2013 speech at the State University of New York-Buffalo, President Barack Obama declared a crisis in college affordability and the need for restructuring, including a new ratings system for colleges based on return on investment. Washington Monthly, which started “alternative” rankings in 2005, immediately welcomed this news as in line with their own philosophy.
The trend toward value for money in college ranking indexes is on the upswing. Washington Monthly’s methodology, for example, favors public institutions over elite private ones, and applauds colleges like Berea College, which awards every admitted student a scholarship covering tuition.
Other college rankings indexes are starting to shift their focus to value of investment, as well. And why shouldn’t they, when college expenses run into the tens, even hundreds of thousands of dollars?
For that reason and others, choosing a college is generally the first major decision a young adult makes. And it’s a very personal decision. A short glance over the rankings can be helpful. But you can lose perspective quickly and buy into the too-prevalent idea that an “elite” college is the only worthwhile place for your education.
As college strategist Molarsky says, “It’s important to take all these numbers with a big grain of salt, because it’s really impossible to quantify the quality of an education.”