
We’re #1!

Good news:

Princeton, Williams College once again take top spots in U.S. News’ rankings for 2014-15

1) Every time that Williams appears in a headline like this with Princeton, the value of the Williams brand improves. It is very important that we maintain this #1 ranking, mainly for admissions, and especially for international students.

2) Kudos to Adam Falk (and everyone else at Williams) for making this happen. US News tinkers with its methodology from year to year, and it would sell more magazines if there were more turnover at the top, so maintaining a #1 ranking is no small feat.

3) As I mention each year, there is a great senior thesis to be written about the rankings, similar to this article on the US News law school rankings. If you write such a thesis, hundreds of people around the country will read it.

4) Is anyone a subscriber to the detailed data? All I can see is:

[screenshot omitted]

We need to dive into the details. How far ahead is Williams, and what do we need to do to stay there?

5) Recall my predictions from 5 years ago.

Although the competition is tough, our most serious competitor is Amherst and they will face real headwinds given their financial constraints. Their endowment is in more trouble than ours. Their increase in enrollment will hurt the student:faculty ratio. These ranks are based on data from before the financial crash, so the Williams advantage over Amherst will only continue. Don’t be surprised if/when Amherst falls behind Swarthmore in a year or two. I also suspect that Middlebury’s recent (and deserved) rise may be in danger.

Amherst hasn’t caught us, as predicted, and Middlebury has fallen from 4th to 7th. I still think that Amherst is in danger of falling behind Swarthmore, but we need more detailed data to evaluate that.

6) Below the break are the details of the methodology, which I am saving here for historical purposes.

The host of intangibles that makes up the college experience can’t be measured by a series of data points. But for families concerned with finding the best academic value for their money, the U.S. News Best Colleges rankings provide an excellent starting point for the search.

They allow you to compare at a glance the relative quality of institutions based on such widely accepted indicators of excellence as freshman retention and graduation rates and the strength of the faculty. And as you check out the data for colleges already on your short list, you may discover unfamiliar schools with similar metrics, and thus broaden your options.

Many factors other than those spotlighted here will figure in your decision, including location and the feel of campus life; the range of academic offerings, activities and sports; and cost and the availability of financial aid. But if you combine the information on usnews.com with college visits, interviews and your own intuition, our rankings can be a powerful tool in your quest for the right college.

How the Methodology Works

The U.S. News ranking system rests on two pillars. The formula uses quantitative measures that education experts have proposed as reliable indicators of academic quality, and it’s based on our researched view of what matters in education.

First, schools are categorized by their mission, which is derived from the breakdown of types of higher education institutions as refined by the Carnegie Foundation for the Advancement of Teaching in 2010. The Carnegie classification, which is used extensively by higher education researchers, has been the basis of the Best Colleges ranking category system since our first rankings were published in 1983.

The U.S. Department of Education and many higher education associations use the system to organize their data and to determine colleges’ eligibility for grant money. In short, the Carnegie categories are the accepted standard in higher education. The category names we use are our own – National Universities, National Liberal Arts Colleges, Regional Universities and Regional Colleges – but their definitions rely on the Carnegie principles.

National Universities offer a full range of undergraduate majors, plus master’s and doctoral programs, and emphasize faculty research. National Liberal Arts Colleges focus almost exclusively on undergraduate education. They award at least 50 percent of their degrees in the arts and sciences.

Regional Universities offer a broad scope of undergraduate degrees and some master’s degree programs but few, if any, doctoral programs. Regional Colleges focus on undergraduate education but grant fewer than 50 percent of their degrees in liberal arts disciplines; this category also includes schools that have small bachelor’s degree programs but primarily grant two-year associate degrees.

Regional Universities and Regional Colleges are further divided and ranked in four geographical groups: North, South, Midwest and West.

Once schools have been divided by category, we gather data from each college on up to 16 indicators of academic excellence. Each factor is assigned a weight that reflects our judgment about how much a measure matters. Finally, the colleges and universities in each category are ranked against their peers, based on their composite weighted score.

U.S. News didn’t make any changes to the Best Colleges ranking methodology for the 2015 edition.

Unranked Schools

Schools are Unranked and listed separately by category if they have indicated that they don’t use SAT or ACT test scores in admissions decisions for first-time, first-year, degree-seeking applicants. And, in a few cases, schools are Unranked if too few respondents to the spring and summer 2014 peer assessment survey gave them a rating.

Other reasons institutions are not ranked include: a total enrollment of fewer than 200 students, a large proportion of nontraditional students and no first-year students – as is the situation at so-called upper-division schools.

As a result of these eligibility standards, many of the for-profit institutions have been grouped with the Unranked schools; their bachelor’s degree candidates are largely nontraditional students in degree completion programs, for example, or they don’t use SAT or ACT test scores in admissions decisions.

In total, 148 colleges in the National Universities, National Liberal Arts Colleges, Regional Universities and Regional Colleges categories are listed as Unranked.

We also did not rank 83 highly specialized schools in arts, business and engineering.

Data Sources

Most of the data come from the colleges. This year, 91.5 percent of the 1,365 ranked colleges and universities we surveyed returned their statistical information during our spring and summer 2014 data collection window.

Ranked colleges are defined as those in the National Universities, National Liberal Arts Colleges, Regional Universities and Regional Colleges categories that are numerically ranked or listed as Rank Not Published.

In total, U.S. News has collected data on nearly 1,800 colleges, all of which appear on usnews.com, but only 1,365 are included in the rankings described in this methodology and given an actual numerical rank or a Rank Not Published designation.

We obtained missing data from a number of sources, including the American Association of University Professors (faculty salaries), the National Collegiate Athletic Association (graduation rates), the Council for Aid to Education (alumni giving rates) and the U.S. Department of Education’s National Center for Education Statistics (information on financial resources, faculty, SAT and ACT admissions test scores, acceptance rates and graduation and retention rates).

Estimates, which are not displayed by U.S. News, may be used in the ranking calculation when schools fail to report particular data points that are not available from other sources. Missing data are reported as N/A in the ranking tables.

For colleges that were eligible to be ranked but refused to fill out the U.S. News statistical survey in the spring and summer of 2014, we have made extensive use of the statistical data those institutions were required to report to the National Center for Education Statistics, including such factors as SAT and ACT scores, acceptance rates, number of faculty, student-faculty ratios and graduation and retention rates. These schools are footnoted as nonresponders.

Ranking Model Indicators

The indicators we use to capture academic quality fall into a number of categories: assessment by administrators at peer institutions, retention of students, faculty resources, student selectivity, financial resources, alumni giving, graduation rate performance and, for National Universities and National Liberal Arts Colleges only, high school counselor ratings of colleges.

The indicators include input measures that reflect a school’s student body, its faculty and its financial resources, along with outcome measures that signal how well the institution does its job of educating students.

The measures, their weights in the ranking formula and an explanation of each follow.

Undergraduate academic reputation (22.5 percent): The U.S. News ranking formula gives significant weight to the opinions of those in a position to judge a school’s undergraduate academic excellence. The academic peer assessment survey allows top academics – presidents, provosts and deans of admissions – to account for intangibles at peer institutions, such as faculty dedication to teaching.

To get another set of important opinions on National Universities and National Liberal Arts Colleges, we also surveyed 2,152 counselors at public high schools, each of which was a gold, silver or bronze medal winner in the U.S. News rankings of Best High Schools published in April 2013, as well as 400 college counselors at the largest independent schools. The counselors represent nearly every state and the District of Columbia.

Each academic and counselor surveyed was asked to rate schools’ academic programs on a scale from 1 (marginal) to 5 (distinguished). Those who didn’t know enough about a school to evaluate it fairly were asked to mark “don’t know.”

The score used in the rankings is the average score of those who rated the school on the 5-point scale; “don’t knows” are not counted as part of the average. In the case of National Universities and National Liberal Arts Colleges, the academic peer assessment accounts for 15 percentage points of the weighting in the ranking methodology, and 7.5 percentage points go to the counselors’ ratings.

For the third year in a row, the results from the two most recent years of counselor surveys, from spring 2013 and spring 2014, were averaged to compute the high school counselor reputation score. This was done to increase the number of ratings each college received from the high school counselors and to reduce the year-to-year volatility in the average counselor score.

The academic peer assessment score continues to be based only on the most recent year’s results – in this case, from spring 2014. Both the Regional Universities and Regional Colleges rankings continue to rely on one assessment score, by the academic peer group.

In order to reduce the impact of strategic voting by respondents, we eliminated the two highest and two lowest scores each school received before calculating the average score. Ipsos Public Affairs collected the data in spring 2014; of the 4,533 academics who were sent questionnaires, 42 percent responded. This response rate is unchanged from the survey conducted in spring 2013 for the 2014 edition of Best Colleges. The counselors’ one-year response rate was 9 percent for the spring 2014 surveys.
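For the quantitatively inclined, here is a minimal sketch (in Python, with invented ratings) of the trimmed average described above. US News does not publish its actual processing, so treat this as an illustration of the stated rule, not their code.

# Average peer-assessment ratings as described in the methodology: the
# ratings list already excludes "don't know" responses; drop the two
# highest and two lowest ratings, then take the mean of what remains.
# Assumes at least five ratings; the numbers below are made up.

def peer_assessment_score(ratings):
    """ratings: integers 1 (marginal) to 5 (distinguished), one per respondent."""
    trimmed = sorted(ratings)[2:-2]      # discard two highest and two lowest
    return sum(trimmed) / len(trimmed)   # average on the 5-point scale

ratings = [5, 5, 4, 4, 4, 3, 3, 2, 5, 1]
print(round(peer_assessment_score(ratings), 2))  # 3.83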

Retention (22.5 percent): The higher the proportion of freshmen who return to campus for sophomore year and eventually graduate, the better a school is apt to be at offering the classes and services that students need to succeed.

This measure has two components: six-year graduation rate (80 percent of the retention score) and freshman retention rate (20 percent). The graduation rate indicates the average proportion of a graduating class earning a degree in six years or less; we consider freshman classes that started from fall 2004 through fall 2007. Freshman retention indicates the average proportion of freshmen who entered the school in the fall of 2009 through fall 2012 and returned the following fall.
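In code, the retention component is just an 80/20 blend of the two averages. A sketch with illustrative rates:

# Retention score: 80% average six-year graduation rate (fall 2004-2007
# cohorts) plus 20% average freshman retention (fall 2009-2012 cohorts).
# The rates below are invented for illustration.

grad_rates = [0.94, 0.95, 0.96, 0.95]
retention_rates = [0.97, 0.98, 0.97, 0.98]

avg = lambda xs: sum(xs) / len(xs)
retention_score = 0.80 * avg(grad_rates) + 0.20 * avg(retention_rates)
print(round(retention_score, 3))  # 0.955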

Faculty resources (20 percent): Research shows that the more satisfied students are about their contact with professors, the more they will learn and the more likely they are to graduate. We use six factors from the 2013-2014 academic year to assess a school’s commitment to instruction.

Class size has two components: the proportion of classes with fewer than 20 students (30 percent of the faculty resources score) and the proportion with 50 or more students (10 percent of the score).

Faculty salary (35 percent) is the average faculty pay, plus benefits, during the 2012-2013 and 2013-2014 academic years, adjusted for regional differences in the cost of living using indexes from the consulting firm Runzheimer International. We also weigh the proportion of professors with the highest degree in their fields (15 percent), the student-faculty ratio (5 percent) and the proportion of faculty who are full time (5 percent).

Student selectivity (12.5 percent): A school’s academic atmosphere is determined in part by the abilities and ambitions of the students.

This measure has three components. We factor in the admissions test scores for all enrollees who took the Critical Reading and Math portions of the SAT and the composite ACT score (65 percent of the selectivity score). We also consider the proportion of enrolled freshmen at National Universities and National Liberal Arts Colleges who graduated in the top 10 percent of their high school classes or the proportion of enrolled freshmen at Regional Universities and Regional Colleges who graduated in the top quarter of their classes (25 percent). The third component is the acceptance rate, or the ratio of students admitted to applicants (10 percent).

The data are all for the fall 2013 entering class. While the ranking calculation takes account of both the SAT and ACT scores of all entering students, the ranking tables display the score range for whichever test was taken by most students.

For the second year in a row, we used clearer footnotes to indicate the schools that did not report to U.S. News the fall 2013 SAT and ACT scores for all first-time, first-year, degree-seeking students for whom the schools had data. Schools sometimes fail to report SAT and ACT scores for students in these specific categories: athletes, international students, minority students, legacies, those admitted by special arrangement and those who started in the summer of 2013. The footnotes also indicate schools that declined to tell U.S. News whether all students with SAT and ACT test scores were represented.

For schools that did not report all scores or that declined to say whether all scores were reported, we reduced the value of their SAT and ACT scores in the Best Colleges ranking model. This practice is not new; since the 1997 rankings, we have discounted the value of such schools’ reported scores in the ranking model, because the effect of leaving students out could be that lower scores are omitted. If a school told U.S. News that it included all students with scores in its reported SAT and ACT scores, then those scores were counted fully in the rankings and were not footnoted.

Financial resources (10 percent): Generous per-student spending indicates that a college can offer a wide variety of programs and services. U.S. News measures financial resources by using the average spending per student on instruction, research, student services and related educational expenditures in the 2012 and 2013 fiscal years. Spending on sports, dorms and hospitals doesn’t count.

Graduation rate performance (7.5 percent): For the second year in a row, the graduation rate performance indicator has been used in all of the Best Colleges ranking categories. This indicator of added value shows the effect of the college’s programs and policies on the graduation rate of students after controlling for spending and student characteristics, such as test scores and the proportion receiving Pell Grants. We measure the difference between a school’s six-year graduation rate for the class that entered in 2007 and the rate we predicted for the class.

If the school’s actual graduation rate for the 2007 entering class is higher than the rate U.S. News predicted for that same class, then the college is enhancing achievement, or overperforming. If a school’s actual graduation rate is lower than the U.S. News prediction, then it is underperforming.
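The arithmetic here is simple; the hard (and unpublished) part is the predicted rate, which US News derives from a model of spending and student characteristics. A sketch that treats the predicted rate as a given input rather than reproducing their model:

# Graduation rate performance: actual minus predicted six-year rate for
# the 2007 entering class. The predicted rate comes from a U.S. News
# regression that is not published; it is a stand-in input here.

def graduation_performance(actual_rate, predicted_rate):
    """Positive = overperforming; negative = underperforming."""
    return actual_rate - predicted_rate

# Hypothetical school: 93% actual vs. 89% predicted.
print(round(graduation_performance(0.93, 0.89), 2))  # 0.04, overperforming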

Alumni giving rate (5 percent): This reflects the average percentage of living alumni with bachelor’s degrees who gave to their school during 2011-2012 and 2012-2013, which is an indirect measure of student satisfaction.

To arrive at a school’s rank, we first calculated the weighted sum of its scores. The final scores were rescaled so that the top school in each category received a value of 100, and the other schools’ weighted scores were calculated as a proportion of that top score. Final scores were rounded to the nearest whole number and ranked in descending order. Schools that are tied appear in alphabetical order and are marked as tied on all ranking tables.
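Putting the top-level weights together, the final scoring step looks roughly like this. The weights are the ones given in this methodology (they sum to 100 percent); the two schools and their indicator values are invented, and each indicator is assumed to be pre-normalized to a 0-1 scale, since US News does not publish its normalization.

# Composite score: weighted sum of indicators, rescaled so the top school
# in the category gets 100, rounded, then ranked in descending order with
# alphabetical ordering within ties. Indicator values are made up.

WEIGHTS = {
    "reputation": 0.225, "retention": 0.225, "faculty": 0.20,
    "selectivity": 0.125, "financial": 0.10, "performance": 0.075,
    "alumni_giving": 0.05,
}

schools = {
    "College A": {"reputation": 0.95, "retention": 0.97, "faculty": 0.93,
                  "selectivity": 0.94, "financial": 0.92, "performance": 0.90,
                  "alumni_giving": 0.55},
    "College B": {"reputation": 0.94, "retention": 0.96, "faculty": 0.90,
                  "selectivity": 0.95, "financial": 0.91, "performance": 0.88,
                  "alumni_giving": 0.52},
}

raw = {name: sum(WEIGHTS[k] * vals[k] for k in WEIGHTS)
       for name, vals in schools.items()}
top = max(raw.values())
final = {name: round(100 * score / top) for name, score in raw.items()}

for name in sorted(final, key=lambda n: (-final[n], n)):
    print(name, final[name])   # College A 100, College B 99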

Check out usnews.com over the coming year, since we may add content to the Best Colleges pages as we obtain additional information. And as you mine these tables for insights – where your SAT or ACT scores might win you some merit aid, for example, or where you will be apt to get the most attention from professors – keep in mind that they provide a launching pad, not an easy answer.


7 Comments To "We’re #1!"

#1 Comment By frank uible On September 11, 2014 @ 10:26 am

As a lifelong procrastinator, do not temporize.

#2 Comment By hc On September 12, 2014 @ 5:10 am


Congrats to Williams.

MCLA also broke into the top ten: #9 in the nation among public liberal arts colleges. Average cost per student is 2k.

Tough to compete against the service academies given the methodology: having the academies on this list is fundamentally unfair. Their federally funded DoD pockets are “just a little deeper” than those of the other state-funded institutions on the list. The service schools are some of the best schools in the nation in any “overall” category, just like Williams. If retention of students were not measured, West Point and Annapolis would probably rank at the top of the stack, ahead of any other school. Unlike David’s point on Williams, the service academies are not selling a luxury item. Annapolis and West Point are free; you pay later!

To be fair, excluding the academies puts MCLA at #6 in the nation in its category. Not bad for a school that only costs a couple grand a year for the average student attending.

#3 Comment By Past Eph On September 13, 2014 @ 3:50 am

David, since in other contexts you seem to enjoy deriding those who make speculative, non-evidence-based claims, what is your evidence that “it is very important that we maintain this #1 ranking, mainly for admissions, and especially for international students”? There is no evidence whatsoever that Williams trumps Amherst (or Swarthmore, or even Pomona) in admissions on any criterion (average SAT scores of incoming students, percentage admitted, number of applicants), despite its consistent edge in both US News and Forbes rankings. Nor is there any evidence that it is a more desirable destination for international applicants. All four of these schools prioritize slightly different things in admissions, but in the end, they are all roughly equal across the board in terms of the type of applicants, and ultimately matriculants, they attract.

Nor is there any evidence that Williams has fared any differently in terms of relative desirability over the past 12 years, when it ranked first, than it did over the previous decade, when it mainly ranked second or third. So far as I can tell, the evidence suggests the opposite of your claim: there is absolutely no correlation between small rankings changes and relative desirability to applicants, zero. Now, I’ll grant that if Williams fell out of the top five liberal arts colleges, that might have some small impact, but that never has and never will happen so long as the US News methodology does not change dramatically. So again, to the extent Falk might consider basing institutional policy on US News rankings, that would be a very silly priority indeed.

Also, Middlebury was ranked fourth just last year. There is some variability in the rankings from year to year, and Midd is one that bounces around that 4-8 range. If you had drafted this post one year ago, would you then have had to acknowledge your prediction about Midd was wrong? A one-year blip in the rankings in no way meaningfully supports your predictions about Midd. I predict that it will rise to fourth or fifth again at some point in the next year or two (or three), and indeed, Midd is more popular with applicants than it has ever been before, by a large margin. As someone who supposedly cares about statistics, you have to admit that a single year’s ranking does not equal any sort of meaningful data point.

Finally, there is no evidence that Amherst’s endowment is in any type of trouble. They have nearly as high a total endowment as Williams does, and a higher endowment per student. Where Williams has an advantage is that it has recently completed several massively expensive building projects (theater, student center, library, football complex, humanities buildings, and before those, the science center), with few if any others on the horizon, other than gradually upgrading campus housing over time (which both schools generally engage in). Meanwhile, Amherst needs to either raise, spend down, or borrow $200-250 million for a badly needed new science center, and once that is complete, likely turn to massive upgrades of its dining, library, and student life facilities, all of which need to be addressed at some point. (Amherst is also about to build new dorms to replace ones that will need to be torn down for the science center.) But, given that it seems easier to raise huge amounts of cash in one fell swoop for building projects (naming rights!), there is no guarantee that spending will actually materially affect Amherst’s endowment, although it will certainly affect campus quality of life in the short term.

#4 Comment By Green Mt. Eph On September 13, 2014 @ 8:31 am

Word on the street is that Brown and Middlebury had submission errors: Brown forgot to supply a data point and Middlebury submitted incorrect information for its faculty resources score. Hard to know what happened in each case, though Middlebury’s comparative numbers from last year suggest this is likely true. Middlebury’s rank in that category fell 10+ places in one year, and it is improbable that a school with such a relatively high proportion of its classes taught as language courses (that is, compared to other liberal arts colleges), all capped at 15 students or fewer, can have such a low percentage of its classes under 20 students (62%). That figure is part of the faculty resources metric and is unlikely to have changed that much from past years, especially with the recent growth in the size of the faculty. Thus, there seems to be merit to the rumor of a submission error, and with the 4, 5 and 6 ranks so close, that error could easily have caused the slide. As for Brown, its student newspaper reports the error, though it is hard to see how much it affected the overall score.

#5 Comment By Eph Parent ’14 On September 13, 2014 @ 10:10 am

I have to agree with Past Eph on his Midd take, and Green Mt. Eph’s post supports my sense. I have a daughter at Midd (soph) and my son graduated from Williams this past June. He took a gap year and had a terrific Williams experience. Daughter is having an equally rewarding and excellent experience at Middlebury. As one who scoured the data prior to their college visits, I can say Middlebury has been top 5 for 5 years, and this year’s “fall” was hardly predictable. In fact, Middlebury, in the circles I travel, as Past Eph notes, has never been more popular. Most, if not all, other metrics point upwards, save that outlier (faculty resources category). And the fall was hardly a tumble (tied for 4/5 last year).

As a data hound, I have delved deeply into the US News methodology and numbers. If you want to question the top 10 rankings for LACs or provide meaningful critiques, look at Wellesley’s jump: questionable not because of the jump itself (Middlebury was #11 a decade ago, in 2005), but because of the reported data. While the disaggregated numbers are not available to parse, the large categories seem questionable, with big changes from just a year ago, changes that seem hard to accomplish in one year.

Even more questionable, perhaps, are Bowdoin’s reported SATs. Very misleading; have a look. The footnote shows that barely half of the students’ scores are recorded, and one can bet that, with Bowdoin being test-optional, only the higher-scoring half of its students report scores. So how does any ranking system use those data? US News should exclude that category unless a larger percentage of scores is reported. Or it could try to normalize the data to the full entering class, which would be just as fair (or questionable) as including only half the scores.

There is no way Bowdoin’s first year class SATs are similar to or higher than Amherst’s, Swarthmore’s, Williams’, or Harvard’s for that matter. Yet that is what is reported.

There is little to differentiate among these top 10 LACs. Each has its strengths, its weaknesses, its stereotypes, etc. But slight movements up and down, subject to the vagaries of statistics that might or might not be commonly reported (the apples-to-apples problem), or even accurately reported (per Green Mt. Eph’s post), are hardly definitive or even meaningful. But debating and discussing can be fun.

#6 Comment By David Dudley Field ’25 On September 14, 2014 @ 8:56 pm

David, since in other contexts you seem to enjoy deriding those who make speculative, non-evidence-based claims, what is your evidence that “it is very important that we maintain this #1 ranking, mainly for admissions, and especially for international students”?

Everyone I have spoken to who is involved in international admissions — both admissions officers and applicants — has told me this. I don’t have any hard evidence. Of course, back in the day of need-blind international admissions, the fact that Williams, alone among a small handful of schools, was need-blind for international students was a huge pull.

#7 Comment By David Dudley Field ’25 On September 14, 2014 @ 8:59 pm

I stand corrected on the Middlebury and Amherst claims. Or, at least, I have not studied these issues as closely as I should. And, those predictions, made in 2009, did not anticipate a tripling of the S&P 500.

But doesn’t it have to be the case that Amherst’s endowment-per-student numbers are under stress because they have increased the size of each class? Again, maybe that didn’t happen, or hasn’t happened yet. But if it has, the endowment per student must, by definition, have come down.