- EphBlog - http://ephblog.com -

We’re #1 (for the 14th straight time)

Williams is #1 in the US News ranking for the 14th year in a row.

The 2017 Best Colleges rankings are out from U.S. News & World Report — and there are some familiar schools in the top slots. For the sixth straight year, Princeton University was named No. 1 in the “best national universities” category by the magazine, which surveys more than 1,800 colleges in America for its annual list. Meanwhile, Williams College in Massachusetts took the top spot among best national liberal arts colleges for the fourteenth consecutive year.

Always good news! Do readers want a multi-day examination of this topic? Most of my views are unchanged from previous years.

1) Every time that Williams appears in a headline like this with Princeton, the value of the Williams brand improves. Kudos to Adam Falk and the rest of the administration! There are few things more important (rightly or wrongly) to the College’s reputation, especially with international applicants and their families, than maintaining this ranking. Staying #1 may not be hard, given Williams’ resources, but screwing this up could have been easy.

2) Many schools do a lot of suspect/sleazy things to improve their rank. Does Williams? Morty, infamously, capped discussion class size at 19 to ensure that the maximum number of classes met this US News cutoff.

3) There is a great senior thesis to be written about the rankings, similar to this article on the US News law school rankings. If you write such a thesis, hundreds of people around the country will read it.

4) Below the break is a copy of the methodology, saved for the benefit of future historians.

Data Sources

Most of the data come from the colleges. This year, 93 percent of the 1,374 ranked colleges and universities surveyed returned their statistical information during the spring and summer 2016 data collection window.

A ranked college is defined as a college in the National Universities, National Liberal Arts Colleges, Regional Universities and Regional Colleges categories that is numerically ranked or listed as Rank Not Published.

In total, U.S. News has collected data on more than 1,800 colleges. While all the data appear on usnews.com, only 1,374 schools are included in the rankings described in this methodology and given a numerical rank or Rank Not Published designation.

We obtained missing data from a number of sources, including the National Collegiate Athletic Association (graduation rates), the Council for Aid to Education (alumni giving rates) and the U.S. Department of Education’s National Center for Education Statistics (information on financial resources, faculty, SAT and ACT admissions test scores, acceptance rates and graduation and retention rates).

Estimates, which U.S. News does not display, may be used in the ranking calculation when schools fail to report particular data points that are not available from other sources. Missing data are reported as N/A in the ranking tables on usnews.com.

For colleges that were eligible to be ranked but refused to fill out the U.S. News statistical survey in spring and summer 2016, we have made extensive use of the statistical data those institutions were required to report to the National Center for Education Statistics, including such factors as SAT and ACT scores, acceptance rates, number of faculty, student-faculty ratios, and graduation and retention rates. These schools are footnoted as nonresponders.

Ranking Model Indicators

The indicators used to capture academic quality fall into a number of categories: graduation and first-year student retention rates, assessment by administrators at peer institutions, faculty resources, student selectivity, financial resources, alumni giving, graduation rate performance and, for National Universities and National Liberal Arts Colleges only, high school counselor ratings of colleges.

The indicators include input measures that reflect a school’s student body, its faculty and its financial resources, along with outcome measures that signal how well the institution educates students.

The measures, their weights in the ranking formula and an explanation of each follow.

Graduation and retention rates (22.5 percent): The higher the proportion of first-year students who return to campus for sophomore year and eventually graduate, the better a school is apt to be at offering the classes and services that students need to succeed.

This measure has two components: six-year graduation rate (80 percent of the score) and first-year retention rate (20 percent). The graduation rate indicates the average proportion of an entering class earning a degree in six years or less; we consider first-year student classes that started from fall 2006 through fall 2009. First-year retention indicates the average proportion of first-year students who entered the school in the fall 2011 through fall 2014 and returned the following fall.
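The composite described above is just a weighted average of the two rates. A minimal sketch of that 80/20 blend — the weights come from the text, but the function name and the sample rates are hypothetical:

```python
def grad_retention_score(six_year_grad_rate, first_year_retention_rate):
    """Blend the two components using the stated 80/20 weights."""
    return 0.80 * six_year_grad_rate + 0.20 * first_year_retention_rate

# Hypothetical rates: 95% six-year graduation, 98% first-year retention.
score = grad_retention_score(0.95, 0.98)
```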

Undergraduate academic reputation (22.5 percent): The U.S. News ranking formula gives weight to the opinions of those in a position to judge a school’s undergraduate academic excellence. The academic peer assessment survey allows top academics – presidents, provosts and deans of admissions – to account for intangibles at peer institutions, such as faculty dedication to teaching.

To get another set of important opinions on National Universities and National Liberal Arts Colleges, U.S. News also surveyed 2,200 counselors at public high schools, each of which was a gold, silver or bronze medal winner in a recent edition of the U.S. News Best High Schools rankings. The counselors surveyed represent every state and the District of Columbia.

Each academic and counselor surveyed was asked to rate schools’ academic programs on a scale from 1 (marginal) to 5 (distinguished). Those who didn’t know enough about a school to evaluate it fairly were asked to mark “don’t know.”

The score used in the rankings is the average score of those who rated the school on the 5-point scale; “don’t knows” are not counted as part of the average. To reduce the impact of strategic voting by respondents, U.S. News eliminated the two highest and two lowest scores each school received before calculating the average score.
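The trimming rule described above — ignore the “don’t knows,” drop the two highest and two lowest ratings, then average the rest — is a standard trimmed mean. A sketch, with a hypothetical function name and made-up ratings:

```python
def peer_assessment_score(ratings):
    """Average 1-5 ratings after dropping the two highest and two
    lowest, per the U.S. News anti-strategic-voting rule.

    `ratings` should already exclude "don't know" responses.
    """
    trimmed = sorted(ratings)[2:-2]
    return sum(trimmed) / len(trimmed)

# Hypothetical ratings a school might receive from surveyed academics.
score = peer_assessment_score([5, 5, 5, 4, 4, 4, 3, 2, 1])
```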

The academic peer assessment score in this year’s rankings is based on the results from surveys in spring 2015 and spring 2016.

Both the Regional Universities and Regional Colleges rankings rely on one assessment score, by the academic peer group, for this measure in the rankings formula. In the case of National Universities and National Liberal Arts Colleges, the academic peer assessment accounts for 15 percentage points of the weighting in the ranking methodology, and 7.5 percentage points go to the high school counselors’ ratings.

The results from the three most recent years of counselor surveys, from spring 2014, spring 2015 and spring 2016, were averaged to compute the high school counselor reputation score. This was done to increase the number of ratings each college received from the high school counselors and to reduce the year-to-year volatility in the average counselor score.

Ipsos Public Affairs collected the data in spring 2016. Of the 4,635 academics who were sent questionnaires, 39 percent responded. This response rate is down very slightly from the 40 percent response rate in spring 2015 and the 42 percent response rate to the surveys conducted in spring 2014 and spring 2013.

The counselors’ one-year response rate was 9 percent for the spring 2016 surveys, up slightly from 7 percent in spring 2015.

Faculty resources (20 percent): Research shows that the more satisfied students are with their contact with professors, the more they will learn and the more likely they are to graduate. U.S. News uses five factors from the 2015-2016 academic year to assess a school’s commitment to instruction.

Class size is 40 percent of this measure. Schools receive the most credit in this index for their proportion of undergraduate classes with fewer than 20 students. Classes with 20-29 students score second highest; those with 30-39 students, third highest; and those with 40-49 students, fourth highest. Classes that have 50 or more students receive no credit.
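The class-size tiers above amount to a step function that assigns credit by bracket. A sketch of that tiering — the bracket boundaries follow the text, but U.S. News does not publish the credit assigned to each tier, so the numeric values here are invented for illustration:

```python
def class_size_credit(n_students):
    """Map a class size to a credit tier. Bracket cutoffs follow the
    published methodology; the credit values are illustrative only."""
    if n_students < 20:
        return 1.00   # most credit: under 20 students
    elif n_students < 30:
        return 0.75
    elif n_students < 40:
        return 0.50
    elif n_students < 50:
        return 0.25
    return 0.0        # 50 or more students: no credit

# Morty's cap of 19 keeps a discussion class in the top bracket.
assert class_size_credit(19) > class_size_credit(20)
```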

Faculty salary (35 percent) is the average faculty pay, plus benefits, during the 2014-2015 and 2015-2016 academic years, adjusted for regional differences in the cost of living using indexes from the consulting firm Runzheimer International. U.S. News also weighs the proportion of professors with the highest degree in their fields (15 percent), the student-faculty ratio (5 percent) and the proportion of faculty who are full time (5 percent).

Student selectivity (12.5 percent): A school’s academic atmosphere is determined in part by students’ abilities and ambitions.

This measure has three components. U.S. News factors in the admissions test scores for all enrollees who took the critical reading and math portions of the SAT and the composite ACT score (65 percent of the selectivity score).

U.S. News also considers the proportion of enrolled first-year students at National Universities and National Liberal Arts Colleges who graduated in the top 10 percent of their high school classes or the proportion of enrolled first-year students at Regional Universities and Regional Colleges who graduated in the top quarter of their classes (25 percent).

The third component is the acceptance rate, or the ratio of students admitted to applicants (10 percent).

The data are all for the fall 2015 entering class. While the ranking calculation takes account of both the SAT and ACT scores of all entering students, the ranking tables on usnews.com display the score range for whichever test most students took.

U.S. News use footnotes online to indicate schools that did not report to U.S. News the fall 2015 SAT and ACT scores for all first-time, first-year, degree-seeking students for whom the schools had data. Schools sometimes fail to report SAT and ACT scores for students in these specific categories: athletes, international students, minority students, legacies, those admitted by special arrangement and those who started in summer 2015.

U.S. News also uses footnotes to indicate schools that declined to tell U.S. News whether all students with SAT and ACT test scores were represented.

For schools that did not report all scores or that declined to say whether all scores were reported, U.S. News reduced the value of their SAT and ACT scores in the Best Colleges ranking model by 15 percent. This practice is not new; since the 1997 rankings, U.S. News has discounted the value of such schools’ reported scores in the ranking model, because the effect of leaving students out could be that lower scores are omitted.

If a school told U.S. News that it included all students with scores in its reported SAT and ACT scores, then those scores were counted fully in the rankings and were not footnoted.

If less than 75 percent of the fall 2015 entering class submitted SAT and ACT scores, their test scores were discounted by 15 percent in the ranking calculations. U.S. News also used this policy in the 2016 edition of the rankings.

Financial resources (10 percent): Generous per-student spending indicates that a college can offer a wide variety of programs and services. U.S. News measures financial resources by using the average spending per student on instruction, research, student services and related educational expenditures in the 2014 and 2015 fiscal years. Spending on sports, dorms and hospitals doesn’t count.

Graduation rate performance (7.5 percent): This indicator of added value shows the effect of the college’s programs and policies on the graduation rate of students after controlling for spending and student characteristics, such as test scores and the proportion receiving Pell Grants. U.S. News measures the difference between a school’s six-year graduation rate for the class that entered in 2009 and the rate U.S. News had predicted for the class.

If the school’s actual graduation rate for the 2009 entering class is higher than the rate U.S. News predicted for that same class, then the college is enhancing achievement, or overperforming. If a school’s actual graduation rate is lower than the U.S. News prediction, then it is underperforming.
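Over- and underperformance, as defined above, is simply the signed difference between the actual and predicted graduation rates. A sketch with a hypothetical function name and made-up rates:

```python
def grad_rate_performance(actual_rate, predicted_rate):
    """Positive result = overperforming the U.S. News prediction;
    negative = underperforming."""
    return actual_rate - predicted_rate

# Hypothetical: 95% actual vs. 91% predicted -> 4 points of added value.
delta = grad_rate_performance(0.95, 0.91)
```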

Alumni giving rate (5 percent): This reflects the average percentage of living alumni with bachelor’s degrees who gave to their school during 2013-2014 and 2014-2015, which is an indirect measure of student satisfaction.

To arrive at a school’s rank, U.S. News first calculated the weighted sum of its standardized scores. The final scores were rescaled so that the top school in each category received a value of 100, and the other schools’ weighted scores were calculated as a proportion of that top score. Final scores were rounded to the nearest whole number and ranked in descending order. Schools that are tied appear in alphabetical order and are marked as tied on all ranking tables.
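The final step described above — take each school’s weighted sum of standardized scores, rescale so the top school gets 100, and round to the nearest whole number — can be sketched as follows. The school names and weighted sums are hypothetical; the real model computes the sums from the weighted indicators listed earlier:

```python
def rescale_scores(weighted_sums):
    """Express each school's weighted sum as a proportion of the top
    score (top school = 100), rounded to the nearest whole number."""
    top = max(weighted_sums.values())
    return {school: round(100 * s / top)
            for school, s in weighted_sums.items()}

# Hypothetical weighted sums for three schools.
final = rescale_scores({"Williams": 8.2, "Amherst": 8.0, "Swarthmore": 7.9})
```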


Comments Disabled To "We’re #1 (for the 14th straight time)"

#1 Comment By ephalum On September 14, 2016 @ 2:26 pm

David, can you provide one single piece of evidence that this ranking is in any way important to the college’s reputation, let alone critically important? Maybe it was in the 1980s when these rankings first came out. But I don’t think you can.

Williams was number 2 for years (3 some years) and now it’s been number 1 for many years, and I’ve seen no relative difference in terms of its standing compared to Amherst, Swarthmore, and Pomona with applicants or employers. Those four schools enroll the strongest student bodies, and have the strongest outcomes for their alumni, year after year after year after year. But Williams does not do better than the other three in terms of the student body it enrolls — they are basically dead even year after year with minor, immaterial, and inconsistent variations.

Sometimes one of them drops as low as sixth, but there is no evidence that relatively small changes in rankings make even an iota of difference to applicants when it comes to schools at the top of the pyramid like this. If it did, we’d see Williams with more applicants with better credentials relative to its closest peers when it is ranked first vs. when it is ranked second or third. But we don’t. The applicant pools to these schools, as well as yield outcomes, as well as job and graduate school placements, seem to have zero correlation with small changes in U.S. News Rankings. I mean, it’s fun for bragging rights and all, but beyond that? There is no evidence, none, that ranking first vs. ranking second or third or fourth in U.S. News is meaningful in any way, so far as I’ve ever seen. If it were, Amherst wouldn’t consistently receive something like 1000-1500 more applications than Williams does (and basically tie Williams year after year in terms of which school yields applicants admitted to both, so far as I’m aware) despite bearing the ignominy of being ranked second for over a decade … are there more than 2-3 students per year who choose Williams over Amherst solely by virtue of U.S. News ranking? Frankly, I’d be surprised if it is even that many.

#2 Comment By eph’12 On September 15, 2016 @ 4:34 pm


Yes, Williams does sleazy things to game these numbers.

As recently as several years ago, when I was there, they would “force” students on financial aid who were having mental health problems to stay on campus, even if it was not in the student’s best interests, or else risk having to pay back the aid received.

This was also done with students who took breaks for other reasons…all so that they could keep retention rates and six-year graduation rates up. These are just the things I’m aware of; there’s almost certainly more.

I’m sure all of this is vetted by the school’s lawyers…

You’ve always been interested in the management of the endowment (rightly so) and the compensation of those in charge of it.

I’d be just as interested to know just how much money the college spends each year on its legal fees and what those fees were specifically for.

This is probably no different than any other school, but I don’t care about other schools–only Williams. (At least as far as learning about them could improve Williams.)

#3 Comment By John C. Drew, Ph.D. On September 15, 2016 @ 7:40 pm

– eph’12

I have always been interested in how Williams College treats students with profound mental health problems. I have assumed that the student’s mental illness (their distance from reality) was hidden or masked by their adherence to liberal ideas and a certain post-modern willingness to doubt the benefits of sanity.

I’m open to new perspectives, however. I would like to learn more about how the school may be forcing students with severe mental health problems to stay on campus or to return to campus too early simply to uphold the school’s extremely attractive #1 ranking.

This challenge would be particularly severe in dealing with students who have a dual diagnosis — that is, mental illness combined with substance abuse issues. Some of the most mentally ill Williams students I have studied have made spectacular headlines. I have no doubt that Williams’ rural setting, insular community, and hair-trigger conformism would make it an unhealthy environment for a truly ill student.

#4 Comment By Williams Alum On September 16, 2016 @ 8:35 am

Would be interested in comments re: 1) above.


#5 Comment By David Dudley Field ’25 On September 16, 2016 @ 9:51 am

can you provide one single piece of evidence that this ranking is in any way important to the college’s reputation, let alone critically important?

The behavior of colleges themselves is the best evidence. Background readings here.

But Freeland, the man who had helped successfully launch UMass Boston over the previous two decades, had a plan. Freeland believed that if Northeastern could justify its increased costs to students and parents, it could be saved. And one gauge consistently determined a college’s value: its position on the U.S. News & World Report “Best Colleges” rankings. Freeland observed how schools ranked highly received increased visibility and prestige, stronger applicants, more alumni giving, and, most important, greater revenue potential. A low rank left a university scrambling for money. This single list, Freeland determined, had the power to make or break a school.

This is just one of dozens of examples that I could provide of college presidents doing whatever it takes to increase their rankings. Are they idiots? I doubt it! Another example:

Five colleges have recently been caught cooking their admissions statistics in order to secure higher spots in the U.S. News college rankings …
Tulane University, Bucknell University, Claremont McKenna College, Emory University, and George Washington University have all been implicated in the past year alone.

If the US News rankings don’t matter, then why would Tulane et al lie?

#6 Comment By David Dudley Field ’25 On September 16, 2016 @ 9:53 am

If it did, we’d see Williams with more applicants with better credentials relative to its closest peers when it is ranked first vs. when it is ranked second or third. But we don’t.

The last time Williams was ranked lower than #1 was in 2002. Did you carefully study applicant credentials during that era and then compare it to now? I doubt it! (If you did, please reveal your data sources and methodology.)

#7 Comment By Williams Alum On September 16, 2016 @ 10:57 am

No doubt, a 30-spot move matters. Probably, even a ten-spot move matters. But does moving one spot? Two? Who knows. The onus of proof seems to be on you. You are making a claim, that being #1 instead of #2 is one of the top few most important things to Williams as an institution. I would love to know your metrics, data sources, and methodology.

Don’t necessarily think you are wrong, just a bold claim to go unsupported.


#8 Comment By David Dudley Field ’25 On September 16, 2016 @ 12:08 pm

You are making a claim, that being #1 instead of #2 is one of the top few most important things to Williams as an institution.

Not how I would phrase it. Better: Being #1 instead of #2 is one of the most important things that the Williams Administration can plausibly influence.

The most important thing for Williams is a $2 billion endowment. But you should not praise/blame Falk much for that. He can’t control it much. The same goes for the overall quality of the student body. It was great before he showed up. It will be great after he leaves. Both those things, and many others, are much more important than the US News rank, although, obviously, both enter into the US News rank.

But a President can affect things on the margin, as Morty did with his edict on class sizes. To the extent that Falk is doing things like this that help to ensure that we are #1, he is doing a good job and should be praised.

Side note: From talking to many students, I am convinced that #1 versus #2 matters in certain situations: mainly international students and students considering passing on H/Y/P/S because they want a small college. I don’t know if this is a large number of students in absolute terms.

#9 Comment By Williams Alum On September 16, 2016 @ 1:32 pm

Makes sense.

There are few things more important (rightly or wrongly) to the College’s reputation, especially with international applicants and their families, than maintaining this ranking.

I’m not convinced you have evidence for this that isn’t anecdotal. If you do, I would love to see it. If you want to update the statement to say there are few things more important and controllable by PAF, I would believe that more easily.

I agree, PAF should be praised for whatever he does to keep us #1 regardless. I sure like being #1.