Frequently Asked Questions About Newsweek’s High School Rankings 2016


Who is Newsweek’s research partner?

Newsweek is working with Westat to produce the 2016 Newsweek High School Rankings. Westat is a 100 percent employee-owned research firm headquartered in Rockville, MD. One of the nation’s leading research firms, Westat has directly supported and informed the U.S. Department of Education’s research programs through its analyses and data collection projects. Westat will survey schools that meet our threshold criteria on behalf of Newsweek to obtain in-depth data on college readiness indicators.

How are schools ranked?

Here are the two steps we devised to rank schools:

Threshold Analysis: First, we will create a high school achievement index based on performance indicators (i.e., proficiency rates on state standardized assessments) using data reported by states to EDFacts at the U.S. Department of Education.

Ranking Analysis: We will then survey schools that meet our threshold criteria to obtain in-depth data on the following college readiness indicators:

  • Number of counselor FTE
  • College acceptance and enrollment
  • SAT and ACT participation and performance
  • AP, IB, and AICE participation and performance
  • Dual enrollments

*The equity analysis was dropped for the 2016 rankings due to limitations in the EDFacts data.
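The threshold step above can be sketched roughly as follows. The index definition (a simple average of the two proficiency rates) and the cutoff value are illustrative assumptions for this sketch, not Newsweek/Westat’s actual formula:

```python
def achievement_index(reading_proficiency, math_proficiency):
    """Average the two state-test proficiency rates (0-100 scale).

    Illustrative only: the real achievement index may combine
    indicators differently.
    """
    return (reading_proficiency + math_proficiency) / 2

def meets_threshold(schools, cutoff):
    """Return the short list of schools whose index clears the cutoff."""
    return [s for s in schools
            if achievement_index(s["reading"], s["math"]) >= cutoff]

# Hypothetical schools and cutoff, for demonstration only.
schools = [
    {"name": "A", "reading": 90, "math": 85},  # index 87.5
    {"name": "B", "reading": 60, "math": 55},  # index 57.5
]
short_list = meets_threshold(schools, cutoff=70)  # keeps school "A" only
```

Schools that clear the cutoff move on to the survey-based ranking analysis.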

What data do schools need to provide for Newsweek’s survey?

Schools that meet our threshold criteria will be invited to submit the following data for the 2014-2015 school year:

  • School demographics and counselors
  • College acceptance and enrollment
  • SAT and ACT participation and performance
  • AP, IB, and AICE participation and performance
  • Dual enrollments

Where did you get your data?

Here’s the breakdown:

  • Short-list analysis data (reading/language arts and math state test proficiency, as well as the free or reduced-price lunch, or FRPL, percentage) came from the Common Core of Data (CCD), the federal database of the National Center for Education Statistics (NCES). We obtained records for 15,819 schools and applied our threshold cutoff to make a short list of 4,760 schools for the absolute list and 4,452 for the relative list.
  • College Readiness analysis data was obtained by surveying approximately 6,500 schools that made the above cutoff; we received responses from about 25 percent of them. A portion of the data used in the College Readiness analysis (namely, magnet/charter status, student retention, and FRPL percentage) came from the CCD. We also asked schools to report their FRPL percentage in our survey, so we could flag schools where the self-reported figure differed from the federal database by 15 percentage points or more. We ran further checks on the collected data to account for any mistakes, rechecking and/or removing any figures that were improbably high.
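The FRPL discrepancy check described above is a simple rule: flag any school whose self-reported FRPL percentage differs from the federal (CCD) figure by 15 percentage points or more. A minimal sketch (the function name is hypothetical):

```python
def flag_frpl_discrepancy(federal_frpl_pct, reported_frpl_pct, threshold=15.0):
    """Flag a school when the federal (CCD) FRPL percentage and the
    survey's self-reported FRPL percentage differ by 15 points or more.
    """
    return abs(federal_frpl_pct - reported_frpl_pct) >= threshold

# Example: the CCD shows 58% FRPL but the school reports 40%,
# an 18-point gap, so the school is flagged.
flag_frpl_discrepancy(58.0, 40.0)  # → True
```

Flagged schools are denoted in the rankings rather than removed outright.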

What year’s data is used for the rankings?

We used 2012-2013 data for our short-list analysis, as that is the most recent reliable data available in the public federal NCES database.* For our survey, however, we gathered data for the 2014-2015 school year to give as recent a snapshot as possible of school performance on college readiness. We did not survey for 2015-2016 data, as many schools did not yet have all of the data for the school year then in progress.

*Please refer to the technical brief for information regarding changes in state assessment in SY 2013-14.

How is the 2016 methodology different from last year’s?

The 2016 methodology is largely unchanged from the 2014 and 2015 rankings methodology, which can be found here.

How did you assign weights to the factors used to calculate a school’s College Readiness Score?

We chose what we feel is a common-sense approach to the weighting. We assigned the greatest weights to college enrollment (25 percent) and graduation rate (20 percent) because we felt that, of the six factors in our analysis, these two were the most significant indicators of a school’s success in preparing its students for college. To confirm this approach, we ran a sensitivity analysis in 2014 comparing three weighting schemes: equal weights (16.67 percent) for all six factors, weights determined by a statistical method called principal component analysis (PCA), and weights based on our own judgment of relative importance. The different weighting approaches did not significantly change the rankings, so we chose to stick with judgmental weights to reflect our view of the relative importance of the factors. To learn more about our weighting sensitivity analysis, read Westat’s full technical brief from 2015.
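The weighted score described above is a straightforward weighted sum. In the sketch below, only the college-enrollment (25 percent) and graduation-rate (20 percent) weights come from this FAQ; the remaining four weights are placeholders that split the leftover 55 percent evenly, and the factor names are assumptions based on the survey indicators listed earlier:

```python
# Weights for the six College Readiness factors. Only the first two
# values are stated in the FAQ; the rest are illustrative placeholders.
WEIGHTS = {
    "college_enrollment": 0.25,   # stated in the FAQ
    "graduation_rate":    0.20,   # stated in the FAQ
    "sat_act":            0.1375, # placeholder
    "ap_ib_aice":         0.1375, # placeholder
    "dual_enrollment":    0.1375, # placeholder
    "counselor_ratio":    0.1375, # placeholder
}

def college_readiness_score(factors):
    """Combine six normalized factor scores (each 0-1) into one
    weighted College Readiness Score between 0 and 1."""
    return sum(WEIGHTS[name] * value for name, value in factors.items())
```

Because the weights sum to 1, a school scoring perfectly on every factor receives a score of 1.0; the equal-weights variant tested in the sensitivity analysis would simply set every weight to 1/6 (16.67 percent).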

Why does Newsweek only rank public schools?

Newsweek is dedicated to helping as many students and parents as possible make informed decisions about education. Along these lines, we focus on public schools because they are generally accessible to all students. Public and private schools are different types of institutions with different sets of requirements, and ranking them would require different methodologies. That’s why we chose to focus on doing one job, and doing it well, in 2016.

Why did my school not qualify?

In addition to not meeting the threshold analysis criteria (see the methodology for further discussion), there are a variety of reasons a particular school may not have qualified. The initial sample of schools is generated from the Common Core of Data and includes only operational regular public high schools with at least one student and data on the number of students eligible for free or reduced-price lunch. If a school did not have 2012-13 assessment data, it was excluded from the threshold analysis.

FAQ compiled by Newsweek and Westat.