Frequently Asked Questions About Newsweek's High School Rankings


Why are there two sets of rankings?

The question "What are America's top schools?" has different answers depending on whether student poverty is taken into account. We realized that rather than trying to make an all-encompassing list, we could reward more great schools by examining performance from more than one vantage point. For this reason, Newsweek has published "America's Top High Schools," a ranking of schools based solely on achievement, and "Beating the Odds," a ranking of schools that factors in student poverty. To measure student poverty levels, we used the percentage of students eligible for free or reduced-price lunch at each school.


How were the schools ranked?

Here are the three steps we devised to rank schools:

Threshold Analysis: First, we created a high school achievement index based on performance indicators (i.e., proficiency rates on state standardized assessments). For the absolute list, the index was used to identify high schools that perform at or above the 80th percentile within each state. For the relative list, the index was used to identify high schools that perform 0.5 standard deviations or more above their state's average when accounting for students' socioeconomic status.
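The two threshold rules can be sketched in code roughly as follows. This is our own illustrative sketch, not Westat's implementation: the field names, the nearest-rank percentile convention, and the use of the population standard deviation are all assumptions.

```python
import math
import statistics

def absolute_threshold(schools, key="achievement_index"):
    """Keep schools at or above the 80th percentile of their state's
    achievement index (nearest-rank percentile; an assumed convention)."""
    scores = sorted(s[key] for s in schools)
    cutoff = scores[math.ceil(0.8 * len(scores)) - 1]
    return [s for s in schools if s[key] >= cutoff]

def relative_threshold(schools, key="adjusted_index"):
    """Keep schools whose poverty-adjusted index is at least 0.5
    standard deviations above the state average."""
    values = [s[key] for s in schools]
    mean, sd = statistics.mean(values), statistics.pstdev(values)
    return [s for s in schools if s[key] >= mean + 0.5 * sd]
```

Each rule is applied within a state, so a school is compared only against other schools facing the same state assessments.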

Ranking Analysis: For the high schools on both lists identified in the threshold analysis, we created a college readiness score based on the following six indicators:

- Enrollment Rate: 25 percent

- Graduation Rate: 20 percent

- Weighted AP/IB composite: 17.5 percent

- Weighted SAT/ACT composite: 17.5 percent

- Holding Power (change in student enrollment between 9th and 12th grades; this measure is intended to control for student attrition): 10 percent

- Counselor-to-Student Ratio: 10 percent

For the absolute list, we rank-ordered the schools by their college readiness scores. For the relative list, we ranked schools by how well they performed relative to the average relationship between schools' college readiness scores and the percentage of students eligible for free or reduced-price lunch.
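The two ranking rules above can be sketched as follows. This is a hypothetical illustration: the indicator field names are our own, the indicators are assumed to be pre-standardized onto a common scale, and the "average relationship" is modeled as a simple least-squares line of readiness on FRPL percentage, which the source does not specify.

```python
# Weights for the six indicators listed above (assumed field names).
WEIGHTS = {
    "enrollment_rate": 0.25,
    "graduation_rate": 0.20,
    "ap_ib_composite": 0.175,
    "sat_act_composite": 0.175,
    "holding_power": 0.10,
    "counselor_ratio": 0.10,
}

def readiness_score(school):
    """Weighted sum of the six college readiness indicators."""
    return sum(w * school[k] for k, w in WEIGHTS.items())

def relative_rank_scores(schools):
    """Score each school by its residual from the least-squares line
    predicting readiness from FRPL percentage: a positive residual
    means the school outperforms the average school at its poverty
    level. (The linear fit is our own modeling assumption.)"""
    n = len(schools)
    xs = [s["frpl_pct"] for s in schools]
    ys = [readiness_score(s) for s in schools]
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return [y - (my + slope * (x - mx)) for x, y in zip(xs, ys)]
```

Sorting schools by `readiness_score` descending gives the absolute ordering; sorting by `relative_rank_scores` descending gives the relative one.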

Equity (Gold Star) Analysis: Of the top high schools identified in the ranking analysis, we then identified schools in which economically disadvantaged students performed better than the state average for all students in both reading and mathematics. This step did not affect the rankings; instead, we included it to recognize schools with equitable academic performance for economically disadvantaged students. These schools are marked with a gold star on our full ranking lists*.

* A large number of schools did not have sufficient data available on NCES to be included in our equity analysis.
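The gold-star check reduces to two comparisons per school, along these lines (the field names are our own illustration; the subgroup and statewide proficiency rates come from NCES data):

```python
def gold_star(school, state_avg):
    """True if the school's economically disadvantaged students beat the
    state average for ALL students in both reading and math.
    `school` holds subgroup proficiency rates; `state_avg` holds the
    statewide all-student rates. Field names are hypothetical."""
    return (school["disadv_reading"] > state_avg["reading"]
            and school["disadv_math"] > state_avg["math"])
```

A school missing either subgroup rate on NCES simply could not be evaluated, which is why some ranked schools carry no star either way.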

To read the full technical brief on our methodology, click here.

Who did the research?

Newsweek partnered with Westat to produce the 2014 Newsweek High School Rankings. Westat is a 100 percent employee-owned research firm headquartered in Rockville, Maryland. One of the nation's leading research firms, Westat has directly supported and informed the U.S. Department of Education's research programs through its analyses and data-collection projects.

Who was on Newsweek's Advisory Panel?

Ms. Elisa Villanueva-Beard, Co-CEO, Teach for America
Dr. Russell Rumberger, Director, California Dropout Research Project, and Professor, UC Santa Barbara
Mr. Nick Morgan, Executive Director, The Strategic Data Project at the Harvard Center for Education Policy Research
Dr. David Stern, Emeritus Professor of Education, UC Berkeley
Dr. Brad Carl, Associate Director and Researcher, The Value-Added Research Center at the Wisconsin Center for Education Research

How is the 2014 methodology different from last year's?

The 2014 methodology builds on Newsweek's 2013 ranking methodology in several ways. First, the 2013 methodology relied solely on self-reported data provided by schools to create a college readiness score. This year, we used more reliable public data collected by the National Center for Education Statistics (NCES) to supplement the self-reported college readiness information provided by schools. Student achievement data obtained from NCES were used to select a sample of schools from which to collect college readiness data via survey. Additionally, the 2014 methodology includes a variable to control for average student poverty levels, which are known to be associated with student achievement and college readiness. We controlled for student poverty to address critiques of past rankings: school performance as measured by average student test scores is as much or more a function of student background characteristics as of factors within a school's control, so most rankings erroneously attribute variation in student performance to schools alone. To illustrate the difference between a list that controls for student poverty and one that does not, we constructed two rankings, an absolute ranking and a relative ranking. Finally, we added two factors to our college readiness analysis: Holding Power (the change in student enrollment between 9th and 12th grades, intended to account for student attrition that may affect other variables such as graduation rate) and the counselor-to-student ratio.

Where did you get your data?

Here's the breakdown:

- Threshold analysis data (reading/language arts and math state test proficiency, as well as free or reduced-price lunch [FRPL] percentage) came from the publicly available federal NCES database. We obtained records for 14,454 schools and applied our threshold cutoff to make a shortlist of 4,400 schools for the absolute list and 4,100 for the relative list.

- College readiness analysis data were obtained by surveying the more than 5,600 schools that made the above cutoff. We received responses from 37 percent of all surveyed schools. A portion of the data used in the college readiness analysis (namely, holding power and FRPL percentage) came from NCES. We ran further checks on the collected data to account for any mistakes, rechecking and/or removing any figures that were improbably high.

- Equity analysis data were the same data we used for the threshold analysis.

What year's data is used for the rankings?

All data used in our analysis are for the 2011-2012 school year, the most recent year available from NCES. We wanted to use this publicly available, government-vetted data for our threshold analysis to make our methodology more stringent and reliable, rather than relying entirely on self-reported school data. When we then surveyed schools for college readiness data, we asked for 2011-2012 figures to maintain consistency throughout our methodology.

How did you assign weights to the factors used to calculate a school's college readiness score?

We assigned the greatest weights to college enrollment (25%) and graduation rate (20%) because we felt that of the six factors in our analysis, these two were the most significant indicators of a school's success in preparing its students for college. We chose what we feel is a common-sense approach to the weighting and did a separate analysis to ensure that this weighting scheme did not significantly alter the rankings. In this sensitivity analysis, we tested an approach that assigns equal weights (16.67%) to all six factors; we also conducted a statistical approach called principal component analysis (PCA), which combines data elements into composite scores. Neither alternative significantly changed the rankings, so we kept the judgmental weights, which reflect our view of the relative importance of the factors. To learn more about our weighting sensitivity analysis, read Westat's full technical brief.
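One simple way to run the equal-weights part of such a sensitivity check is to compute the ranking under each weighting and compare the two orderings, for example with a Spearman rank correlation. The sketch below is our own illustration with assumed field names, not Westat's procedure (their brief also covers the PCA variant, which is omitted here):

```python
# Judgmental weights from the methodology, and an equal-weights alternative.
JUDGMENTAL = {"enrollment_rate": 0.25, "graduation_rate": 0.20,
              "ap_ib": 0.175, "sat_act": 0.175,
              "holding_power": 0.10, "counselor_ratio": 0.10}
EQUAL = {k: 1 / 6 for k in JUDGMENTAL}

def ranking(schools, weights):
    """School indices ordered best-first by weighted composite score."""
    score = lambda s: sum(w * s[k] for k, w in weights.items())
    return sorted(range(len(schools)), key=lambda i: -score(schools[i]))

def spearman(rank_a, rank_b):
    """Spearman correlation between two orderings of the same schools
    (1.0 means the orderings are identical)."""
    n = len(rank_a)
    pos_a = {school: p for p, school in enumerate(rank_a)}
    pos_b = {school: p for p, school in enumerate(rank_b)}
    d2 = sum((pos_a[s] - pos_b[s]) ** 2 for s in pos_a)
    return 1 - 6 * d2 / (n * (n ** 2 - 1))
```

A correlation close to 1 between the two rankings is what "did not significantly change the rankings" means in practice.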

Why does Newsweek only rank public schools?

Newsweek is dedicated to helping as many students and parents as possible make informed decisions about education. Along these lines, we focused on public schools because they are generally accessible to all students. Public and private schools are different types of institutions with different sets of requirements, and ranking them would require different methodologies. That's why we chose to focus on doing one job, and doing it well, in 2014.

Should I worry if my/my child's school dropped in rank from previous years?

No. The 2014 methodology is significantly different from that of 2013 and previous years. Our rankings offer a snapshot of school performance in just one year, using a limited, albeit broad, set of criteria that differ from those used in the past.

What is the purpose of the rankings and how should I use them?

We urge parents, students and education professionals to consult the Newsweek High School Rankings as a guide to identifying schools where students from different backgrounds thrive and get the best preparation for college.

These two lists are different and answer different types of questions. If you are interested in knowing which schools had the highest-achieving students on our college readiness index, regardless of whether that performance is attributable to the socioeconomic status of the students in the school, then you might refer to the absolute list. If, on the other hand, you are interested in knowing which schools had the highest-achieving students on our college readiness index when controlling for socioeconomic status, then you might consult the relative list. Regardless of your aim, it is our hope that our approach can serve as a helpful reference point in the national conversation about education and equality.

FAQ compiled by Newsweek and Westat