Frequently Asked Questions About Newsweek’s High School Rankings 2015


Why are there two sets of rankings?

The question “What are America’s top schools?” has different answers, depending on whether student poverty is taken into account. We realized that rather than trying to make an all-encompassing list, we could reward more great schools by examining performance from more than one vantage point. For this reason, Newsweek annually publishes America’s Top High Schools, a ranking of schools based solely on achievement (our “absolute” list), and Beating the Odds, a ranking of schools that factors in student poverty (our “relative” list). The relative list measures how schools are performing compared with how they are statistically expected to perform, given their number of students in poverty. To measure student poverty levels, we used the percentage of students eligible for free or reduced-price lunch (FRPL) at each school.
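The idea behind the relative list can be illustrated with a small sketch: fit a line predicting achievement from the FRPL percentage, then score each school by how far it lands above or below that prediction. This is an illustrative least-squares example, not Newsweek and Westat's actual statistical model; the variable names and data are invented.

```python
# Illustrative "relative" scoring: a school's score is the residual from a
# least-squares line of achievement on FRPL percentage. A positive residual
# means the school performs above its statistical expectation given poverty.

def relative_scores(achievement, frpl_pct):
    """Return residuals from a least-squares fit of achievement on FRPL %."""
    n = len(achievement)
    mean_x = sum(frpl_pct) / n
    mean_y = sum(achievement) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(frpl_pct, achievement))
    var = sum((x - mean_x) ** 2 for x in frpl_pct)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return [y - (intercept + slope * x) for x, y in zip(frpl_pct, achievement)]

# Invented data: four schools' composite achievement and FRPL percentages.
achievement = [90, 78, 60, 55]
frpl_pct = [10, 40, 70, 80]
scores = relative_scores(achievement, frpl_pct)
# The second school outperforms its poverty-adjusted expectation the most.
```

A school with modest absolute scores but a large positive residual could rank highly on the relative list while being absent from the absolute one, which is why the two lists answer different questions.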

How were the schools ranked?

Please read the 2015 Newsweek High School Rankings Methodology.

Who did the research?

Newsweek partnered with Westat to produce the 2015 Newsweek High School Rankings. Westat is a 100 percent employee-owned research firm headquartered in Rockville, Maryland. One of the nation’s leading research firms, Westat has directly supported and informed the U.S. Department of Education’s research programs through its analyses and data collection projects.

Was there an advisory panel for the 2015 methodology?

No, because the 2015 methodology is so similar to the 2014 methodology. To see the list of advisers who helped Newsweek and Westat create the current methodology last year, see the 2014 Rankings FAQ.

How is the 2015 methodology different from last year’s?

The 2015 methodology is largely similar to the 2014 rankings methodology, with several improvements:

  1. In addition to AP and IB (International Baccalaureate) course participation and performance, participation in dual enrollment courses has been added as a criterion for the rankings. Dual enrollment courses are college-level courses that some high schools offer in collaboration with colleges or universities, which give students the opportunity to earn high school and college credit simultaneously. We added this measure to recognize schools that offer this important college-readiness opportunity to their students in addition to, or instead of, AP and IB courses. (Note: We also surveyed schools for Advanced International Certificate of Education participation—AICE is a college-level course program similar to IB—in both 2014 and 2015, but ultimately did not include AICE course data in our analysis because very few schools reported this data.)
  2. We denoted which schools are charter or magnet schools; this data was not available for all schools in the federally collected Common Core of Data (CCD).
  3. We included more schools in our short-list analysis, by considering both regular and vocational schools and by lowering the threshold for schools to be considered for our absolute list.

Where did you get your data?

Here’s the breakdown:

  • Short-list analysis data (reading/language arts and math state test proficiency, as well as the FRPL percentage) came from the federal database of the National Center for Education Statistics, or NCES (www.data.gov). We obtained records for over 16,200 schools and applied our threshold cutoff to make a short list of 4,823 schools for the absolute list and 4,544 for the relative list.
  • College Readiness analysis data was obtained by surveying the more than 6,500 schools that made the above cutoff. We received responses from about 38 percent of all surveyed schools. A portion of the data used in the College Readiness analysis (namely, magnet/charter status, student retention and FRPL percentage) came from the CCD. We also asked schools to report their FRPL percentage in our survey, so we could flag schools where there was a discrepancy of 15 percentage points or more between the FRPL percentage in the federal database and the self-reported figure. We ran further checks on the collected data to catch mistakes, rechecking and/or removing any figures that were improbably high.
  • Equity analysis data was the same data we used for the short-list analysis.
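The FRPL discrepancy check described above can be sketched as a simple comparison of the two reported figures against the 15-percentage-point threshold. The field names and school data here are illustrative assumptions; only the threshold comes from the FAQ.

```python
# Minimal sketch of the FRPL discrepancy check: flag any school whose
# self-reported FRPL percentage differs from the federal (CCD) figure by
# 15 percentage points or more.

DISCREPANCY_THRESHOLD = 15.0  # percentage points, per the methodology

def flag_frpl_discrepancies(schools):
    """Return names of schools whose two FRPL figures disagree by >= 15 points."""
    return [
        s["name"]
        for s in schools
        if abs(s["frpl_federal"] - s["frpl_survey"]) >= DISCREPANCY_THRESHOLD
    ]

schools = [
    {"name": "Lincoln HS", "frpl_federal": 42.0, "frpl_survey": 45.0},
    {"name": "Roosevelt HS", "frpl_federal": 30.0, "frpl_survey": 52.0},
]
flagged = flag_frpl_discrepancies(schools)  # → ["Roosevelt HS"]
```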

What year’s data is used for the rankings?

We used 2012-2013 data for our short-list analysis, as that is the most recent data available in the public federal NCES database. For our survey, however, we gathered data for the 2013-2014 school year to give as recent a snapshot as possible of school performance on college readiness. We did not survey for 2014-2015 data, as many schools did not yet have all of the data for the school year then in progress.

How did you assign weights to the factors used to calculate a school’s College Readiness Score?

We chose what we feel is a common-sense approach to the weighting: We assigned the greatest weights to college enrollment (25 percent) and graduation rate (20 percent) because we felt that of the six factors in our analysis, these two were the most significant indicators of a school’s success in preparing its students for college. In support of this approach, we ran a special analysis in 2014 to ensure that this weighting scheme did not significantly alter the rankings. In this sensitivity analysis, we compared our judgmental weights, based on our own view of each factor's relative importance, against two alternatives: an approach that assigns equal weights (16.67 percent) to all six factors, and an approach with weights determined by a statistical method called principal component analysis (PCA). None of these weighting approaches significantly changed the rankings, so we chose to stick with judgmental weights to reflect our view of the relative importance of the factors. To learn more about our weighting sensitivity analysis, read Westat’s full technical brief from 2014.
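The weighting described above amounts to a weighted sum of the six factor scores. In this sketch, only the first two weights (college enrollment, 25 percent; graduation rate, 20 percent) come from the FAQ; the remaining four factor names and weights are illustrative placeholders that simply fill out the other 55 percent.

```python
# Sketch of a weighted College Readiness Score. Only the first two weights
# are stated in the FAQ; the other four are illustrative placeholders.

WEIGHTS = {
    "college_enrollment": 0.25,  # stated in the FAQ
    "graduation_rate":    0.20,  # stated in the FAQ
    "factor_3":           0.15,  # illustrative
    "factor_4":           0.15,  # illustrative
    "factor_5":           0.15,  # illustrative
    "factor_6":           0.10,  # illustrative
}
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must sum to 100%

def readiness_score(factors):
    """Weighted sum of six factor scores, each on a 0-100 scale."""
    return sum(WEIGHTS[name] * factors[name] for name in WEIGHTS)

def equal_weight_score(factors):
    """Equal-weights variant (16.67% each) used in the sensitivity analysis."""
    return sum(factors[name] for name in WEIGHTS) / len(WEIGHTS)

example = {
    "college_enrollment": 90, "graduation_rate": 95,
    "factor_3": 80, "factor_4": 70, "factor_5": 85, "factor_6": 60,
}
# readiness_score(example) → 82.75; equal_weight_score(example) → 80.0
```

The sensitivity analysis essentially checked that rankings produced by `readiness_score` and `equal_weight_score` (and a PCA-weighted variant) did not differ materially.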

Should I worry if my/my child’s school dropped in rank from previous years?

No. There are several reasons why a school’s rank may change significantly from year to year that are not tied to a drastic change in performance. One reason is that a school’s performance on state tests (which we use for our short-list analysis) can vary from year to year, especially when there are major changes to the state tests. In addition, only schools that participate in our survey can be included in our final rankings analysis, so the set of schools we are able to rank, and their standing relative to one another, differs from year to year. Our rankings offer a snapshot of school performance in just one year, using a limited, albeit broad, set of criteria.

Why does Newsweek rank only public schools and not private schools?

Newsweek is dedicated to helping as many students and parents as possible make informed decisions about education. For this reason, we focused on public schools because they are generally accessible to all students. Public and private schools operate under different sets of requirements and would require different ranking methodologies. In addition, while all public schools are required to report state test proficiency to the federal NCES database, there is no comparable standard set of publicly available performance data for private schools.

What public schools were not considered for the rankings?

We did not include schools in our initial short-list analysis that are classified by the NCES as “Other/Alternative” schools. The NCES defines such schools as “a public elementary/secondary school that (1) addresses needs of students that typically cannot be met in a regular school, (2) provides nontraditional education, (3) serves as an adjunct to a regular school, or (4) falls outside the categories of regular, special education or vocational education” (source: NCES Glossary, see “School Type” definitions). Newsweek’s decision not to include these schools does not reflect any judgment whatsoever on such schools’ merit (many schools classified as “Alternative” by the NCES have excellent performance) but rather the need to compare a set of schools that have the fundamental similarities shared by schools classified as “Regular.” In other words, we felt that comparing “Regular” and “Alternative” schools in one analysis would be like comparing apples to oranges. We also did not include schools classified as “Special Education Schools” by the NCES.

What is the purpose of the rankings and how should I use them?

We urge parents, students and education professionals to consult the Newsweek High School Rankings as a guide to identifying schools where students from different backgrounds thrive and get an excellent head start on college preparedness.

These two lists are different and answer different types of questions. If you are interested in knowing which schools had the highest-achieving students on our college readiness index, regardless of whether the performance is attributable to the socioeconomic status of the students in the school, you might refer to the absolute list. If, on the other hand, you are interested in knowing which schools had the highest-achieving students on our college readiness index when controlling for socioeconomic status, you might consult the relative list. Regardless of your aim, we hope our approach can serve as a helpful reference point in the national conversation about education and equality.

Read More: America’s Top High Schools 2015 | Methodology

FAQ compiled by Newsweek and Westat.