FAQ about Top High Schools List

1.  How does the Challenge Index work?
We take the total number of Advanced Placement, International Baccalaureate or Cambridge tests given at a school in May, and divide by the number of seniors graduating in May or June. All public schools that NEWSWEEK researchers Dan Brillman and Gina Pace found achieved a ratio of at least 1.000, meaning they had as many tests in 2007 as they had graduates, are put on the list on the NEWSWEEK Web site, and the 100 schools with the highest ratios are named in NEWSWEEK magazine.

NEWSWEEK published national lists based on this formula in 1998, 2000, 2003, 2005, 2006 and 2007. In the Washington Post, I have reported the Challenge Index ratings for every public school in the Washington area every year since 1998. I think 1.000 is a modest standard. A school can reach that level if only half of its students take one AP, IB or Cambridge test in their junior year and one in their senior year. But this year only about five percent of the approximately 27,000 U.S. public high schools managed to reach that standard.
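
For readers who want to check the arithmetic themselves, here is a minimal sketch of the calculation in Python. The school numbers are hypothetical, chosen only to illustrate the 1.000 threshold:

```python
# Challenge Index: total AP, IB and Cambridge tests given at a school
# in May, divided by the number of seniors graduating in May or June.
def challenge_index(total_tests: int, graduating_seniors: int) -> float:
    return total_tests / graduating_seniors

# Hypothetical school: 450 tests given, 400 graduates.
ratio = challenge_index(450, 400)
print(f"Challenge Index: {ratio:.3f}")    # Challenge Index: 1.125
print("Makes the list:", ratio >= 1.000)  # Makes the list: True
```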

2. Why does the number of schools on the list get larger after the magazine comes out?
We invite all qualifying schools we may have missed to email us their data. There is no national database that has the number of AP, IB and Cambridge tests and number of graduates for each public high school, so we have had to build our own. We are happy to capture the few schools we missed by using the publicity generated by publication of a new list.

3. Why do you count the number of tests taken rather than schools' passing rates on those tests?
In the past, schools have usually reported their passing rates on AP or IB as a sign of how well their programs were doing. When I say passing rate, I mean the percentage of students who scored 3, 4 or 5 on the 5-point AP test or 4, 5, 6 or 7 on the 7-point IB test. (The Cambridge tests, although similar to AP and IB, are used in very few schools, and rarely appear in school assessments.) Passing AP or IB scores are the rough equivalent of a C or C-plus in a college course and make the student eligible for credit at most colleges.

I decided not to count passing rates because I found that most American high schools kept those rates artificially high by allowing only top students to take the courses. In other instances, they opened the courses to all but encouraged only the best students to take the tests.

AP, IB and Cambridge are important because they give average students a chance to experience the trauma of heavy college reading lists and long, analytical college examinations. Studies by U.S. Department of Education senior researcher Clifford Adelman in 1999 and 2005 showed that the best predictors of college graduation were not good high-school grades or test scores, but whether or not a student had an intense academic experience in high school. Such experiences were produced by taking higher-level math and English courses and struggling with the demands of college-level courses like AP or IB. Two other studies looked at more than 150,000 students in California and Texas and found that those with passing scores on AP exams were more likely to do well academically in college.

To send a student off to college without having had an AP, IB or Cambridge course and test is like insisting that a child learn to ride a bike without ever taking off the training wheels until the day you send the kid out onto the city streets alone. It is dumb, and in my view a form of educational malpractice. But most American high schools still do it. I don't think such schools should be rewarded because they have artificially high AP or IB passing rates achieved by making certain that only their best students take the tests.

NEWSWEEK and The Washington Post have added a new statistic developed by the College Board that indicates how well students are doing on the exams at each school while still recognizing the importance of increasing student participation. It is the Equity and Excellence rate, the percentage of ALL graduating seniors, including those who never got near an AP course, who had at least one score of 3 or above on at least one AP test sometime in high school. That is the "E&E" on our list. "Subs. Lunch" on the list stands for the percentage of students who qualify for federally subsidized lunches, the best measure of the percentage of low-income students at each school.

The average Equity and Excellence rate in 2007 for all schools, including those that lacked AP programs, was 15.2 percent. On the 2008 NEWSWEEK list, we give the Equity and Excellence percentage for those schools that have the necessary data. We ask IB schools to calculate their IB, or combined AP-IB, Equity and Excellence rate, using a 4 on the 7-point IB test as the equivalent of a 3 on the AP.
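
Here is a minimal sketch of that calculation in Python; the class size and count of passing students are hypothetical:

```python
# E&E rate: share of ALL graduating seniors, including those who never
# took an AP course, with at least one score of 3 or better on an AP
# test (or 4 or better on an IB test) sometime in high school.
def equity_excellence_rate(seniors_with_passing_score: int,
                           graduating_seniors: int) -> float:
    return 100.0 * seniors_with_passing_score / graduating_seniors

# Hypothetical class: 500 graduates, 90 with at least one passing score.
print(f"E&E: {equity_excellence_rate(90, 500):.1f}%")  # E&E: 18.0%
```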

4. Why do you divide by the number of graduating seniors, and does that mean you only count tests taken by seniors? Don't you know that juniors, and sometimes even sophomores and freshmen take AP tests?
We divide by the number of May and June graduates as a convenient way to measure the relative size of each school. That way a big school like Coral Reef High in Miami, which gave 3,562 AP or IB tests and graduated 646 seniors in 2007 for a rating of 5.514 this year, will not have an advantage over North Hills Preparatory in Irving, Tex., which gave only 395 AP or IB tests but also graduated only 71 seniors for a rating of 5.563. On the 2008 NEWSWEEK list they are right next to each other at numbers 18 and 19.
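
For anyone who wants to verify those two ratings, the arithmetic is this simple (the figures are the 2007 numbers cited above):

```python
# Tests given and graduating seniors, from the 2007 figures above.
schools = {
    "Coral Reef High (Miami)":       (3562, 646),
    "North Hills Prep (Irving, TX)": (395, 71),
}
for name, (tests, grads) in schools.items():
    print(f"{name}: {tests / grads:.3f}")
# Coral Reef High (Miami): 5.514
# North Hills Prep (Irving, TX): 5.563
```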

We count all tests taken at the school, not just those taken by seniors.

5. How can you call these the best or top schools if you are only using one narrow measure? High school is more than just AP or IB tests.
Indeed it is, and if I could quantify all those other things in a meaningful way, I would. But teacher quality, extracurricular activities and other important factors are too subjective for a ranked list. Participation in challenging courses and tests, on the other hand, can be counted, and the results expose a significant failing in most high schools--so far only five percent of the public high schools in the United States qualify for the NEWSWEEK list. I think that this is the most useful quantitative measure of a high school, and one of its strengths is the narrowness of the criteria. Everyone can understand the simple arithmetic that produces a school's Challenge Index rating and discuss it intelligently, as opposed to ranked lists like U.S. News & World Report's "America's Best Colleges," which have too many factors for me to comprehend.

As for the words "top" and "best," they are always based on criteria chosen by the list maker. My list of best film directors may depend on Academy Award nominations. Yours may be based on ticket sales. I have been very clear about what I am measuring in these schools. You may not like my criteria, but I have not found anyone who understands how high schools work and does not think AP, IB or Cambridge test participation is important. I often ask people what quantitative measure of high schools they think is more important than this one. Such discussions can be interesting and productive.

I have been having such a debate with Andy Rotherham, co-director of the Education Sector think tank. He argues that some of the schools on the NEWSWEEK list have low average test scores and high dropout rates, and do not belong on any best high-schools list. My response is that these are all schools with lots of low-income students and great teachers who have found ways to get them involved in college-level courses. So far, we have no proven way for educators in low-income schools to significantly improve their average test scores or graduation rates. Until we do, I don't see any point in making them play a game that, no matter how energetic or smart they are, they can't win.

6. Why don't I see famous public high schools like Stuyvesant in New York City or Thomas Jefferson in Fairfax County, Va., on the NEWSWEEK list?
We do not include magnet or charter high schools that draw high concentrations of top students whose average SAT or ACT scores significantly exceed the highest average for any normal enrollment school in the country. This year that meant such schools had to have an average SAT score below 1300 on the reading and math sections, or an average ACT score below 29, to be included on the list.

The schools you name are terrific places with some of the highest average test scores in the country, but it would be deceptive for us to put them on this list. The Challenge Index is designed to honor schools that have done the best job in persuading average students to take college-level courses and tests. It does not work with schools that have no, or almost no, average students. The idea is to create a list that measures how good schools are at challenging all students, and not just how high their students' test scores are. The high-performing schools we have excluded from the list all have great teachers, but research indicates that high SAT and ACT averages are much more an indication of the affluence of the students' parents than of the quality of the teaching.

Using average SAT or ACT scores is a change from the previous system we used, which excluded schools that admitted more than half of their students based on grades and test scores. That system penalized some inner-city magnet schools that had high Challenge Index ratings but whose average SAT or ACT scores were below those of many normal enrollment suburban schools, so we switched to a system that we consider fairer and clearer.

On our Public Elites list, however, we do acknowledge schools that did not make the big list because their average SAT or ACT scores were too high.
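
Here is a minimal sketch of that cutoff in Python; how a school with only one of the two averages is judged is my assumption, not part of the published rule:

```python
from typing import Optional

# A magnet or charter school stays on the main list only if its average
# SAT (reading + math) is below 1300 or its average ACT is below 29;
# schools above both cutoffs go to the separate Public Elites list.
def main_list_eligible(avg_sat: Optional[float],
                       avg_act: Optional[float]) -> bool:
    if avg_sat is not None and avg_sat < 1300:
        return True
    if avg_act is not None and avg_act < 29:
        return True
    return False

print(main_list_eligible(1180, None))  # True: below the SAT cutoff
print(main_list_eligible(1410, 31))    # False: Public Elites territory
```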

7. Aren't all the schools on the list doing very well with AP or IB? So why rank them and make some feel bad that they are on the lower end of the scale?
This is an important point. These are all exceptional schools. Every one is in the top five percent of American high schools measured this way. They have all shown remarkable AP and IB strength. I am mildly ashamed of my reason for ranking, but I do it anyway. I want people to pay attention to this issue, because I think it is vitally important for the improvement of American high schools. Like most journalists, I learned long ago that we humans are tribal primates with a deep commitment to pecking orders. We cannot resist looking at ranked lists. It doesn't matter what it is--SUVs, ice cream stores, football teams, fertilizer dispensers. We want to see who is on top and who is not. So I rank to get attention, with the hope that people will argue about the list, and in the process, think about the other issues it raises.

8. Is it not true that school districts that pay the AP or IB exam fees for their students skew the results of your Challenge Index? Shouldn't an asterisk be attached to schools in districts that do that?
If I thought that districts that pay for the test, and require that students take it, were somehow giving themselves an unfair advantage to make their programs look stronger, I would add that asterisk or discount them in some way. But I think the opposite is true. Districts that spend money to increase the likelihood that their students take AP or IB tests are adding value to the education of their students. Taking the test is good. It gives students a necessary taste of what college demands. It's bad that many students in AP courses avoid taking the tests just because they prefer to spend May of their senior year sunning themselves on the beach or buying their prom garb. (Since AP and IB tests must be graded by human beings, the results arrive long after June report cards, so they usually do not count as part of the class grade, and most schools allow students to skip the AP test if they wish. IB is organized differently, and few IB students miss those exams.)

If paying test fees and requiring the tests persuades students to take them, that's just as good as if a school spent money to hire more AP teachers, or made it difficult for students to drop out of AP without a good reason. I was happy when the state of Arkansas and most districts in Northern Virginia began to pay the test fees and require that the tests be taken. I hope many other districts follow suit.

9. Why don't you count the college exams that high-school students take at local colleges?
I would like to, but NEWSWEEK has tried to count what are often called dual enrollment exams (those given to high-school students who have taken local college courses) and it proved to be too difficult. The problem is that we want to make sure that the dual enrollment final exams are comparable to the AP, IB and Cambridge exams that define the index. We tried to set a standard--we would only count dual enrollment final exams that were at least two hours long and had some free response questions that required thought and analysis, just as the AP, IB and Cambridge exams do. And we wanted to be sure that the exams were written and scored by people who were not employed by the high school so that, like AP, IB and Cambridge exams, they could not be dumbed down to make the school or the teacher look good. Some high schools provided us with the necessary information, but most could not. It was too difficult for them to persuade the colleges managing the exams to help them, or they did not have the staff to gather the data we required. We did not want to be counting extra exams only from those schools that could afford extra staff, so we decided to stay with AP, IB and Cambridge, while we thought about better ways to count dual enrollment.

10. Why do some states have so many schools on your list while others have so few?
The more schools I've examined, the more I've come to believe in the power of high-school cultures, which differ around the country for reasons that often have little to do with the usual keys to high-school performance (i.e., the income and education of the parents).

It's no surprise that California, New York, Texas and Florida lead the nation in number of schools on the list. But it is more difficult to explain why the much less populous Virginia and Maryland come right after those megastates in the number of challenging high schools, and why Iowa, with some of the highest test scores in the country, has only a handful of high schools that met the criteria.

My tentative explanation is that some areas have had the good fortune to get school boards and superintendents who see that they serve their students better by opening up AP, IB and Cambridge to those who want to work hard. Once a few districts in a state do that, others follow. And once a state has success, its neighboring states begin to wonder why they aren't doing the same.

11. Why limit your list to public high schools? Don't you think those of us who pay tens of thousands of dollars to educate our children at private schools are also interested in how our schools measure up?
My children attended both public and private high schools, so I share your interest in rating both varieties. The public schools are very quick to give NEWSWEEK and The Washington Post the data we need. They are, after all, tax-supported institutions. The private schools, sadly, have resisted this and most other attempts to quantify what they are doing. The National Association of Independent Schools has even warned its members against cooperating with reporters like me who might be trying to help what they call consumer-conscious parents like you. They say that parents should reject such numerical comparisons and instead visit each private school to soak up its ambience. I am all for visits, but I think what those private schools are essentially saying is that parents like you and me are too stupid to read a list in a magazine or newspaper and reach our own sensible conclusions about its worth.

A few private schools have shared their data with me, but since the majority are resisting, any list of private schools would be too incomplete to be useful.  

12. Should I worry if my child's high school has dropped in rank since the last NEWSWEEK list?
No. Keep in mind, as I said before, that every school on the list is in the top five percent of all American high schools measured in this way. If you want to gauge your school's progress, look at its rating, not its ranking. Many schools drop in rank each year because there is so much more competition on the list, but at the same time improve their ratio of tests to graduating seniors. That means they are getting better, and the rank is even less significant. Also, almost all schools on the list drop in rank in the updated Web site version of the list a few weeks after it first appears in NEWSWEEK, because we add schools that get their data to us after the deadline.

I realize it is my fault that people put too much emphasis on the ranks. If I didn't rank, this would not happen. I was startled that people even remembered what their school's rank was in previous years. The important thing is that your school is on the list, not where on the list it is.

13. Don't students in some schools that have both IB and AP tests practice a form of double-dipping? I hear that many of the IB students take both the IB and the AP tests in the same subject. Doesn't that skew your index?
It would, but we look for it and subtract from each school's total number of tests any AP tests taken by IB students who did not take a separate AP course in that subject.
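
A minimal sketch of that adjustment, using a hypothetical record format (the field names are illustrative, not from our actual data):

```python
# Hypothetical per-test records for one school.
tests = [
    {"exam": "AP", "subject": "Biology", "ib_student": True,  "separate_ap_course": False},
    {"exam": "AP", "subject": "History", "ib_student": False, "separate_ap_course": True},
    {"exam": "IB", "subject": "Biology", "ib_student": True,  "separate_ap_course": False},
]

# Drop AP tests taken by IB students who had no separate AP course in
# that subject; everything else counts toward the index.
counted = [t for t in tests
           if not (t["exam"] == "AP"
                   and t["ib_student"]
                   and not t["separate_ap_course"])]
print(len(counted), "of", len(tests), "tests count")  # 2 of 3 tests count
```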

14. Why are you making such a big deal out of AP? I hear more and more selective colleges are saying they don't like the program and are raising the score for which they will grant course credit, and some high schools are dropping AP altogether. I've heard some people say the courses are either watered down so the schools can stuff more students in and look good on your index, or that they limit a teacher's ability to be creative.
There is a bit, but only a small bit, of truth in what you've heard. Many selective colleges are making it harder to get credit for taking AP, IB and Cambridge courses and tests in high schools, but their reasons for doing so are unclear. Former philosophy professor William Casement, who's analyzed this trend, says he thinks AP courses and tests are not as good as the introductory college courses and tests they were designed to substitute for, and that is why those colleges are pulling back. There is unfortunately almost no evidence to back up his theory. In fact, the colleges have done almost no research on the quality of their introductory courses, while the College Board has expert panels that regularly compare AP courses with college intro courses to make sure they are on the same level.

Some high-school educators think the colleges don't like to give AP credit because it costs them revenue. There is no evidence to support that theory either, but it is clear that selective college admissions offices, as opposed to their credit-granting departments, are very happy to see AP or IB courses on applicants' transcripts. 

As for high schools rejecting AP, there are about 50 that have done that. They are almost all private, expensive, and represent less than two tenths of one percent of the nation's high schools. Thousands of high schools, by contrast, are opening more AP or IB courses, which they say are the only national programs that provide a high and incorruptible standard for student learning.

Because AP and IB exams are written and scored by outside experts, it's impossible to water down an AP or IB course without exposing what you have done--unless of course you make sure very few of the students take the tests. That is why we count tests, not courses, for the index. And as for teacher creativity, AP and IB encourage it more than any other high-school program I know. The tests reward creative thinking and original analysis. Creative teachers who produce creative students find their AP and IB test scores are very high.

15. Even AP teachers don't like the NEWSWEEK list. Some whose schools made the list are its biggest critics. What do you think of that?
They are smart and hard-working educators who are entitled to their opinions. But so are those AP teachers who tell me the list helps them gain support for their students. Here is what Brian Rodriguez, who teaches AP American history and AP European history at Encinal High School in Alameda, Calif., told me about the impact of AP on non-AP courses in a school with many low-income and minority students:

"AP teachers rarely teach only AP classes. They have many other responsibilities to their department, collaborative educational focus groups, and as liaison to our middle schools. The AP techniques honed in years of teaching or gleaned from seminars are used in the regular classrooms (at a slower pace, but no less effective). For instance, I am teaching a unit on Vietnam to my regular U.S. history class. I will use the PowerPoint lecture I developed for my AP class on that subject, teach the students to take notes, use the Socratic method discussion techniques so effective in AP classes, and then teach writing methods and tips I use so effectively in my AP classes. In addition, I will teach these techniques to our new teachers at history department meetings, prepare a pamphlet on multiple choice testing techniques that was distributed to all the teachers at our school to prepare them for state standardized testing, and then visit our local middle schools to make a presentation to the teachers there.  In summary, AP teaching can be school wide, and raises all the ships in the harbor.

Methodology:
NEWSWEEK published its first list of top U.S. high schools 10 years ago. It was based on a school-assessment method I invented to dramatize my distress at the way the vast majority of high schools were barring students from challenging courses. One C student I knew was so angry at being denied a chance to take Advanced Placement U.S. history that she studied on her own and passed the AP test, but her school still would not change its rules. My list would compare such schools unfavorably with their few enlightened neighbors. I knew many principals and superintendents were not going to like it.

Here are the first words of my introduction to what I call the Challenge Index, published in my 1998 book, "Class Struggle: What's Wrong (and Right) about America's Best Public High Schools," from which NEWSWEEK took that first list of 243 ranked schools:

"Nearly every professional educator will tell you that ranking schools is counterproductive, unscientific, hurtful and wrong. Every likely criteria you might use in such an evaluation is going to be narrow and distorted. A school that stumbles one year may be fine the next. I accept all those arguments. Yet as a reporter and as a parent, I think that in some circumstances a ranking system, no matter how limited, can be useful."

In the 10 years since, I have received tens of thousands of e-mails from educators, parents, students and taxpayers about the NEWSWEEK list. They have mixed views on my attempt to force a discussion of high-school rigor by listing in NEWSWEEK those few schools that have encouraged more students to take Advanced Placement, International Baccalaureate and Cambridge courses and tests. This year the most far-sighted schools, nearly 1,400 of them, are ranked on Newsweek.com. That is only 5 percent of all U.S. high schools, but a big improvement from the 243 schools on the first list.

The method we use to calculate each school's index rating is simple, and can be applied by readers to their neighborhood schools. We count the total number of college-level exams taken at a school by ALL students each May, and divide by the number of graduating seniors. Any school with a ratio of 1.000 or higher, meaning it gave at least as many tests as it had graduates, is placed on the NEWSWEEK list.

Although the measure is simple, many educators tell me it captures, in a way other school statistics do not, a different attitude about students in the schools that make the list. Those schools turn out to have principals and teachers who are trying hardest to raise the achievement of each child, with college as a useful goal for all until students are old enough to decide what they want to do. "AP is the best thing to measure in a high school," said Tom Di Figlio, a veteran AP psychology teacher at Spanish River High School in Boca Raton, Fla., "because it is a real achievement test that marks proficiency in college-level courses."

We rank these great schools so people will read the list and the accompanying stories. From long experience as journalists, we know that if we did not rank them, few people would pay attention. Some readers endorse my view that NEWSWEEK's recognition has given support to the many AP, IB and Cambridge teachers who want to welcome into their college-level courses all students willing to do the work. We understand, however, that many readers find this way of looking at schools odd, and off-putting. Most Americans consider schools with the highest average test scores to be the best, even though their students' success is heavily influenced by the financial status of their parents. The Challenge Index was designed in part to undermine the view that schools with lots of rich kids are good, and schools with lots of poor kids are bad. I love pointing to the many schools on the NEWSWEEK list that are full of low-income students, and often rank higher than much more affluent rivals.

This year, a group of 38 school superintendents from five states wrote to say they did not want their schools included. "We all believe that all schools, communities--and your readers--are poorly served by NEWSWEEK's persistent efforts to use a single statistic, the number of students who sit for AP or IB exams, to rank schools," their letter said. "In reality, it is impossible to know which high schools are 'the best' in the nation. Determining whether different schools do or don't offer a high quality of education requires a look at many different measures, including students' overall academic accomplishments and their subsequent performance in college, and taking into consideration the unique needs of their communities."

I called John Chambers, superintendent of the Byram Hills district in New York and a leader of the letter-writing group. He agreed that the data we sought was public information, and that he and other superintendents would provide it if we insisted. I told him we believe that we serve not superintendents, but readers, and they wanted to see the list. Chambers said OK, but could we let readers know about the attempted boycott? I said I thought that was a great idea.

Here are the districts whose superintendents endorsed the letter:
New York: Ardsley, Bedford, Blind Brook-Rye, Brewster, Bronxville, Byram Hills, Chappaqua, Dobbs Ferry, Greenburgh/North Castle, Hewlett-Woodmere, Katonah-Lewisboro, Mamaroneck, Mt. Pleasant-Cottage School, North Shore, Ossining, Rye Neck, Scarsdale, Spackenkill, Tuckahoe, Valhalla.
New Jersey: Montclair, Montgomery, Tenafly, Verona.
Connecticut: Darien, Simsbury, Stonington, Wilton.
Illinois: Decatur #61, Deerfield/Highland Park #113, Evanston, Glenbrook #225, Lincoln-Way #210, New Trier #203, Oak Park and River Forest.
Massachusetts: Amherst-Pelham, Masconomet, Wayland.

I understand where they are coming from. I love talking to them about the issue. The majority of schools still refuse to let all students who want to work hard into their AP courses. But the numbers who have opened those courses are growing, as the NEWSWEEK list shows, and each year more and more principals and superintendents write me to say they are glad they decided to change their policy.
