The Challenge Index: Frequently Asked Questions

We take the total number of Advanced Placement, International Baccalaureate or Cambridge tests given at a school in May, and divide by the number of seniors graduating in May or June. All public schools that NEWSWEEK researchers Dan Brillman, Halley Bondy and Becca Kaufman found had achieved a ratio of at least 1.000, meaning they gave at least as many tests in 2006 as they had graduates, are put on the list on the NEWSWEEK Web site, and the 100 schools with the highest ratios are named in the magazine.
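In symbols, the calculation is simply this ratio:

\[
\text{Challenge Index} = \frac{\text{AP + IB + Cambridge tests given in May}}{\text{seniors graduating in May or June}}
\]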

NEWSWEEK published national lists based on the same formula in 1998, 2000, 2003, 2005 and 2006. In the Washington Post, I have reported the Challenge Index ratings for every public school in the Washington area every year since 1998. I think 1.000 is a modest standard. A school can reach that level if only half of its students take one AP, IB or Cambridge test in their junior year and one in their senior year. But this year only about 5 percent of all U.S. public high schools managed to reach that standard and be placed on the NEWSWEEK list.
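To see why the 1.000 bar is as modest as the half-the-students example above suggests, run the numbers on a hypothetical school with 200 students in each grade (the 200 figure is illustrative, not from any actual school). If half the juniors and half the seniors each sit one test in May, the school gives

\[
\frac{100 + 100}{200} = 1.000
\]

tests per graduate, exactly enough to make the list.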

At the top of the Web site list we invite all qualifying schools we may have missed to e-mail us their data so that we can add them. There is no national database that has the number of AP, IB and Cambridge tests and number of June graduates for each public high school, so we have had to build our own. We are happy to use the publicity generated by publication of a new list to capture the few schools we missed.

In the past, schools have usually reported their passing rates on AP or IB as a sign of how well their programs were doing. When I say passing rate, I mean the percentage of students who scored 3, 4 or 5 on the 5-point AP test or 4, 5, 6 or 7 on the 7-point IB test. (The Cambridge tests, although similar to AP and IB, are used in very few schools, and rarely appear in school assessments.) Passing AP or IB scores are the rough equivalent of a C or C-plus in a college course and make the student eligible for credit at most colleges.
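Stated as a formula, a school's passing rate for AP, for example, is:

\[
\text{Passing rate} = 100 \times \frac{\text{AP tests scored 3, 4 or 5}}{\text{AP tests taken}}
\]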

I decided not to count passing rates in the way schools had done in the past because I found that most American high schools kept those rates artificially high by allowing only top students to take the courses. In some other instances, they opened the courses to all but encouraged only the best students to take the tests.

AP and IB are important because they give average students a chance to experience the trauma of heavy college reading lists and long, analytical college examinations. Studies by U.S. Department of Education senior researcher Clifford Adelman in 1999 and 2005 showed that the best predictors of college graduation were not good high-school grades or test scores, but whether or not a student had an intense academic experience in high school. Such experiences were produced by taking higher-level math and English courses and struggling with the demands of college-level courses like AP or IB. Two recent studies looked at more than 150,000 students in California and Texas and found that students with passing scores on AP exams were more likely to do well academically in college.

To send a student off to college without having had an AP or IB course and test is like insisting that a child learn to ride a bike without ever taking off the training wheels. It is dumb, and in my view a form of educational malpractice. But most American high schools still do it, and I don't think such schools should be rewarded because they have artificially high AP or IB passing rates achieved by making certain that only their best students take the tests.

NEWSWEEK and The Washington Post, however, have added a new statistic developed by the College Board that indicates how well students are doing on the exams at each school while still recognizing the importance of increasing student participation. It is the equity and excellence rate, the percentage of all graduating seniors, including those who never got near an AP course, who had at least one score of 3 or above on at least one AP test sometime in high school. The average equity and excellence rate in 2006 for all schools, including those that lacked AP programs, was 14.8 percent. In the 2007 NEWSWEEK list, we give the equity and excellence percentage for those schools that have the necessary data. We ask IB schools to calculate their IB, or combined AP-IB, equity and excellence rate, using a 4 on the 7-point IB test as the equivalent of a 3 on the AP.
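As a formula:

\[
\text{Equity and excellence rate} = 100 \times \frac{\text{graduating seniors with at least one AP score of 3 or above}}{\text{all graduating seniors}}
\]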

We divide by June graduates as a convenient measure of the relative size of each school. That way a big school like Coral Reef High in Miami, which gave 3,273 AP or IB tests and graduated 634 seniors in 2006 for a rating of 5.162 this year, will not have an advantage over Eastern Sierra Academy in rural Bridgeport, Calif., which gave only 26 AP tests but also graduated only five seniors for a rating of 5.200. On the 2007 NEWSWEEK list they are right next to each other at numbers 19 and 20.
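The arithmetic behind those two ratings:

\[
\frac{3{,}273 \text{ tests}}{634 \text{ graduates}} \approx 5.162
\qquad
\frac{26 \text{ tests}}{5 \text{ graduates}} = 5.200
\]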

We count all tests taken at the school, and not just those taken by seniors.

Indeed it is, and if I could quantify all those other things in a meaningful way, I would give it a try. But teacher quality, extracurricular activities and other important factors are too subjective for a ranked list. Participation in challenging courses, on the other hand, can be
counted, and the results expose a significant failing in most high schools—notice that so far only about 5 percent of the public high schools in the United States qualify for the NEWSWEEK list. I think that this is the most useful quantitative measure of a high school, and one of its
strengths is the narrowness of the criteria. Everyone can understand the simple arithmetic that produces a school's Challenge Index rating and discuss it intelligently, as opposed to ranked lists like U.S. News & World Report's "America's Best Colleges," which has too many factors for me to comprehend.

As for the words "top" and "best," they are always based on criteria chosen by the listmaker. My list of best film directors may depend on Academy Award nominations. Yours may be based on ticket sales. I have been very clear about what I am measuring in these schools. You may not like my criteria, but I have not found anyone who understands how high schools work and does not think AP or IB test participation is important. I often ask people to tell me what quantitative measure of high schools they think is more important than this one. Such discussions can be interesting and productive.

For instance, Andy Rotherham and Sarah Mead of the think tank Education Sector argue that some of the schools on the NEWSWEEK list have low average test scores and high dropout rates and do not belong on any best high schools list. My response is that these are all schools with lots of low-income students and great teachers who have found ways to get them involved in college-level courses. We have as yet no proven way for educators in low-income schools to significantly improve their average test scores or graduation rates. Until we do, I don't see any point in making them play a game that, no matter how energetic or smart they are, they can't win.

We do not include any magnet or charter high school that draws such a high concentration of top students that its average SAT or ACT score significantly exceeds the highest average for any normal-enrollment school in the country. This year, that meant such schools had to have an average SAT score below 1,300 on the reading and math sections, or an average ACT score below 27, to be included on the list.
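For readers who want to check a school against that screen, here is a minimal sketch in Python. The reading that each cap applies to whichever average a school reports is my assumption, and the function and parameter names are hypothetical:

def eligible_for_main_list(avg_sat=None, avg_act=None):
    """Sketch of the 2007 selectivity screen described above.

    Assumption: a school is judged on whichever test average it
    reports; a school over either cap goes on the Public Elites
    list instead of the main list.
    """
    SAT_CAP = 1300  # average of the reading and math sections
    ACT_CAP = 27    # average composite score

    if avg_sat is not None and avg_sat >= SAT_CAP:
        return False  # too selective for the main list
    if avg_act is not None and avg_act >= ACT_CAP:
        return False
    return True

# Example: a school averaging SAT 1180 stays on the main list;
# one averaging ACT 29 would move to the Public Elites list.
print(eligible_for_main_list(avg_sat=1180))  # True
print(eligible_for_main_list(avg_act=29))    # False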

The schools you name are terrific places with some of the highest average test scores in the country, but it would be deceptive for us to put them on this list. The Challenge Index is designed to honor schools that have done the best job in persuading average students to take college-level courses and tests. It does not work with schools that have no, or almost no, average students. The idea is to create a list that measures how good schools are in challenging all students, and not just how high their students' test scores are. The high-performing schools we have excluded from the list all have great teachers, but research indicates that high SAT and ACT averages are much more an indication of the affluence of the students' parents than of what the school itself adds.

Using average SAT or ACT scores is a change from the previous system we used, which excluded schools that admitted more than half of their students based on grades and test scores. That system penalized some inner-city magnet schools that had high Challenge Index ratings but whose average SAT or ACT scores were below those of some normal-enrollment suburban schools, so we switched to a system that we consider fairer and clearer.

We do, however, acknowledge such schools this year on our Public Elites list, which names the 19 schools that missed the main list because their average SAT or ACT scores were too high.

You make a very important point. These are all exceptional schools. Every one is in the top 5 percent of 27,000 American high schools measured this way. They have all shown remarkable AP and IB strength. I am mildly ashamed of my reason for ranking, but I do it anyway. I want people to pay attention to this issue, because I think it is vitally important for the improvement of American high schools. Like most journalists, I learned long ago that we are tribal primates with a deep commitment to pecking orders. We cannot resist looking at ranked lists. It doesn't matter what it is—SUVs, ice-cream stores, football teams, fertilizer dispensers. We want to see who is on top and who is not. So I rank to get attention, with the hope that people will argue about the list and in the process think about the issues it raises.

If I thought that those districts that pay for the test and require that students take it were somehow cheating, and giving themselves an unfair advantage that made their programs look stronger than they were, I would add that asterisk or discount them in some way. But I think the opposite is true. Districts that spend money to increase the likelihood that their students take AP or IB tests are adding value to the education of their students. Taking the test is good. It gives students a necessary taste of what college demands. It is bad that many students in AP courses avoid taking the tests just because they prefer to spend May of their senior year sunning themselves on the beach or buying their prom garb. (Since AP and IB tests must be graded by human beings, the results arrive long after June report cards, so they do not count as part of the class grade, and most schools allow students to skip the AP test if they wish. IB is organized differently, and few IB students miss those exams.)

If paying test fees persuades students, indeed forces them, to take the test, that is good, just as it is good if a school spends money to hire more AP teachers or makes it difficult for students to drop out of AP without a good reason. I was happy to see that when suburban Fairfax County, Va., began to pay the test fees and require that the tests be taken, many other districts in the Washington area followed suit.

I would like to. NEWSWEEK has tried to count what are often called dual-enrollment exams, those given in high-school courses supervised by local colleges. But it proved to be too difficult. The problem is that we want to make sure that the dual-enrollment final exams are comparable to the AP, IB and Cambridge exams that define the index. We tried to set a standard—we would only count dual-enrollment final exams that were at least two hours long and had some free-response questions that required thought and analysis, just as the AP, IB and Cambridge exams do. And we wanted to be sure that the exams were written and scored by people who were not employed by the high school so that, like AP, IB and Cambridge exams, they could not be dumbed down to make the school or the teacher look good. Some high schools provided us with the necessary information, but most could not. It was too difficult for them to persuade the colleges managing the exams to help them, or they did not have the staff to gather the data we required. We did not want to be counting extra exams only for those schools that could afford extra staff, so we decided to stay with AP, IB and Cambridge, while we thought about better ways to count dual enrollment.

The more schools I have examined, the more I have come to believe in the power of high-school cultures, which are different in different parts of the country for reasons that often have little to do with the usual keys to high-school performance—the incomes and educations of the parents.

In 2005, California, New York, Texas and Florida led the nation, in that order, in number of schools on the list. That was no surprise. But it was more difficult to explain why much less populous Virginia and Maryland came right after those megastates in the number of challenging high schools, and why Iowa, with some of the highest test scores in the country, had only three high schools that met the criteria. Six states had no schools on the list at all.

My tentative explanation is that some areas have had the good fortune to get school boards and superintendents who see that they serve their students better by opening up AP and IB to everyone. Once a few districts in a state do that, others follow. And once a state has success with AP or IB, its neighboring states begin to wonder why they aren't doing the same.

My children attended both public and private high schools, and I share your interest in rating both varieties. The public schools are very quick to give NEWSWEEK and The Washington Post the data we need. They are, after all, tax-supported institutions. The private schools, sadly, have resisted this and most other attempts to quantify what they are doing so that parents could compare one private school to another. The National Association of Independent Schools has even warned its members against cooperating with reporters like me who might be trying to help what they call consumer-conscious parents like you. They say that parents should reject such numerical comparisons and instead visit each private school to soak up its ambiance. I am all for visits, but I think those private schools are essentially saying that parents like you and me are too stupid to read a list in a magazine or newspaper and reach our own sensible conclusions about its worth.

A few private schools have shared their data with me, but since the majority are resisting, any list of private schools would be too incomplete to be very useful.

No. Keep in mind, as I said before, that every school on the list is in the top 5 percent of all American high schools measured in this way. If you want to gauge a school's progress, look at its rating, not its ranking. Many schools drop in rank each year because there is so much more competition to be on the list, yet at the same time improve their ratio of tests to graduating seniors. That means they are getting better, and it makes the rank even less significant. Also, almost all schools on the list drop in rank in the updated Web-site version of the list a few weeks after the list first appears in NEWSWEEK, because we add schools that get their data to us after the deadline.

I realize it is my fault that people put too much emphasis on the ranks. If I didn't rank, this would not happen. I was startled that people even remembered what their school's rank was in previous years. The important thing is that your school is on the list, not where on the list
it is. As for why I rank, when it creates so much trouble, see question 7.

There is a bit, but only a small bit, of truth in what you have heard. Many selective colleges are making it harder to get credit for taking AP and IB courses and tests in high schools, but their reasons for doing so are unclear. Former philosophy professor William Casement, who has analyzed this trend, says he thinks AP courses and tests are not as good as the introductory college courses and tests they were designed to substitute for, and that is why those colleges are pulling back. There is, unfortunately, almost no evidence to back up his theory. In fact, the colleges have done almost no research on the quality of their introductory courses, while the College Board has expert panels that regularly compare AP courses with college intro courses to make sure they are on the same level.

Some high-school educators think the colleges don't like to give AP credit because it costs them revenue. There is no evidence to support that theory, either, but it is clear that selective college-admissions offices are very happy to see AP or IB courses on applicants' transcripts.

As for high schools rejecting AP, there are exactly 12 that have done so. They are all private, all very expensive, and they represent 3/100ths of 1 percent of the nation's high schools. Thousands of high schools, by contrast, are opening more AP or IB courses, which they say are the only national programs that provide a high and incorruptible standard for student learning.

Because AP and IB exams are written and scored by outside experts, it is impossible to water down an AP or IB course without exposing what you have done—unless of course you make sure very few of the students take the tests. That is why we count tests, not courses, for the index. And as for teacher creativity, AP and IB encourage it more than any other high-school program I know. The tests reward creative thinking and original analysis. Creative teachers who produce creative students find their AP and IB test scores are very high.

They are smart and hard-working educators who are entitled to their opinions. But so are those AP teachers who tell me the list helps them gain support for their students. Here is what Brian Rodriguez, who teaches AP American and European history at Encinal High School in Alameda, Calif., told me about the impact of AP on non-AP courses in a school with many low-income and minority students:

"AP teachers rarely teach only AP classes. They have many other responsibilities to their department, collaborative educational focus groups and as liaison to our middle schools. The AP techniques honed in years of teaching or gleaned from seminars are used in the regular
classrooms (at a slower pace, but no less effectively). For instance, I am teaching a unit on Vietnam to my regular U.S. history class. I will use the PowerPoint lecture I developed for my AP class on that subject, teach the students to take notes, use the Socratic-method discussion techniques so effective in AP classes, and then teach writing methods and tips I use so effectively in my AP classes. In addition, I will teach these techniques to our new teachers at history department meetings, prepare a pamphlet on multiple-choice testing techniques that was distributed to all teachers at our school to prepare them for state standardized testing and then visit our local middle schools to make a presentation to the teachers there. In
summary, AP teaching can be schoolwide, and raises all the ships in the harbor."
