Ted Gott’s Exam Index

My friend Ted Gott has just prepared a new index to the AP Calculus exam free-response questions from 1998 to the present. Expanding on an earlier spreadsheet by Mark Howell, Ted has referenced all the questions to the Learning Outcomes (LO) and Essential Knowledge (EK) statements of the new Course and Exam Description, and has included live links to the individual questions and their scoring standards.

AND HERE IT IS: FRQ Index by topic 1998 to 2017 (Updated to include 2017 FRQs. August 2, 2017)

And for multiple-choice questions: MC unsecure Index by topic 1998 to 2018

Clicking on the arrows at the top of each column allows you to search by LOs and EKs, and to find other questions on the same topic. The rightmost column provides a direct link to the scoring standard for that question.

What a great resource. THANK YOU TED !


AP Exam Review

Don’t panic! It is not yet time to start reviewing.

I try to keep these posts ahead of the typical AP Calculus timeline so you can have time to think them over and include what you want to use from them (if anything).

Over the next 6 weeks I will post several times each week. The posts will be previous posts on reviewing, slightly revised and updated. Today’s post is “Ideas for reviewing for the AP Exam,” originally posted on February 25, 2013.

Ideas for reviewing for the AP Exam

Part of the purpose of reviewing for the AP Calculus exams is to refresh your students’ memory of all the great things you’ve taught them during the year. The other purpose is to inform them about the format of the exam, the style of the questions, the way they should present their answers, and how the exam is graded and scored.

Using AP questions all year is a good way to accomplish some of this. Look through the released multiple-choice exams and pick questions related to whatever you are doing at the moment. Free-response questions are a little trickier since the parts of the questions come from different units. These may be adapted or used in part.

At the end of the year I suggest you review the free-response questions by type – table questions, differential equations, area/volume, rate/accumulation, graph, etc. That is, plan to spend a few days doing a selection of questions of one type so that students can see how that type of question can be used to test a variety of topics. Then go on to the next type. Many teachers keep a collection of past free-response questions filed by type rather than by year. This makes it easy to study them by type.

In the next few posts I will discuss each type (there are 10) in turn and give suggestions about what to look for and how to approach the question.

Simulated Exam

Plan to give a simulated (mock) exam. Each year the College Board makes a full exam available. The exams for 1998, 2003, 2008, and 2012 are available at AP Central, and the secure 2013 – 2016 exams are available through your audit website. If possible, find a time when your students can take the exam in the full 3.25 hours. Teachers often do this on a weekend day or in the evening. This will give your students a feel for what it is like to work calculus problems under test conditions. If you cannot get 3.25 hours to do this, give the sections in class using the prescribed times. Some teachers schedule several simulated exams. Of course, you need to correct them and go over the most common mistakes.

Explain the scoring

There are 108 points available on the exam; each half is worth the same – 54 points. The number of points required for each score is set after the exams are graded.

For the AB exam, the points required for each score out of 108 points are, very approximately:

  • for a 5 – 69 points,
  • for a 4 – 52 points,
  • for a 3 – 40 points,
  • for a 2 – 28 points.

The numbers for the BC exam, again very approximately, are:

  • for a 5 – 68 points,
  • for a 4 – 58 points,
  • for a 3 – 42 points,
  • for a 2 – 34 points.

The actual numbers are not what is important. What is important is for students to know that they can omit or get wrong many questions and still earn a good score. Students may not be used to this (since they skip or get wrong so few questions on your tests!). They should not panic or feel they are doing poorly if they miss a number of questions. If they understand and accept this in advance, they will calm down and do better on the exams. Help them understand that they should gather as many points as they can, and not be too concerned if they cannot get them all. Doing only the first two parts of a free-response question will probably put them at the mean for that question. Remind them not to spend time on something that’s not working out, or that they don’t feel they know how to do.
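The point about “you can miss a lot and still earn a good score” is easy to see if you treat the rough AB cut points above as a simple lookup. A minimal sketch (the real thresholds shift from year to year, so these numbers are illustrative only):

```python
# Approximate AB cut points quoted above (out of 108 raw points).
# These change every year after grading, so treat them as illustrative.
AB_CUTS = [(69, 5), (52, 4), (40, 3), (28, 2)]

def ap_score(raw_points, cuts=AB_CUTS):
    """Return the approximate AP score (5 down to 1) for a raw point total."""
    for cut, score in cuts:
        if raw_points >= cut:
            return score
    return 1

# A student can miss roughly a third of the points and still be in 5 range:
print(ap_score(70))   # 5  (only about 65% of the 108 points)
print(ap_score(45))   # 3
```

Running numbers like these for your class can help convince students that skipping a question they cannot do is not a disaster.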


Print a copy of the directions for both parts of the exam and go over them with your students. For the free-response questions especially, explain the need to show their work, explain that they do not have to simplify arithmetic or algebraic expressions, and explain the three-decimal-place consideration. Be sure they know what is expected of them. The directions are here: AB Directions and BC Directions. Yes, this is boilerplate stuff, but take a few minutes to go over it with your students. They should not have to see the directions for the first time on the day of the exam.

Next Posts:

Thursday February 23, 2017: A list of resources for you and your students in preparation for the exams.

Friday February 24: Using Practice Exams

Tuesday February 28: The Writing Questions on the AP Exams

Friday March 3: Type 1 of the 10 type questions: Rate and Accumulation

Tuesday March 7: Type 2 Linear Motion




“Easier” Exams

There were two questions posted recently on the AP Calculus Community bulletin board. One teacher was concerned that his students took two different forms of the Calculus exam, and the means were not the same. He felt that one group had an easier time than the other. The other writer noted that on his (physics) exam three questions were not counted – there appeared to be only 32 questions instead of the 35 he expected.

My answer, which you may be interested in, was:

It is impossible to make two forms of the same test of equal difficulty. I repeat: It is impossible to make two forms of the same test of equal difficulty. (And if the two forms are equal in difficulty, it is due more to dumb luck than good management.)

What the ETS (Educational Testing Service) does to account for this is to adjust the cut points for the scores (5-4-3-2-1). A form of the exam that is “easier,” in the sense of having higher overall means, also has higher cut points. Regardless of the difficulty of the form of the exam, the score (5-4-3-2-1) reflects the same amount of knowledge of the subject (as best as possible). Any other scheme would certainly not be fair. So, there is no need to be concerned that someone else had an easier exam than your students. They may well have, but their score and your students’ score (5-4-3-2-1) reflect the same knowledge. Your students, and those with the easier exam, will get the scores they earned.

Then I suggested that he consider his students one at a time, without regard to the form of the test they took: check and see whether each student got the score he expected them to get. Keeping in mind that students often surprise or disappoint us, did the students get the scores he anticipated? If, in general, they got the scores he expected without regard to the form, then the ETS did its job.

As to the second concern: The ETS looks at the results individually for each and every question on the exam. If everyone scores very low on a particular question, or if some identifiable sub-group (men, women, one or more minorities) has scores that are way out of line with everyone else, the question is rejected and not scored. The other scores are re-weighted accordingly and the final score (5-4-3-2-1) reflects the same knowledge of the subject. This happens in math and science, but I suspect it happens more often in history, English, and the social sciences.

You might also refer to my recent post of May 12, 2014 Percentages Don’t Make the Grade on this topic.

Updated and revised July 12, 2014.


The AP Instructional Planning Report (IPR) is available today from your audit website, the same place you found your students’ scores. While we all like to see how our students performed individually on the AP exams, the IPR may be of more use to you. It will help you learn where the strengths and weaknesses of your students, and of your teaching, are. Here are some suggestions on what to look for and how to use the report.

The first page contains graphs and data comparing your classes to everyone who wrote the exam. You can see how your students did overall, on the multiple-choice section, and on the free-response section.

The second page is more detailed and more useful in analyzing the results. Here you will find data by topic from the multiple-choice section, and by question for the free-response section. The numbers in the “group mean” column are your students’ average. The “global mean” column is the average of all the students who took this form of the exam.

At a glance you can compare your students with everyone who wrote the exam. If your results are higher, that’s great. If not, keep in mind that this may not be just a reflection on your teaching. If your school has open enrollment and requires that everyone write the exam, then you have to expect scores lower than average. That is not a bad thing for you or your lower-scoring students. Students who write an AP exam and score a one or two still do better in college than students who never took an AP course. By better I mean that they require less remediation, have higher GPAs, and more of them graduate from college on time than students who never tried AP.

Now, try this: for each topic on the list, divide your classes’ mean by the global mean. The result will be greater than one if your students did better than the entire group, or less than one if they did not do as well. Even if the ratios are all under one, look for the topics with higher ratios. These are the topics your students learned well. The topics with lower ratios, compared to the others, are where you need to find a different approach or spend more time next year. This works even if all the ratios are over one.
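That ratio check is easy to automate once you have typed in the numbers from page two of the report. A minimal sketch with made-up topic means (substitute the group and global means from your own IPR):

```python
# Hypothetical IPR numbers -- replace with the (group mean, global mean)
# pairs from your own report's topic table.
topics = {
    "Derivatives":            (2.8, 2.5),
    "Integrals":              (2.1, 2.4),
    "Differential equations": (1.9, 2.3),
}

# Ratio > 1: your students beat the global mean on that topic.
ratios = {name: group / glob for name, (group, glob) in topics.items()}

# List topics weakest-first to see where to spend more time next year.
for name, r in sorted(ratios.items(), key=lambda item: item[1]):
    print(f"{name}: {r:.2f}")
```

Because only the relative sizes of the ratios matter, this comparison works whether your class is above or below the global means overall.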

I first learned this approach from Dixie Ross. Her take on IPRs, which is worth reading, can be found in her blog for AP teachers here.

Percentages Don’t Make the Grade

Well, the AP exams have been written and the dust has settled. Folks are posting their answers on the Community Bulletin Boards. (I never post mine – too many mistakes.) The other thing that always gets discussed at this time of year is whether this year’s exam is more difficult or less difficult than last year’s.

I am sure this year’s was more difficult or less difficult than last year’s because it is impossible to make two exams of the same difficulty.

But it doesn’t matter.

The grades will reflect, as best as possible, that a student knows as much calculus as students with the same score did last year. That’s the important thing.

Because it is impossible for anyone or any group to make two exams of the same difficulty, percentages tell you nothing. The percentage of the number of points that a student earns out of the number possible tells you just that and nothing more. If the tests are not of the exact same difficulty, then percentages are meaningless.

What to do?

The Educational Testing Service (ETS), which writes and administers the AP exams for the College Board, carefully pretests each question. Also, there are a number of questions from last year’s exam on this year’s exam. These questions, called equators, allow ETS to judge the difficulty of the other questions on this year’s exam compared to last year’s. They also allow ETS to judge the ability of this year’s student cohort compared to last year’s. Each question is considered individually. Questions that score poorly, or on which identifiable groups of students do far worse than the entire group taking the test, are not counted in the final score. (For example, in 2008 question AB 19 was not counted; too many missed it.) They compare the results of questions within each exam. With this information they “scale” the exams and decide on the cut points, the high and low raw scores that earn a 5, 4, 3, 2, or 1.

A teacher cannot do so detailed an analysis on a day-to-day basis. Yet we still need to give students grades, so we need to scale our exams. I was quite happy this year using a scheme Dan Kennedy suggested some years ago (see the Resources tab above). It worked quite well for me in BC Calculus and in 8th-grade Algebra 1. Perhaps you have another system.
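The details of Kennedy’s scheme are in the linked resource. As a generic illustration only (not his exact method), one simple way to scale a test is a linear rescaling through two anchor raw scores you choose yourself:

```python
# A generic linear rescaling -- NOT Dan Kennedy's exact scheme, just one
# simple illustration of scaling.  Pick two anchors: the raw score you
# judge to be A work (maps to 90%) and the raw score you judge to be
# passing (maps to 65%), then interpolate linearly between them.

def scale(raw, a_raw, pass_raw, a_pct=90.0, pass_pct=65.0):
    """Linearly map a raw score onto a percentage grade via two anchors."""
    slope = (a_pct - pass_pct) / (a_raw - pass_raw)
    return pass_pct + slope * (raw - pass_raw)

# On a 108-point exam where 69 raw points is A work and 40 is passing:
print(round(scale(69, a_raw=69, pass_raw=40)))  # 90
print(round(scale(40, a_raw=69, pass_raw=40)))  # 65
```

The point of any such scheme is the same as the one made above: the percentage of raw points means nothing by itself; the teacher’s judgment of what a given raw score is worth sets the grade.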

Percentages just don’t make the grade.

Update September 22, 2014: Matthew Braddock, Mathematics Instructor & Webmaster, at the Dr. Henry A. Wise, Jr., High School in Prince George’s County, Maryland sent me a GeoGebra applet that will calculate the grades using Dan Kennedy’s scheme described in the link above. It runs at a website so you do not need GeoGebra on your computer or iPad to use it. Simply enter the information and it will do the rest. Thank you Matthew. 

Update December 3, 2018. The link above is no longer active. This link is to a similar app by Dan Anderson on Desmos. Thank you, Dan. For more on scaling tests, see the post: On Scaling.

Updated September 22, 2014; Kennedy link fixed February 9, 2018.