Insights from the March SAT: What Test-Takers Need to Know
The College Board has successfully transitioned from the static paper SAT of old to the digital adaptive SAT. Between 1926 and 2023, tens of millions of students took the SAT in paper booklets. On March 9, 2024, some 200,000 US students took the SAT on a laptop or tablet, and 99.8% of them, all but roughly 400 students, were able to complete the exam and submit their scores. To accomplish this, the College Board built on its experience administering the digital SAT to 300,000 international students over the past year and the digital PSAT to over 1.5 million students in October.
Smooth sailing on the technical side
The relative absence of technical issues at the inaugural domestic digital SAT reflects lessons learned from prior administrations of College Board tests. In May 2020, when schools nationwide were shuttered, over 30,000 students struggled to complete and submit their digitally administered AP exams. And on the morning of Wednesday, October 11, 2023, tens of thousands of students were adversely affected when College Board servers were overwhelmed for nearly 90 minutes as 1.1 million students attempted to sit for the digital PSAT.
The March SAT, in contrast, went off without a hitch, and by and large, student feedback on the testing experience has been positive. Feedback on the difficulty, timing, and scoring of the March test, however, has been more mixed. The March SAT was, by design, a more difficult test, and it likely previews the SATs to come.
Calibration: March was harder than most of the College Board’s Bluebook practice tests
Any student who took Bluebook practice tests 1, 2, or 3 may have developed an inflated sense of confidence going into March. The first three practice tests had fewer challenging items in the upper adaptive modules (particularly in Math) than the March test did, and they rewarded students with inflated scaled scores. Some students who were scoring in the mid-to-upper 1400s on the early Bluebook tests ended up with scores in the low-to-mid 1300s in March. Bluebook test 4, in contrast, is closer in difficulty to the March SAT, and student performance on the October digital PSAT and the paper SATs administered in 2023 aligns better with performance on the March test. Only the early Bluebook tests were miscalibrated.
Déjà vu, all over again
The fact that the College Board's early practice material was too easy mirrors our experience with the 2016 SAT redesign. In the 2015-2016 lead-up to that March SAT, students practiced on the College Board's four initial practice tests. Those tests had a surprisingly short shelf life: within a couple of years of launch, the College Board officially retired two of the four forms, stating that they were no longer accurate or reliable predictors of performance. We suspect we'll see similar revision of practice tests as the College Board gathers more student data.
Better-calibrated practice tests have already arrived, and more are coming
On Wednesday, March 20, eleven days after the first domestic digital SAT administration, the College Board released Bluebook practice tests 5 and 6, which are more in line with the difficulty of the March test. The College Board has promised two additional practice tests, which should be similarly calibrated.
A harder test is necessary to smooth out the scoring distribution
If a test is too easy, the curve can be punishing for students who miss one or two items. Over the years there have been SATs where missing a single item could drop a student from a score of 800 to a 750, and missing a second item could drop the student to a 720. Colleges draw distinctions between students across the scoring spectrum, and more granular differentiation has value in the admissions process. To meet the needs of admissions offices and better differentiate students in the 700-800 range, the College Board needed a more challenging test: one with enough difficult items to make meaningful distinctions between scores of 800, 790, 780, and on down the curve. The March SAT had an abundance of harder items in its more challenging adaptive modules.
The new scoring is a bit more complicated given the algorithmic scoring of items, known as "item response scoring" and rooted in item response theory (IRT): different items receive different relative weights, and landing in the easier or the harder adaptive module affects scoring. Matters are further complicated by the fact that the College Board no longer provides detailed score reports, offering instead the vague seven bars of its "Skills Insight" meter. Given this relative lack of transparency, we will never know with certainty how the specific items a student missed translated into scaled scores.
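To make the idea concrete, here is a minimal sketch of a two-parameter logistic (2PL) item response model, the standard textbook formulation in this family of methods. The College Board has not published its actual model or item parameters, so every number, function name, and the ability scale below is hypothetical, chosen only to show why two students with the same raw score can land at different estimates.

```python
import math

def p_correct(theta, a, b):
    """2PL model: probability that a student with ability `theta` answers
    an item with discrimination `a` and difficulty `b` correctly."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def log_likelihood(theta, items, responses):
    """Log-likelihood of a response pattern (1 = right, 0 = wrong)
    at a candidate ability level `theta`."""
    total = 0.0
    for (a, b), r in zip(items, responses):
        p = p_correct(theta, a, b)
        total += math.log(p if r else 1.0 - p)
    return total

def estimate_ability(items, responses):
    """Maximum-likelihood ability estimate via a coarse grid search.
    Real scoring engines use proper optimizers; a grid keeps the idea visible."""
    grid = [x / 100.0 for x in range(-400, 401)]  # theta from -4.00 to 4.00
    return max(grid, key=lambda t: log_likelihood(t, items, responses))

# Hypothetical item parameters: (discrimination, difficulty).
items = [(1.2, -1.0), (1.0, 0.0), (1.5, 1.0), (1.8, 2.0)]

# Two students with the same raw score (3 of 4) but different misses.
missed_easiest = [0, 1, 1, 1]   # wrong only on the easiest item
missed_hardest = [1, 1, 1, 0]   # wrong only on the hardest item

print(estimate_ability(items, missed_easiest))  # ~2.10
print(estimate_ability(items, missed_hardest))  # ~1.64
```

The takeaway from the sketch: the student who misses only the easiest item still demonstrates command of the hard ones and receives a higher ability estimate than the student who misses the hardest item, even though both answered three of four correctly. Under item-weighted scoring, raw score alone no longer determines the scaled score.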
How hard was the Math section?
Most of the complaints regarding the March administration stemmed from the difficulty of the Math section. The advanced Math module had more challenging items than many students were expecting. The questions tested familiar concepts but combined them in original ways that required creative thinking and a robust understanding of math topics. One particular geometry question required students to:
- Set up a multi-figure diagram from words rather than figures
- Use the area of a right triangle to back out the length of a side
- Use the Pythagorean theorem
- Use similarity to assign values to a second right triangle
No single step of this process is outside the SAT math canon; it's the combination of elements, the multiplicity of steps, that provides the challenge, requiring students to piece together distinct math concepts to solve a single problem.
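We can't reprint the actual item, so the numbers below are invented, but a worked example runs through the same four-step chain. Suppose a right triangle has area 24 and one leg of length 6, and a similar triangle has a shorter leg of length 9:

```latex
% Hypothetical numbers; illustrative only, not the actual March item.
\[
\begin{aligned}
\tfrac{1}{2}(6)(b) &= 24 \;\Rightarrow\; b = 8
  && \text{(use the area to back out the missing leg)} \\
c &= \sqrt{6^2 + 8^2} = 10
  && \text{(Pythagorean theorem)} \\
k &= \tfrac{9}{6} = 1.5 \;\Rightarrow\; \text{sides } 9,\ 12,\ 15
  && \text{(similarity scales the second triangle)}
\end{aligned}
\]
```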
The math was predictable in the concepts it assessed, but the novelty of the presentation created the challenge. The problem-solving skills these advanced questions demand can be developed: students need to learn to think critically and to apply different strategies when they encounter novel questions. It's impossible to predict which advanced items will appear on a given test, but students can practice problem-solving, resourcefulness, and trying multiple approaches.
The Bluebook tests alone are not enough
Some students were relying on their mastery of the four Bluebook tests to prepare them fully for March, but this was not a recipe for success. The collection of challenging items in those four practice tests was too small a sample to prepare students for all of the possible iterations and combinations of key concepts. Students who had mastered those several dozen hard items may still have lacked the flexibility and creativity needed to adapt to the novel presentations in March.
In addition to the novelty of the questions, there were objectively more hard questions in March, which demanded consistency, stamina, and thoughtful pacing.
Students had more time per question, and they needed it
While students have more time per question on the digital SAT, they will likely need that added time, especially in the harder adaptive Math module.
The College Board gives students a generous 95.5 seconds per Math question, compared to the 60 seconds per question on the ACT Math section. That is 59% more time per question, and in many cases students will need all of it: difficult SAT math questions often take time to set up and require multiple computational steps. Students shooting for a top Math score will need to allocate time skillfully for the toughest questions at the end of the second module. One of the best ways to do this is to practice solving easy and medium-difficulty problems with a focus on efficiency, banking time for the hard ones.
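For reference, both per-question figures follow directly from the published section lengths: the digital SAT Math section allows 70 minutes for 44 questions (two 35-minute modules of 22 questions each), while the ACT Math section allows 60 minutes for 60 questions:

```latex
\[
\frac{70 \times 60\,\text{s}}{44} \approx 95.5\,\text{s/question},
\qquad
\frac{60 \times 60\,\text{s}}{60} = 60\,\text{s/question},
\qquad
\frac{95.5}{60} \approx 1.59 .
\]
```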
While timing challenges were most frequent in Math, we also heard from students who struggled with timing on the verbal section, particularly the more advanced reading passages. Students will need to practice time management for the hardest passages, using Bluebook tests 4-6 to rehearse pacing.
Scores are back, now what?
The March scores came back on Friday, March 22, ten business days after the test administration. We anticipate that, with time, scores will be returned significantly faster. The score reports matched those from the PSAT and international digital SAT administrations and revealed little detailed information useful to students.
Our students' score increases on the March SAT fell within a fairly standard range. Gains from paper to digital looked comparable to those we had seen from paper to paper, supporting the College Board's claim, validated in its pilot study, that the two versions of the test can be viewed on the same scale.
We have always recommended that students plan to test twice and leave time for a potential third test. While most students who took the March test had already taken the digital PSAT, the College Board has made it clear that the PSAT, especially on the Math side, does not include the hardest question types. Students now have the advantage of better-calibrated practice materials as they prepare for the May and June tests and beyond.
The digital SAT is an improved test, with many advantages over its paper-based predecessor, and the College Board has managed to create a reliable, even superior, testing experience. Students who were waiting on the sidelines can now move forward with confidence and prepare for a test that can positively impact their admissions outcomes and scholarship opportunities.
Applerouth is here to help students looking for support with their test preparation and has developed three proprietary digital practice tests with robust and detailed student feedback to help students learn from their mistakes and take steps towards mastery of the SAT. Don’t hesitate to reach out if you need help—schedule a call with one of our Program Directors or give us a call at 866-789-PREP (7737) today.