Taking the SAT or ACT is a rite of passage for high school students applying to college. Millions of juniors and seniors take at least one of the tests every year, albeit reluctantly, and most colleges still require it to be considered for admission. But a growing number of colleges are putting much less emphasis on test scores. Many have made the test entirely optional. The change comes as more schools question the usefulness of standardized tests in predicting college readiness.
Want your students to discuss this Above the Noise episode with peers from around the country? Sign up for KQED Learn (https://learn.kqed.org/) and join the conversation (https://learn.kqed.org/topics)!
What are the main arguments for and against the SAT/ACT?
Critics of the test say that, unlike grades, it is not a particularly good predictor of college success. Some also say it discriminates against lower-income students, who don't have the same test-prep opportunities.
Supporters of the test say it offers another important data point among the many academic variables that colleges consider. The test, they say, is continually updated to reflect current academic skills and readiness. And there is little hard evidence, they argue, that lower-income students face any measurable disadvantage.
How many colleges have stopped requiring it?
Roughly 1,000 schools, mostly liberal arts colleges, give the SAT/ACT limited consideration. Some are "test-optional," which means applicants can decide whether to submit test scores.
When did colleges start requiring the test?
It began after WWI, when Harvard adapted an army intelligence exam to attract and identify qualified applicants outside the East Coast boarding school mold. Other prestigious schools soon followed, and by the late 1940s, most colleges required the SAT.