Ten men decided what every American teenager should learn, and most of their decisions still hold.
On July 9, 1892, the National Education Association appointed a committee of ten educators, chaired by Charles William Eliot, president of Harvard, to address the chaos of American secondary education. High schools across the country taught different subjects, in different sequences, with different standards. Colleges could not evaluate applicants because no two transcripts meant the same thing.1
The committee organized nine subject-area subcommittees, each staffed with educators who surveyed current practices, debated standards, and submitted recommendations. The final report, published in 1893, proposed four courses of study for high schools: Classical, Latin-Scientific, Modern Languages, and English. All four required substantial work in core academic subjects, including history, foreign languages, mathematics, and natural sciences.2
The report's defining principle was that rigorous academic education benefited all students alike, whether or not they planned to attend college. Eliot rejected the idea of a separate, less demanding curriculum for students headed to work rather than university.3
G. Stanley Hall, president of Clark University and a prominent psychologist, attacked the report as elitist, arguing that most high school students were incapable of rigorous academic work and should be sorted into vocational tracks based on their aptitudes.4
When the committee convened, fewer than six percent of American teenagers were enrolled in high school; by 1930, more than half were. The committee's framework, expanded and modified by subsequent reformers, became the structural foundation of the American high school. The Carnegie Unit, introduced in 1906 as a standardized credit measurement, reinforced the committee's emphasis on uniform, time-based coursework.5