Before it was a machine, a computer was a person, usually a woman, doing arithmetic by hand.
The word computer first referred to a person who performed calculations. The earliest known usage in English dates to 1613, in a book by Richard Braithwaite titled The Yong Mans Gleanings, where it described someone who computes or reckons.1
Human computers performed the mathematical labor behind astronomical tables, artillery range calculations, and census data. During World War II, the role expanded dramatically. The U.S. Army employed hundreds of women as computers at the Ballistic Research Laboratory in Aberdeen, Maryland, calculating firing tables for artillery weapons. Six of them (Kay McNulty, Betty Jennings, Betty Snyder, Marlyn Meltzer, Fran Bilas, and Ruth Lichterman) were selected in 1945 to program the ENIAC, one of the first electronic general-purpose computers.2
NASA and its predecessor agency, the NACA, employed African American women as human computers at the Langley laboratory in Hampton, Virginia, beginning in the 1940s. Dorothy Vaughan, Mary Jackson, Katherine Johnson, and their colleagues calculated flight trajectories for the agency's early space missions. Johnson's orbital calculations were essential to John Glenn's 1962 mission; Glenn reportedly insisted that she personally verify the electronic computer's numbers before he would fly.3
The transition from human computers to electronic machines occurred gradually through the 1940s and 1950s. By the 1960s, the word computer referred almost exclusively to machines. The people who had done the same work were reclassified as mathematicians, analysts, or programmers. The term that once named a skilled human role became the name of the device that replaced it.4