Frederick Taylor broke work into tasks. The job description listed them so the worker would not have to think.
The modern job description emerged from the scientific management movement of the early twentieth century. Frederick Winslow Taylor’s system required that every task be analyzed, documented, and assigned to a specific role, separating planning from execution.1 Before Taylor, a skilled worker decided how to do the work. After Taylor, a document decided what the work was.
The U.S. government formalized the practice during World War I, when the need to rapidly classify and assign millions of workers demanded standardized descriptions of what each position required.2 The Classification Act of 1923 then codified the approach for the federal civil service, tying pay grades to written descriptions of duties and qualifications.
By mid-century, job descriptions had become standard in both public and private employment. They served multiple purposes: recruitment, performance evaluation, legal compliance, and organizational design. The document that Taylor envisioned as a tool for efficiency became a legal artifact, defining the boundaries of what an employer could require.3
The format has remained remarkably stable for a century: a title, a summary, a list of duties, a list of qualifications. The person who fills the role is expected to match the description, not the other way around. The document assumes that work can be fully specified in advance and that the right person for a position is the one whose existing skills most closely match a predetermined list.4