The first known employment statute made it a crime to refuse work or demand higher wages.
The Statute of Labourers, enacted by the English Parliament in 1351 to reinforce a royal ordinance of 1349, is among the earliest legal instruments governing the terms of employment.1 Written in the aftermath of the Black Death, which had killed roughly a third of England's population, the statute attempted to freeze wages at pre-plague levels and compel able-bodied workers to accept employment at those rates.
The labor shortage created by mass death had given surviving workers leverage for the first time. They could demand higher pay and better conditions. The statute was Parliament's response: making it illegal to refuse work or to leave a position before the end of an agreed term.2
The master-servant laws that followed in England over the next five centuries treated the employment relationship as one of status, not contract. Workers were bound by their social position, not by a negotiated agreement. The modern employment contract, understood as a voluntary exchange of labor for compensation between legal equals, emerged only in the nineteenth century, shaped by industrialization and the rise of contract law.3
In the United States, the doctrine of at-will employment, which allows either party to terminate the relationship at any time for any reason, became the default legal framework by the late nineteenth century. Horace Gray Wood articulated the principle in his 1877 treatise on master-servant law, and courts soon adopted it as the governing rule.4
Much of the world operates under different assumptions. In France, the contrat de travail requires employers to demonstrate cause for dismissal. In Japan, lifetime employment norms at large corporations created an implicit contract that was cultural rather than legal. In Germany, works councils negotiate alongside formal contracts. The written employee handbook became the American instrument for documenting what the contract itself often left unstated.