Machines Are People
metaphor
Source: Social Roles → Manufacturing
Categories: cognitive-science, linguistics
From: Master Metaphor List
Transfers
The reverse of PEOPLE ARE MACHINES. Where that metaphor mechanizes humans, this one humanizes machines — attributing intention, personality, temperament, and social behavior to artifacts. The metaphor makes mechanical devices comprehensible by mapping human social categories onto them: machines can be stubborn, cooperative, temperamental, reliable, loyal, or treacherous.
Key structural parallels:
- Willfulness and cooperation — machines that work properly are cooperative; machines that malfunction are obstinate. “The car won’t start” attributes refusal to an engine. “The printer is being difficult” maps social uncooperativeness onto a paper-feed mechanism. The metaphor makes mechanical failure feel like a relationship problem rather than an engineering one.
- Temperament and personality — machines acquire stable character traits. “This is a temperamental engine.” “That’s a friendly interface.” “It’s a beast of a machine.” The metaphor maps dispositional psychology onto artifacts, producing an expectation that each machine has its own character that must be learned and accommodated.
- Needs and care — machines need things the way people do. “The engine needs rest.” “Feed the meter.” “The server is hungry for memory.” Maintenance becomes caregiving. The mechanic is not just fixing a device; they are tending to something that has needs.
- Sickness and health — machines get sick and recover. “The computer has a virus.” “The engine is coughing.” “It’s running healthy now.” The metaphor imports the sick/well binary from human medicine, making malfunction feel like illness rather than damage — something the machine is suffering from, not merely exhibiting.
- Life and death — machines are born, live, and die. “The battery is dead.” “Kill the engine.” “The project gave birth to a new machine.” “That car has a lot of life left in it.” The metaphor maps the entire life course onto the artifact lifecycle, producing mourning when a beloved machine is finally discarded.
Limits
- Machines have no interiority — the metaphor attributes subjective experience where none exists. “The computer doesn’t want to cooperate” is useful shorthand, but it imports a model of intentional resistance that obscures the actual cause. When people anthropomorphize machines, they often stop troubleshooting the mechanism and start negotiating with the object — talking to it, pleading, threatening — none of which affects the engineering problem.
- The metaphor obscures design responsibility — a machine that “refuses” to work is a machine that was designed or maintained inadequately. Personifying the artifact deflects blame from its makers. “The software is being difficult” sounds like the software’s fault; “the developer wrote confusing logic” locates the problem correctly. The personification metaphor systematically hides human design choices behind machine “personality.”
- Affection for machines distorts economic reasoning — people who name their cars, feel guilt about replacing old computers, or mourn discarded tools are importing social obligations into a domain where they do not apply. The metaphor makes rational equipment replacement feel like betrayal. Sunk-cost attachment to failing machines often traces to the personification that made the machine a companion.
- It scales poorly to complex systems — personifying a single tool (a hammer, a car) is intuitive. Personifying a distributed computing cluster, an electrical grid, or a supply chain produces incoherent results. The metaphor requires a unified agent with a single personality, and complex systems have no such unity. “The network is angry” is poetic but diagnostically useless.
- The metaphor can inhibit automation acceptance — if machines are people, then replacing human workers with machines is not optimization but substitution of one kind of person for another. This framing both humanizes the machine (making it seem like a colleague) and threatens the human (making them seem replaceable by a peer). The metaphor complicates industrial policy by importing social ethics into engineering decisions.
Expressions
- “The car won’t start” — mechanical failure as willful refusal
- “The printer is being difficult” — malfunction as social uncooperativeness
- “The computer has a virus” — software malfunction as biological illness
- “Kill the engine” — shutting down as ending a life
- “The battery is dead” — power depletion as death
- “She’s a good ship” — gendered personification of a vessel, with loyalty and reliability as character traits
- “The engine is purring” — smooth operation as contented animal/person
- “This machine has a mind of its own” — unpredictable behavior as autonomous agency
- “Feed the meter” — providing input as providing sustenance
- “The server is down” — system unavailability as illness or collapse, mapped from a person who has fallen
Origin Story
MACHINES ARE PEOPLE appears in the Master Metaphor List (Lakoff, Espenson & Schwartz, 1991) as the complement to PEOPLE ARE MACHINES. The two metaphors form a pair: one mechanizes the human, the other humanizes the mechanical. But they are not symmetric in function. PEOPLE ARE MACHINES is primarily evaluative — it imposes productivity metrics on humans. MACHINES ARE PEOPLE is primarily explanatory — it makes unfamiliar mechanisms comprehensible by mapping them onto the most familiar system we know: other people.
The tendency to personify machines has deep roots. Animism — attributing agency to objects — is a cognitive default that appears in early childhood and persists into adulthood. Dennett (1987) calls this the “intentional stance”: the strategy of interpreting a system’s behavior by attributing beliefs, desires, and intentions to it. For complex machines, the intentional stance is often the fastest way to predict behavior, even when it is mechanistically wrong. We say “the thermostat wants to reach 72 degrees” because modeling it as an agent with a goal is computationally cheaper than modeling the bimetallic strip.
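To make the thermostat point concrete, here is a minimal sketch (not from the source; every name and constant in it is an illustrative assumption) contrasting a prediction made from the intentional stance with one made from a crude mechanistic model of the bimetallic strip. Both predict the same on/off behavior, but the agent-with-a-goal model needs only the "goal," not the physics.

```python
# Hypothetical sketch: intentional stance vs. mechanistic model of a thermostat.
# All function names, parameters, and constants are invented for illustration.

def intentional_stance_predict(room_temp_f: float, goal_f: float = 72.0) -> bool:
    """Treat the thermostat as an agent: it 'wants' the room at goal_f,
    so it calls for heat whenever the room is colder than its goal."""
    return room_temp_f < goal_f

def mechanistic_predict(room_temp_f: float) -> bool:
    """Model the bimetallic strip instead: the strip deflects as temperature
    changes, and the contacts close when deflection falls below the gap.
    The physical constants here are made up for the sketch."""
    set_point_f = 72.0
    bend_per_degree = 0.02   # assumed mm of deflection per degree F
    contact_gap = 0.0        # contacts close at zero or negative deflection
    deflection = (room_temp_f - set_point_f) * bend_per_degree
    return deflection < contact_gap

if __name__ == "__main__":
    for temp in (65.0, 72.0, 75.0):
        print(temp,
              intentional_stance_predict(temp),  # agent-with-a-goal prediction
              mechanistic_predict(temp))         # physical-mechanism prediction
```

The two functions agree on every input; the difference is how much you need to know about the device to write them, which is Dennett's point about why the intentional stance is the cheaper predictive strategy.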
The metaphor has intensified with digital technology. Software agents, intelligent assistants, and AI systems are explicitly designed to behave like people, collapsing the metaphorical distance. When a chatbot says “I’m thinking,” the metaphor is no longer just a cognitive shortcut; it is a design choice that exploits the personification instinct.
References
- Lakoff, G., Espenson, J. & Schwartz, A. Master Metaphor List (1991), “Machines Are People”
- Lakoff, G. & Johnson, M. Metaphors We Live By (1980) — ontological metaphors and personification
- Dennett, D.C. The Intentional Stance (1987) — predicting system behavior by attributing beliefs and desires
- Reeves, B. & Nass, C. The Media Equation (1996) — experimental evidence that people treat computers as social actors
Related Entries
Structural Neighbors
Entries from different domains that share structural shape. Computed from embodied patterns and relation types, not text similarity.
- Regime Shift (ecology/metaphor)
- Virus (medicine/metaphor)
- Reserves and Commitment (military-history/mental-model)
- Frankenstein Is Technology Risk (science-fiction/metaphor)
- Birthday Paradox (probability/mental-model)
- Dangerous Beliefs Are Contagious Diseases (contagion/metaphor)
- Trophic Cascade (ecology/metaphor)
- Theories Are Cloth (textiles/metaphor)
Structural Tags
Patterns: link, center-periphery, matching
Relations: cause, transform
Structure: network
Level: generic
Contributors: agent:metaphorex-miner