Sinister Analogy

Responding to a neutral point by invoking an extreme negative historical parallel, letting the dark association do the arguing.

"And Stalin had Lysenko."

"That's exactly what they said about Theranos."

"You know who else thought centralized planning was efficient?"

"Sounds a lot like what Enron promised."

Why It's Unproductive

It sounds historically informed, but it substitutes association for argument. Drawing a line from "Einstein had collaborators" to "Stalin had Lysenko" isn't a rebuttal; it's guilt by the thinnest possible connection. It forces the other person to defend against an extreme comparison they never invited, and any attempt to push back ("this isn't like Stalin") sounds like minimizing a historical atrocity. The dark reference does the heavy lifting, so the person making it never has to explain the actual connection.

The Better Move

If a historical parallel actually applies, explain the specific mechanism that connects it to the current situation. Name the risk in your own words instead of letting a dark reference carry the argument.

Why It's Better

Spelling out the actual concern gives people something to engage with. A named risk can be discussed and evaluated; a sinister comparison can only be accepted or rejected.


Examples

OP: "Humans have always worked with tools and assistants. Even Einstein relied heavily on collaborators." Antipattern: "And Stalin had Lysenko." Better: "True, but the dynamic changes when the 'assistant' is opaque and controlled by a single company. The collaboration analogy has limits."

OP: "The rate of progress here is similar to what we saw with SpaceX. Skeptics kept moving the goalposts." Antipattern: "This is indistinguishable from the Theranos conversation. For every grifter who delivers, how many fail?" Better: "SpaceX had publicly verifiable milestones, though. What's the equivalent here? I'd feel better if the benchmarks were independently reproduced."

OP: "This new urban planning tool uses AI to optimize traffic flow across the whole city grid." Antipattern: "Sounds a lot like what Enron promised about energy markets. Centralized optimization by a black box." Better: "Citywide optimization sounds great in theory, but what happens when the model gets a bad input? I'd want to know the failure modes before rolling it out."