Reductive Analogy
Dismissing a concern about something new by equating it to something older and familiar that nobody worries about, implying the concern must be unfounded here too.
"Worried about AI-generated code? Should I credit my linter as a co-author too?"
"Algorithmic feeds controlling what people see? Newspapers chose what to print too. Editors have always been gatekeepers."
"Deepfakes? Photoshop has existed for 30 years. Where was the panic then?"
"Self-driving cars scare you? Cruise control already lets cars drive themselves on the highway."
Why It's Unproductive
The surface similarity is usually real, which is what makes this feel like a strong point. But it works by flattening the differences that actually matter. A linter applies deterministic rules to code you wrote; an LLM generates novel code from a prompt. Photoshop required skill and hours of work; a deepfake takes seconds and a photo. Collapsing that gap lets someone appear to have addressed the concern while skipping the part that made it worth raising. It's tempting because pattern-matching to the familiar feels like wisdom, but it shortcuts the thinking instead of doing it.
The Better Move
If you see a real parallel to something familiar, use it to sharpen the discussion, not close it. Name what's similar and then ask what's different this time. The interesting part of any new thing is where the analogy breaks down, not where it holds.
Why It's Better
Takes the comparison seriously instead of weaponizing it. Pressing on the differences instead of just the similarities is how conversations get somewhere new.
Examples
OP: "Open source projects should require contributors to disclose when code is AI-generated."
Antipattern: "Should they also disclose when they used autocomplete? Or Stack Overflow? Where does it end?"
Better: "What's the actual risk you're worried about? Copyright liability, code quality, something else? That would help figure out whether disclosure even solves it."
OP: "I'm concerned about deepfakes being used to impersonate public figures."
Antipattern: "People have been doing impressions and Photoshopping images forever. This is nothing new."
Better: "The difference in scale and effort seems real though. A Photoshop job took skill and was usually obvious. What makes these harder to deal with?"
OP: "Algorithmic feeds are shaping public opinion in ways we don't fully understand."
Antipattern: "Newspapers chose what to put on the front page too. Editors have always been gatekeepers."
Better: "Editorial curation is a real precedent, but it was one editor picking for a whole audience. Algorithmic feeds personalize at scale, which seems like a different dynamic. Is anyone studying the difference?"