
The Straw Man and Other Intellectual Sins

July 18, 2025

"You can't reason someone out of a position they didn't reason themselves into."

This old adage has never felt more relevant than it does today. We're living through an epidemic of logical fallacies - from political discourse that resembles a particularly nasty episode of EastEnders to social media debates that would make medieval theologians weep with despair.

Last week, I watched a school governing body meeting descend into chaos over a proposal to introduce mindfulness sessions. What started as a reasonable discussion about student wellbeing quickly evolved into accusations that the school was "promoting Eastern mysticism" and "abandoning academic rigour." The headteacher's measured explanation about stress reduction techniques was transformed into a caricature of New Age nonsense that bore no resemblance to the original proposal.

Classic straw man fallacy in action. And it's everywhere.

We've become a society that mistakes volume for validity, where winning an argument matters more than understanding the truth. Social media algorithms reward the most provocative takes, not the most thoughtful ones. Political discourse has degenerated into a competition to see who can most effectively misrepresent their opponent's position. Even in education - supposedly the bastion of clear thinking - we see fallacious reasoning creeping into policy debates, staff meetings, and classroom discussions.

This isn't just an academic concern. Logical fallacies aren't quirky philosophical curiosities confined to dusty logic textbooks. They're the weapons of mass intellectual destruction that are poisoning our ability to think clearly, communicate effectively, and make sound decisions. In a world facing genuine challenges - from climate change to artificial intelligence to social inequality - we simply can't afford to think this poorly.

AI Generated Image. Midjourney Prompt: mass intellectual destruction ar16:9

The stakes couldn't be higher. When leaders make decisions based on fallacious reasoning, people suffer. When voters choose candidates based on flawed logic, democracy weakens. When teachers accept educational policies built on logical errors, students lose out. And when we allow this kind of thinking to become normalised in our daily interactions, we're not just dumbing down discourse - we're eroding the very foundations of rational society.

The Psychology of Intellectual Shortcuts

Before we dive into specific fallacies, we need to understand why intelligent people - and we're all guilty of this - fall for logical errors so readily. The answer lies in how our minds actually work, as opposed to how we think they should work.

Human reasoning isn't a computer processing information dispassionately. It's a biological system evolved for survival, not truth-seeking. Psychologist Daniel Kahneman's work on cognitive biases - which we've mentioned before - reveals that we have two systems of thinking: System 1 (fast, automatic, intuitive) and System 2 (slow, deliberate, analytical). Most of our daily thinking happens in System 1, which is brilliant for avoiding immediate danger but terrible at logical reasoning.

“Narrative fallacies arise inevitably from our continuous attempt to make sense of the world. The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen. Any recent salient event is a candidate to become the kernel of a causal narrative.” Daniel Kahneman

This explains why logical fallacies feel so natural. They're cognitive shortcuts that helped our ancestors survive in small tribal groups but become liabilities in complex modern societies. When someone criticises a policy we support, our System 1 brain doesn't carefully analyse their argument - it looks for ways to dismiss them quickly so we can get back to more pressing concerns (like not being eaten by lions, metaphorically speaking).

AI Generated Image. Midjourney Prompt: system 1 and 2 thinking daniel kahneman ar16:9

The confirmation bias - our tendency to seek information that confirms our existing beliefs while ignoring contradictory evidence - works hand-in-hand with logical fallacies. We don't just accidentally reason poorly; we actively seek out poor reasoning that supports our preferred conclusions. It's why Brexit debates became so toxic, why COVID-19 discussions turned into tribal warfare, and why educational policy discussions often generate more heat than light.

Social psychologist Leon Festinger's theory of cognitive dissonance provides another piece of the puzzle. When we encounter information that contradicts our beliefs, we experience psychological discomfort. The easiest way to resolve this discomfort isn't to change our beliefs - that's hard work - but to attack the source of the contradictory information. Enter the ad hominem fallacy: "You can't trust that research on grammar schools because it comes from a left-wing think tank."

This isn't a character flaw; it's a feature of human psychology. We're not broken computers that need fixing - we're evolved beings whose reasoning systems are optimised for different challenges than those we face today. Understanding this is crucial because it means we can't simply will ourselves to think more clearly. We need systematic approaches to recognise and counter our natural tendency toward fallacious reasoning.

The Straw Man: Democracy's Silent Killer

Of all the logical fallacies plaguing modern discourse, the straw man might be the most insidious. Named after the practice of military training against straw dummies rather than real opponents, this fallacy involves misrepresenting someone's argument to make it easier to attack. Instead of engaging with what your opponent actually said, you construct a weaker version of their position - a "straw man" - and knock that down instead.

The straw man fallacy is particularly dangerous because it often goes unnoticed. Unlike more obvious errors in reasoning, straw man arguments can sound reasonable to casual observers. They're the intellectual equivalent of sleight of hand - by the time you realise what's happened, the damage is done.

AI Generated Image. Midjourney Prompt: sleight of hand ar16:9

An educational example is pertinent. When researchers suggested that excessive homework might be counterproductive for primary school children, critics responded by attacking the “anti-homework lobby” for wanting to "eliminate all academic challenge and turn schools into glorified playgrounds." The original argument - that too much homework might not be effective - was transformed into a caricature about abandoning academic standards entirely.

This transformation serves multiple purposes. It makes the original argument easier to defeat (who wants to turn schools into playgrounds?), it appeals to people's fears about educational standards, and it avoids engaging with the actual research on homework effectiveness. The straw man has done its job: derailing a potentially productive discussion about evidence-based practice.

The philosopher Arthur Schopenhauer, in his work The Art of Being Right (my wife’s manifesto!), said something similar:

“The tricks, dodges, and chicanery, to which they [men] resort in order to be right in the end, are so numerous and manifold and yet recur so regularly that some years ago I made them the subject of my own reflection and directed my attention to their purely formal element after I had perceived that, however varied the subjects of discussion and the persons taking part therein, the same identical tricks and dodges always come back and were very easy to recognise. This led me at the time to the idea of clearly separating the merely formal part of these tricks and dodges from the material and of displaying it, so to speak, as a neat anatomical specimen.” Arthur Schopenhauer

He was writing satirically about how to win arguments dishonestly, but his techniques have become depressingly common in modern discourse.

The straw man fallacy is particularly prevalent in political and educational contexts because these areas involve complex, nuanced positions that can be easily caricatured. When someone argues for more funding for special educational needs, opponents might respond by attacking the "special needs lobby" for wanting to "waste money on politically correct nonsense." When teachers request smaller class sizes, critics attack the teacher unions for wanting to "avoid accountability." In each case, the original argument is distorted beyond recognition.

What makes straw man arguments so effective is that they often contain a grain of truth. Most educational research does have ideological implications. Teacher unions do sometimes oppose accountability measures. But by inflating these grains of truth into massive distortions, straw man arguments prevent us from engaging with the actual substance of important debates.

AI Generated Image. Midjourney Prompt: grain of truth ar16:9

The damage extends beyond individual arguments. When straw man reasoning becomes normalised, it erodes trust in public discourse. People stop expecting good-faith engagement with their ideas and start preparing defensive responses to anticipated misrepresentations. This creates a vicious cycle where everyone is arguing against positions nobody actually holds, while the real issues remain unaddressed.

The Greatest Hits: Fallacies That Rule Our World

While the straw man deserves special attention, it's far from the only logical fallacy shaping our discourse. Understanding the most common fallacies isn't just an academic exercise - it's a practical survival skill for navigating modern life.

The Ad Hominem Attack might be the most recognisable fallacy. Instead of addressing someone's argument, you attack their character, credentials, or circumstances. "You can't trust that report on social media and mental health - it's written by someone who doesn't even have teenagers." This fallacy is particularly common in educational debates, where personal attacks on researchers, teachers, or policymakers substitute for engaging with evidence.

The philosopher John Stuart Mill warned against this in On Liberty, arguing that suppressing or dismissing ideas based on their source rather than their merit impoverishes our understanding. Mill understood that truth can come from unexpected sources, and that ad hominem attacks often reveal more about the attacker's intellectual insecurities than about the original argument's validity.

“If all mankind minus one were of one opinion, mankind would be no more justified in silencing that one person than he, if he had the power, would be justified in silencing mankind.” John Stuart Mill

The False Dilemma presents complex issues as simple either/or choices. "Either we maintain rigorous academic standards or we abandon excellence entirely." This fallacy is endemic in educational policy debates, where nuanced positions are reduced to binary choices that don't reflect reality. The false dilemma is particularly dangerous because it shuts down creative thinking and prevents us from finding innovative solutions to complex problems.

One example is the phonics versus whole language reading debate, or knowledge versus skills in the curriculum. These false binaries force educators into opposing camps when research consistently shows that effective approaches integrate both elements. The most successful schools don't choose between high expectations and student wellbeing - they recognise these as mutually reinforcing rather than competing priorities.

The appeal lies in simplicity - binary thinking requires less cognitive effort than holding multiple variables in tension. But this intellectual laziness comes at a cost. When we frame educational challenges as either/or propositions, we eliminate the nuanced middle ground where most effective solutions actually reside.

The key to resisting false dilemmas is learning to ask better questions. Instead of asking "Should we focus on knowledge or skills?" ask "How can we design learning experiences that develop both knowledge and skills in mutually reinforcing ways?"

AI Generated Image. Midjourney Prompt: the false dilemma ar16:9

The Appeal to Authority involves citing someone's status or credentials as evidence for their argument, rather than evaluating the argument itself. "Professor X from Cambridge says this, so it must be true." While expert opinion certainly matters, this fallacy occurs when we stop thinking critically about claims simply because they come from authoritative sources. In education, this might manifest as unquestioning acceptance of research findings because they come from prestigious institutions, without considering methodology or potential biases.

This fallacy is particularly seductive in education because we're conditioned to respect expertise - and rightly so. Teachers are experts in their classrooms, researchers understand methodology, and experienced leaders have valuable insights. The problem arises when deference to authority replaces critical evaluation of ideas.

Consider how often educational conferences feature keynote speakers whose primary qualification seems to be their impressive bio rather than the quality of their insights. (And the irony of this in my own life is not lost on me!) "Former Associate Assistant Principal turned strategy consultant" or "Professor of Education at Elite University" become shorthand for credibility, regardless of whether their actual proposals withstand scrutiny.

The appeal to authority also manifests in how we consume educational research. A study from Stanford or Oxford carries immediate weight, but this prestige can blind us to fundamental flaws: small sample sizes, correlation mistaken for causation (and this is huge!), or findings that don't transfer across contexts. The institutional halo effect prevents us from asking basic questions about methodology or relevance.

Social media amplifies this tendency. Educational Twitter (X) is full of posts that gain traction primarily because they quote respected figures, regardless of whether the claims are well-supported. "As Dylan Wiliam says..." or "Research from Harvard shows..." become conversation-stoppers rather than conversation-starters.

The psychological appeal is obvious - thinking is hard work, and deferring to experts feels both efficient and safe. If someone with impressive credentials supports a position, we can adopt it without the cognitive effort of independent evaluation. It's intellectual outsourcing that feels responsible whilst actually being intellectually lazy.

AI Generated Image. Midjourney Prompt: thinking is hard work ar16:9

But expertise isn't transferable across all domains. A brilliant mathematician might have terrible insights about classroom management. A successful headteacher in one context might be completely wrong about policy for different schools. Credentials in one area don't guarantee wisdom in all areas.

The solution isn't to reject expertise - that way lies the anti-intellectual populism that dismisses all expert knowledge as elite conspiracy. Instead, we need what philosopher Karl Popper called "critical rationalism" - respect for expertise combined with systematic questioning of claims, regardless of their source.

“No rational argument will have a rational effect on a man who does not want to adopt a rational attitude.” Karl Popper

This means asking different questions: 

  • What evidence supports this claim? 
  • How was this evidence gathered? 
  • Does it apply to my context? 
  • What alternative explanations exist?
  • Who might benefit from this being true? 

These questions aren't disrespectful to experts - they're essential for intelligent engagement with expertise.

Effective leaders model this approach. They listen respectfully to expert advice whilst maintaining their critical faculties. They understand that the mark of intellectual maturity isn't unquestioning deference to authority, but the ability to evaluate ideas on their merits whilst appropriately weighting the credibility of their sources.

The Slippery Slope suggests that one action will inevitably lead to a chain of negative consequences. "If we allow remote working, productivity will collapse and office culture will disappear entirely." This fallacy is particularly common in discussions about workplace flexibility, social policy, and technological adoption, where reasonable changes are rejected based on exaggerated fears about their ultimate consequences.

The slippery slope fallacy thrives in environments of uncertainty and change. When faced with proposals that challenge existing systems, our minds naturally leap to worst-case scenarios. "If we introduce a four-day working week, soon everyone will expect three days, then two, then no one will work at all." The logical chain feels compelling, but it's built on unstable foundations.

"If we allow any restrictions on free speech, we'll end up in totalitarian censorship." "If we provide any social benefits, we'll create a dependency culture where no one works." These arguments feel persuasive because they tap into genuine concerns whilst avoiding the difficult work of evaluating specific proposals on their merits.

The business world is particularly susceptible to slippery slope thinking. Companies resist adopting new technologies because "if we automate this process, we'll lose all human oversight." They reject flexible policies because "if we allow casual dress, professional standards will collapse entirely." The underlying assumption is that any deviation from current practice will inevitably spiral out of control.

Social media platforms amplify these tendencies by rewarding dramatic predictions over measured analysis. "This policy change is the beginning of the end" generates more engagement than "This policy change has both benefits and risks that need careful consideration." The algorithm doesn't distinguish between productive and destructive discourse - it simply amplifies whatever keeps people scrolling.

The psychological appeal lies in its simplicity. Complex changes have multiple possible outcomes, but the slippery slope offers a clear narrative: change leads inevitably to disaster. This certainty feels reassuring compared to the genuine uncertainty that accompanies most significant decisions.

But logical slopes aren't physical ones - they have guardrails called human agency, institutional safeguards, and democratic oversight. Most proposed changes can be tried, evaluated, and adjusted without triggering unstoppable cascades toward extremes.

The antidote requires distinguishing between possible and inevitable consequences. Instead of asking "Where might this lead?" we should ask "What evidence suggests that this progression is likely?" and "What mechanisms exist to prevent negative outcomes?" Real analysis examines specific proposals rather than constructing elaborate chains of hypothetical disasters.

Reasonable people can disagree about policies without resorting to catastrophic thinking (but do re-read the Popper quote above…). The slippery slope fallacy short-circuits this necessary democratic deliberation by making any change seem unreasonably risky.

AI Generated Image. Midjourney Prompt: bandwagon ar16:9

The Bandwagon Fallacy assumes that something is true or valuable because many people believe it. "Everyone's investing in crypto, so it must be a good idea." While popular opinion isn't irrelevant, this fallacy occurs when we substitute social proof for logical reasoning. In business, this might manifest as adopting strategies simply because they're trending, without considering whether they're appropriate for specific contexts.

The bandwagon effect is deeply rooted in our evolutionary psychology. For most of human history, following the crowd was often the safest strategy - if everyone was running from something, it was probably wise to run too. This instinct served us well when facing sabre-toothed tigers*, but it's less helpful when navigating complex modern decisions that require independent analysis. 

(*By the way, I found out thanks to my wonderful editor, Sarah, that sabre-toothed tigers didn't actually exist - the animals are properly known as sabre-toothed cats. It was (apparently) easier to identify them as tigers because they looked like one and may have posed a similar danger. She thought it was an unusual quirk we should be aware of, as it fits rather nicely into the bandwagon fallacy itself!)

We see this in how investment bubbles form (and not just from crypto-bros). During the dot-com boom, investors poured money into companies with no viable business models simply because "everyone else was doing it." The 2008 housing crisis followed similar patterns - people bought properties they couldn't afford because rising prices convinced them that "property always goes up." The social proof felt more compelling than financial fundamentals.

Corporate culture is again susceptible to bandwagon thinking. Companies rush to adopt open-plan offices because "everyone's doing it," despite mounting evidence about their negative effects on productivity and wellbeing. They implement agile methodologies or adopt cloud-first strategies because these approaches have achieved buzzword status, not because they've carefully evaluated their suitability.

When we see our networks embracing particular ideas, products, or movements, the pressure to conform intensifies. The visible enthusiasm of others creates an illusion of consensus that may not reflect broader reality. The algorithm shows us more of what our connections are sharing, creating echo chambers where certain ideas appear more popular than they actually are.

The business world's obsession with "best practices" often masks bandwagon thinking. "Industry leaders are all doing X, so we should too" becomes a substitute for rigorous analysis of whether X actually delivers results in specific contexts. What works for a Silicon Valley startup may be disastrous for a traditional manufacturing company, but the bandwagon effect obscures these crucial differences.

The psychological comfort of following the crowd is undeniable. Again, independent thinking requires effort, research, and the willingness to be wrong. It's cognitively easier to assume that the collective wisdom of others has already done this work for us. Going against popular opinion also carries social risks - we might be seen as contrarian, difficult, or out of touch.

But popularity and validity aren't synonymous. History is littered with widely held beliefs that proved spectacularly wrong: the sun orbiting the earth, smoking being healthy, or bloodletting being an effective medical treatment. Majority opinion can be influenced by manipulation, misinformation, or simple misunderstanding.

The antidote requires developing what Kahneman called "slow thinking" - the deliberate, analytical approach that resists quick judgements based entirely on social proof. This means asking critical questions:

  • Why is this popular? 
  • What evidence supports its effectiveness? 
  • How does my situation differ from those where this approach has succeeded?

Effective decision-making weighs popular opinion as one factor among many, not as the determining factor. Smart leaders listen to trends whilst maintaining their analytical independence, understanding that the wisdom of crowds has limits and that sometimes the crowd is simply wrong.

The Weaponisation of Logical Fallacies

What's particularly troubling about the current state of discourse is how logical fallacies have moved from accidental errors to deliberate weapons. Social media platforms, political strategists, and even some educational leaders have discovered that fallacious reasoning can be more effective than sound logic for achieving their goals.

The attention economy rewards provocative content over thoughtful analysis. A well-crafted straw man argument will generate more engagement than a measured response to someone's actual position. An ad hominem attack will go viral while a substantive critique languishes in obscurity. The platforms that dominate our information landscape are systematically biased toward the kind of thinking that undermines rational discourse.

Professional communicators have become skilled at constructing straw man arguments that resonate with their base while appearing reasonable to casual observers. They know that most people won't fact-check their characterisations of opponents' positions, so they can get away with increasingly dramatic distortions. And because we live in bubbles or echo chambers, this problem gets worse and worse.

The philosopher Hannah Arendt warned about this in The Origins of Totalitarianism, noting that the ideal subject of totalitarian rule is not the convinced Nazi or Communist, but people for whom "the distinction between fact and fiction, true and false, no longer exists." When logical fallacies become normalised, we lose our ability to distinguish between reasonable and unreasonable arguments, making us vulnerable to manipulation.

Policy debates often feature carefully constructed straw man arguments designed to generate outrage rather than understanding. Research findings are misrepresented through selective quotation and false dilemmas. Teachers and school leaders find themselves defending against positions they never held, while the real issues in education remain unaddressed.

The consequences extend beyond individual debates. When students observe adults engaging in consistently fallacious reasoning, they learn that this is how intelligent people argue. They internalise the idea that winning matters more than truth, that rhetorical skill is more valuable than logical rigour. We're not just failing to teach critical thinking - we're actively modelling its opposite.

AI Generated Image. Midjourney Prompt: arguing without intelligence ar16:9

The Neuroscience of Fallacious Thinking

Recent advances in neuroscience are providing fascinating insights into why logical fallacies feel so natural and why they're so difficult to resist. When we encounter information that challenges our beliefs, areas of the brain associated with physical pain literally light up. Changing our minds isn't just intellectually challenging - it's physically uncomfortable.

Research by neuroscientist Drew Westen found that when people encountered information that contradicted their political beliefs, the areas of their brains associated with reasoning showed little activity. Instead, circuits involved in emotion and conflict resolution became active. Essentially, our brains treat challenges to our beliefs as threats to be repelled rather than problems to be solved.

"But the political brain also did something we didn’t predict. Once partisans had found a way to reason to false conclusions, not only did neural circuits involved in negative emotions turn off, but circuits involved in positive emotions turned on. The partisan brain didn’t seem satisfied in just feeling better. It worked overtime to feel good, activating reward circuits that give partisans a jolt of positive reinforcement for their biased reasoning." Drew Westen

This explains why logical fallacies often feel satisfying rather than frustrating. When someone constructs a straw man argument that makes our opponent look foolish, it provides genuine psychological relief. The cognitive dissonance created by their challenging argument is resolved not through careful reasoning but through emotional dismissal. Our brains reward us for this resolution, even when it's based on fallacious thinking.

The implications for education are profound. Traditional approaches to teaching logic and critical thinking often assume that students just need to learn the rules of good reasoning. But if our brains are wired to resist logical challenges to our beliefs, we need different approaches. We need to teach students to recognise their emotional responses to challenging ideas and to develop strategies for engaging with discomfort rather than avoiding it.

Fallacies in the Staffroom: Educational Leadership Under Attack

To drill deeper, I want to explore how the straw man fallacy plays out in typical school contexts. When a headteacher proposes implementing restorative justice practices, opponents might respond by attacking the “soft-on-discipline lefties” for wanting to "let students get away with anything." The original proposal - an evidence-based approach to behaviour management - is transformed into a caricature about abandoning all standards.

Similarly, when teachers request professional development time, critics might attack the teacher training industry for wanting to "waste money on pointless courses while students suffer." The reasonable request for ongoing learning is reframed as an attack on student welfare. These straw man arguments make productive policy discussions nearly impossible.

The false dilemma appears frequently in educational debates. "Either we focus on academic achievement or we care about student wellbeing." This artificial choice prevents schools from developing integrated approaches that address both concerns. It forces leaders to choose sides in debates where cooperation would be more productive.

Educational leaders also face ad hominem attacks when they advocate for challenging changes. "You can't trust the headteacher's views on assessment - they've never taught in a state school." Even when the criticism contains some truth, it deflects attention from the actual merits of the proposal. The focus shifts from "Is this idea sound?" to "Is this person qualified to suggest it?"

The appeal to authority manifests in educational contexts when research findings are accepted or rejected based on their source rather than their quality. "That's just government-sponsored research" or "It comes from a prestigious university, so it must be right." Both responses avoid the hard work of evaluating methodology, considering alternative explanations, and thinking critically about implications.

Perhaps most damaging is how logical fallacies undermine professional relationships. When colleagues consistently misrepresent each other's positions, trust erodes. When debates are decided by rhetorical skill rather than evidence, expertise becomes devalued. When ad hominem attacks become normalised, people stop sharing innovative ideas for fear of personal criticism.

The result is educational environments where important conversations don't happen, where problems persist because addressing them requires the kind of honest discussion that fallacious reasoning makes impossible. Schools need cultures of intellectual honesty to adapt and improve, but logical fallacies create cultures of defensive positioning instead.

AI Generated Image. Midjourney Prompt: logical immunity ar16:9

The Antidote: Developing Logical Immunity

Understanding logical fallacies isn't enough - we need practical strategies for recognising and countering them in real-time. This requires developing what we might call “logical immunity” - the ability to maintain clear thinking even when surrounded by fallacious reasoning.

The first step is learning to recognise the emotional signals that indicate fallacious thinking. When you feel your blood pressure rising during a debate, when someone's argument makes you want to attack their character rather than address their points, when you find yourself constructing increasingly extreme versions of their position - these are warning signs that logical fallacies are at work.

The ancient Stoic philosophers understood this. Marcus Aurelius wrote in his Meditations (one of my favourite books of all time): 

"How much trouble he avoids who does not look to see what his neighbour says or does, but only to what he does himself."

This isn't about ignoring others' arguments, but about maintaining self-awareness during intellectual conflict.

The second step is developing a growth mindset toward intellectual challenges. Instead of seeing contrary arguments as attacks to be repelled, we can learn to view them as opportunities to refine our thinking. This doesn't mean accepting every argument uncritically, but it does mean engaging with the strongest version of opposing views rather than seeking out weaknesses to exploit.

The principle of charity in philosophical discourse provides a useful framework. This involves interpreting others' arguments in their strongest possible form, assuming good faith, and addressing their best points rather than their worst. It's the opposite of straw man reasoning - instead of weakening opponents' arguments, we strengthen them to ensure we're engaging with something worth defeating.

The third step is learning to ask better questions. Instead of "How can I prove this person wrong?" we should ask "What would it take to change my mind about this issue?" Instead of "How can I dismiss this evidence?" we should ask "What would this evidence mean if it were true?" These questions shift us from defensive to exploratory thinking.

The fourth step is developing comfort with intellectual uncertainty (I touched on this in last week’s article about certainty and clarity too!). Many logical fallacies arise from our discomfort with not knowing. The false dilemma makes complex problems seem simple. The appeal to authority provides certainty without thinking. The bandwagon fallacy offers the comfort of collective agreement. Learning to say "I don't know" and "I need to think about this more" is crucial for maintaining intellectual honesty.

Teaching Logical Thinking in an Illogical World

For many of us, the challenge isn't just avoiding logical fallacies ourselves - it's helping others develop resistance to fallacious reasoning in a world that rewards it. This requires moving beyond traditional approaches to teaching logic and critical thinking, and it probably starts in education as well as at home with parents.

Traditional logic instruction often focuses on formal rules and abstract examples. Students learn to identify syllogisms and spot obvious fallacies in contrived scenarios. But this approach fails to prepare them for the sophisticated fallacies they'll encounter in real life. We need to teach logic in context, using examples from the kinds of debates students will actually face.

AI Generated Image. Midjourney Prompt: huge chalkboard with logical problem on it and student working it out ar16:9

This means bringing current controversies into the classroom and home - not to advocate for particular positions, but to practise logical analysis. When students can identify straw man arguments in political debates, ad hominem attacks in social media discussions, and false dilemmas in policy proposals, they develop practical skills for navigating modern discourse.

The psychologist Jerome Bruner's work on narrative thinking provides another approach. Bruner argued that humans naturally think in stories rather than logical propositions. We can use this tendency by teaching students to recognise the narrative structures underlying logical fallacies. The straw man fallacy tells a story about opponents as extremists. The slippery slope presents a cautionary tale about inevitable consequences. By understanding these narratives, students can better resist their emotional appeal.

We also need to teach students to recognise their own susceptibility to fallacious reasoning. This isn't about making them paranoid, but about developing intellectual humility. When students understand that everyone - including themselves - is prone to logical errors, they're more likely to engage in the kind of careful thinking that prevents those errors.

The ancient Greek concept of dialectic - the art of logical discussion - provides a useful model. In dialectical thinking, the goal isn't to win debates but to discover truth through careful questioning and analysis. Students need to learn that changing your mind in response to good evidence isn't a sign of weakness but of intellectual maturity.

Perhaps most importantly, we need to model logical thinking ourselves. When teachers and parents engage in straw man arguments, use ad hominem attacks, or rely on false dilemmas, students learn that these are acceptable ways of reasoning. But when teachers demonstrate intellectual honesty, charitable interpretation, and comfort with uncertainty, students learn that these are valued traits.

AI Generated Image. Midjourney Prompt: comfort with uncertainty ar16:9

Fallacies in the Age of Algorithms

The digital revolution has created new challenges for logical thinking. Social media platforms, search engines, and recommendation algorithms aren't neutral tools - they're designed to maximise engagement, often at the expense of rational discourse.

The ‘engagement economy’ rewards content that provokes strong emotional responses. A measured analysis of educational policy might receive a dozen likes, while a provocative straw man attack on the same policy goes viral (check out a great post from Dr Rona McKenzie about a recent MIT study - it’s brilliant!). The algorithms that determine what we see are systematically biased toward the kind of content that undermines logical thinking.

This creates what researchers call “false polarisation”. When algorithms show us increasingly extreme versions of opposing viewpoints, we lose touch with the reasonable middle ground - or third spaces - where most people actually reside. We start to believe that our opponents hold more extreme positions than they actually do, which makes fallacious caricatures of their views seem more reasonable than they are.

The problem is compounded by the speed of digital communication. Traditional media allowed time for reflection and fact-checking. But social media rewards immediate responses, favouring quick emotional reactions over careful analysis. The result is an environment where logical fallacies spread faster than reasoned responses can catch up.

For educators, this means we need to teach students not just to recognise logical fallacies, but to understand how digital platforms amplify them. Students need to learn about filter bubbles, echo chambers, and algorithmic bias. They need to understand how the structure of digital communication incentivises dodgy reasoning.

This doesn't mean abandoning digital tools, but it does mean using them more thoughtfully. We can teach students to pause before sharing, to fact-check claims before believing them, and to seek out diverse perspectives rather than relying on algorithmic curation. Just because they heard it on TikTok doesn’t mean it’s gospel!

The challenge is significant, but not insurmountable. Digital tools can also be used to enhance logical thinking. Online fact-checking resources, collaborative reasoning platforms, and AI-assisted analysis can help students develop stronger critical thinking skills. The key is understanding how these tools work and using them intentionally rather than passively.

The consequences of widespread logical fallacies extend far beyond academic debates or online arguments. In a democracy, citizens need to be able to evaluate political claims, assess policy proposals, and make informed decisions about complex issues. When logical fallacies dominate public discourse, democratic decision-making becomes compromised.

The COVID-19 pandemic provided a stark illustration of these stakes, and I don’t think we are done with that yet. Public health measures were often debated using straw man arguments (characterising mask mandates as "government tyranny"), false dilemmas (presenting choices as "economy versus health"), and ad hominem attacks (dismissing epidemiologists as "so-called experts"). These fallacies weren't just intellectually frustrating - they had real consequences for public health and economic recovery. Not only that, but genuine logical responses and treatments were dismissed on fallacious grounds (love him or loathe him, Joe Rogan was attacked with a whole heap of lies around his use of Ivermectin, which was labelled “horse dewormer”!). And worse still, authentic discourse about the cause of the pandemic was tagged as mis- or malinformation - and, to be fair, the lab leak theory that was shut down has since been accepted as a mainstream hypothesis!

Climate change presents another area where logical fallacies have serious implications. When complex scientific evidence is dismissed through ad hominem attacks on researchers, when policy proposals are misrepresented through straw man arguments, when the issue is reduced to false dilemmas between "economy and environment," we lose the ability to address one of the most serious challenges facing humanity.

The rise of artificial intelligence adds another dimension to these stakes. As AI systems become more sophisticated, humans' comparative advantage will lie in areas that require creativity, empathy, and logical reasoning. But if we can't think clearly about complex issues, if we can't distinguish between sound and unsound arguments, if we can't engage productively with challenging ideas, then we're failing to develop the very skills that will remain uniquely human.

This isn't about becoming emotionless reasoning machines. Human reasoning will always involve emotion, values, and subjective judgment. But it's about developing the capacity to recognise when our emotional responses are leading us toward fallacious thinking, and having the tools to course-correct when necessary.

AI Generated Image. Midjourney Prompt: human course correcting with a compass ar16:9

Key Takeaways

Developing resistance to logical fallacies isn't a one-time achievement but an ongoing practice. Here are seven practical strategies for building logical resilience in yourself and others:

1. Learn to Recognise Your Emotional Triggers Rising blood pressure during a debate, the urge to attack someone's character rather than their points, the temptation to construct ever more extreme versions of their position - these are warning signs that logical fallacies are at work. Develop the habit of pausing when you notice these signals.

2. Practise the Principle of Charity Always interpret others' arguments in their strongest possible form. Assume good faith and address their best points rather than their worst. If you can't state your opponent's position in a way they would recognise and accept, you're probably not ready to critique it effectively.

3. Ask Better Questions Instead of "How can I prove this person wrong?" ask "What would it take to change my mind about this issue?" Instead of "How can I dismiss this evidence?" ask "What would this evidence mean if it were true?" These questions shift you from defensive to exploratory thinking. (If you need help with this, check out my Questions Cards!)

4. Develop Comfort with Intellectual Uncertainty Learn to say "I don't know" and "I need to think about this more." Many logical fallacies arise from our discomfort with uncertainty. The false dilemma makes complex problems seem simple. The appeal to authority provides certainty without thinking. Embracing uncertainty is crucial for maintaining intellectual honesty.

5. Create Diverse Information Diets Actively seek out perspectives that challenge your existing beliefs. Read sources you disagree with, engage with people who think differently, and expose yourself to arguments that make you uncomfortable. This isn't about accepting everything uncritically, but about ensuring you understand what you're disagreeing with. In other words, get out of your echo chamber!

6. Teach Others to Recognise Fallacies Help colleagues, students, and family members develop logical immunity. When you spot a straw man argument, gently point out the misrepresentation. When you see an ad hominem attack, redirect attention to the actual argument. Teaching others reinforces your own logical discipline.

7. Model Intellectual Honesty Admit when you're wrong, acknowledge the limitations of your knowledge, and show that changing your mind in response to good evidence is a sign of intellectual maturity, not weakness. Your example influences others more than your arguments.

The battle against logical fallacies isn't just about winning debates or scoring intellectual points. It's about preserving our capacity for the kind of rational discourse that democracy requires, that scientific progress depends on, and that human flourishing demands. In a world where fallacious reasoning is increasingly weaponised, the ability to think clearly isn't just intellectually virtuous - it's practically essential.

As the philosopher Bertrand Russell warned: 

"The fundamental cause of the trouble is that in the modern world the stupid are cocksure while the intelligent are full of doubt."

Our task isn't to eliminate doubt - doubt is healthy and necessary. Our task is to eliminate the kind of cocksure stupidity that logical fallacies enable, replacing it with the kind of thoughtful uncertainty that leads to genuine understanding.

The straw man may be easier to defeat than the real argument, but defeating straw men won't solve real problems. Only by engaging honestly with the strongest versions of challenging ideas can we hope to build the kind of world our students deserve to inherit. And that's a fight worth having - with logic, evidence, and intellectual integrity as our weapons of choice.
