• Are You on the Right Side of History?

    Spend any reasonable amount of time on social media and you will be sure to notice a mathematical phenomenon: at least 90% of people believe that at least 50% of everyone else is wrong. While this is obviously incorrect, it isn’t all that surprising; after all, studies show that 80-90% of drivers rate themselves as “above average” – another statistical impossibility. Perhaps it’s only logical to assume that people will overrate their ability to create an accurate and comprehensive worldview, just as they assess their driving skills with a positive slant. In this case, however, I don’t think it’s that simple.

    Confidence Rooted in Ignorance

    The Dunning-Kruger effect is a cognitive bias in which people overestimate their skills and/or intellect because they aren’t knowledgeable or experienced enough to recognize what they don’t understand. We could apply that notion to this scenario as an explanation of why people overestimate the righteousness or accuracy of their views. It attempts to explain why so many people who are clearly emotionally led claim to be logical thinkers, or why people lack nuance in regard to complex (or even simple, for that matter) issues. A person who is more likely to be persuaded by emotion is not only inherently less logical, but often lacks the ability to assess whether they are actually being logical at all. Conversely, a highly logical person often overestimates their ability to understand the emotions of others and the importance emotion holds for them. But there’s more going on here than just overly optimistic self-perception.

    The reality is that very few people (perhaps 1% or less) actually create much of their own worldview at all – they instead model their views from what they hear or read from someone else, or more likely, they create a composite of select views taken from numerous others.

    The Fallacy of Logical Perception

    Nearly all of us believe our views come from analysis, despite the fact that the vast majority of people aren’t particularly analytical at all (the most analytical and logical personality type is said to represent around 3% of the population). In spite of this, most of us would insist that our conclusions are based on data. It is our conviction that we thoroughly analyze and interpret the information in the world around us, and use our findings as the foundation for our beliefs. In reality, the process is quite the opposite.

    Author and social psychologist Jonathan Haidt demonstrates in his book The Righteous Mind that our judgments are often formed from cognitive intuition and bias rather than conscious reasoning. We then seek out information, opinions, and sources that support and align with this bias, often ignoring or discrediting evidence to the contrary. It should also be noted that it is extremely rare for people to recant their stated views, even in the face of obvious and irrefutable logical evidence. For example, when asked how they determine what is morally wrong, many people confidently state that morality is determined by harm: when something causes direct harm to others, it is objectively wrong; when it does no harm, it is not. Subjects are then asked to state whether several hypothetical actions are “wrong” or not. Eventually, they encounter one (or more) that they judge to be wrong but for which they cannot identify anyone who is directly harmed (for example, necrophilia). When asked if they would like to change their answer – to conclude either that the act is not wrong, or that their metric for determining morality is flawed – they don’t. They maintain their presupposition, even when it directly contradicts their stated belief structure. The result is two incompatible positions held simultaneously, yet the subject chooses to hold onto both rather than acknowledge the discrepancy and discard the illogical one.

    Why We Hold on to Illogical Views

    In some sense, we are all emotional beings, and ego may prevent us from changing our views when presented with evidence that we are wrong. A more scientific hypothesis is that throughout history, acknowledging to ourselves and others that we were wrong may have been more detrimental to our position in society than simply continuing to be wrong. For example, a leader in a tribe has to exude confidence and reliability in order to maintain the following of his fellow tribesmen. Reversing a proclamation or moral belief might signal ineptitude or fallibility that could result in lower status or even persecution from the tribe. This dynamic still carries weight today, and we certainly retain remnants of this behavior. Unfortunately, it leads to less-than-ideal cognitive practices and greater opportunity for disingenuous thought and expression, even when we don’t realize we are doing it.

    How Do We Know if We Are on “The Right Side of History”?

    I see this statement online often; it was actually one of the things that motivated me to write this blog. It’s either a proclamation that the writer is on “the right side of history”, or that their opposition is on “the wrong side of history”. Ironically, it’s usually the person who feels emboldened to make this declaration that is most likely to be wrong. Perhaps it is the result of overcompensation driven by an inner self-doubt, or an appeal to self-authority brought on by a lack of evidence. Or, it may just be another example of the most ignorant people being the loudest. Regardless of the motivation, it typically comes from an emotional belief of being right, not a logical analysis of one’s position. This is often intensified by the false consensus effect, a cognitive bias that results in people assuming far more people share their views than actually do (online echo-chambers and algorithmic sorting further validate these feelings).

    So how do you know if you are on the right side of history? There aren’t many ways to be sure. Emotion is not a reliable guide to the truth. Validation through peer support or popularity is historically a terrible predictor – countless acts that we find repulsive today were committed because they were socially acceptable or popular at the time. The most reliable and effective source is history itself. If you aren’t willing to study history, you’re really just creating an imaginary reality that validates your emotions. While social conditions and technology are ever-changing, the broad strokes of history tend to repeat, and they played a large part in forming the world we live in today.

    • Read. The more, the better. Not the latest online article from your favorite political pundit, social media memes, or the biased offerings of today’s legacy media – read about history. Build a foundation of knowledge so you understand where societies went wrong, and how we got where we are today. Seek historical information that lacks an agenda or an aim to influence the reader. If your knowledge base comes from sources like “The Feminist Perspective” or “The Based Conservative” (I made those up, but they probably exist), you’re just reading someone else’s opinion because it aligns with the views you already have. Be better. Be part of the 1% who actually form their own worldview based on knowledge, not tribal politics and emotion.
    • Seek to understand those you disagree with. Nearly every societal disagreement has validity on both sides. If you can’t make a convincing argument against your own views, you likely don’t have a sufficient understanding of the situation, and haven’t made an honest attempt to explore where you might be wrong.

    Ultimately, no one can proactively claim to be on the “right side of history”. The very notion that such a binary exists is flawed. The closest we can come to earning that designation isn’t through confident proclamation or online echo chambers, but through relentless questioning of our own beliefs, an unbiased study of the past, and a comprehensive understanding of the positions held by those we disagree with.

    So ask yourself – not “Am I on the right side?”, but “Am I asking myself the difficult questions, or am I seeking to validate my position?” The answer to that one just might determine where history places you.

  • Free Will, Determinism, and the Logic of Accountability

    Do we have free will? For many, “free will” is the notion that we have the ability to make our own decisions, and the freedom to act upon them. This is often just an acknowledgment of will itself, not a claim that our will is completely free. Free from what – determinism, external coercion, a higher power? A deeper dive into that question asks where that will comes from in the first place, and if there are ever legitimate exceptions to the responsibility we attach to the principle of free will.

    Acquired Sociopathy

    In 2012, a man with no criminal record suffered a severe traumatic brain injury in a motor vehicle accident, with focal damage to the prefrontal cortex. Within a month, he committed an impulsive murder during a minor altercation. Neuropsychological tests showed significant deficits in moral reasoning and impulse control directly attributable to the accident. Does this mean he should be treated differently? Was the act of murder completely of his own free will, or were there circumstances beyond his control that contributed?

    In 2000, a 40-year-old Virginia school teacher with no prior history of sexual deviance suddenly developed intense pedophilic urges. He began collecting child pornography and inappropriately propositioning women – among them his own stepdaughter. He was convicted and sent to prison, where he complained of headaches. An MRI revealed a large orbitofrontal tumor. After surgical resection of the tumor, the pedophilic behavior vanished completely. Seven months later, the behavior returned, and tests revealed that the tumor had regrown. Following a second resection, the behavior disappeared again. This demonstrates, once again, that physical characteristics of the brain can dramatically alter mental function.

    Cases like these (and dozens like them) have established the existence of a syndrome known as acquired sociopathy (sometimes called frontal-lobe syndrome), in which damage to the prefrontal or orbitofrontal cortex dramatically lowers the threshold for impulsive violence or sexual deviance in previously normal individuals. This concept has been used in court to mitigate punitive measures, supporting the belief that defendants may not bear full moral responsibility for their transgressions, potentially caused by a pathological condition beyond their control.

    Disruptors in Pre-birth or Early Life

    Brain anomalies are not limited to tumors and physical accidents, though. Numerous congenital and early-life conditions are strongly associated with violence and anti-social behavior.

    Conditions present at birth or developed very early – such as malformations of or damage to the orbitofrontal/ventromedial prefrontal cortex, early white-matter disorders, amygdala dysfunction, and fetal alcohol spectrum disorder – all correlate with significantly higher predispositions to impulsive aggression.

    We label certain patterns as “conditions”, but they’re simply variations along a spectrum, visible only because they stand out from the statistical norm.

    A Product of Nature

    Genetics and the prenatal environment shape our neural architecture – and thus our capacity to process information, regulate emotions, and inhibit impulses – long before we have any awareness or agency. For example, several studies have demonstrated that IQ is largely heritable in adults. A person with a 70 IQ cannot “will” themselves to an IQ of 130 any more than a 5’5″ person can will themselves to 6’5″. They didn’t get a vote in their IQ potential at birth, either.

    The same principles apply to temperament. Tendencies toward aggression, neuroticism, food preferences, even what time we go to bed all show measurable genetic influence. While the list of genetic variations is nearly endless, the key point is simple: a substantial portion of who we are, how we think, and the paths we are likely to take is shaped by the biological equipment we received in utero – equipment over which we had zero control.

    Environmental Influence

    That said, genetics is not destiny. Much of who we are develops in response to the environment we experience throughout our lives. Studies of identical twins raised apart show that significantly different life experiences produce markedly different personalities, habits, and values, despite identical DNA. Language acquisition demonstrates how experiences at a young age can affect outcomes: a child immersed in a new language before puberty almost always develops an accent nearly indistinguishable from that of native speakers, whereas an adult learner rarely does, regardless of the effort they invest. Early trauma, neglect, and other experiences can have equally significant effects on brain function later in life.

    The Unchosen Foundations of Choice

    We like to think we freely choose our actions, but every choice is made by a brain we didn’t design, shaped by genes we didn’t pick and experiences we didn’t control. That’s why nearly every religious tradition and school of ancient wisdom warns against the perils of pride. If our abilities and inclinations ultimately come from gifts we never gave ourselves, taking credit begins to look like arrogance – a kind of moral plagiarism that demands applause instead of gratitude.

    The same logic cuts both ways. If we withhold pride from the lucky because they didn’t earn their good fortune, it’s only fair that we withhold contempt from the unlucky, because they didn’t choose their deficits either. That recognition doesn’t remove accountability, it simply asks us to replace self-righteousness and scorn with something closer to moral empathy.

    Back to the Car Accident – Should it Change Our Perspective?

    In cases where people commit violent acts after experiencing traumatic brain injuries, most of us have some level of consideration for their condition. We will perhaps view these situations differently from those in which the assailant didn’t undergo a dramatic re-wiring of the brain. The fundamental problem with this assessment is the assumption that an act of “re-wiring” is somehow more profound than wiring that was faulty in the first place. A man who acts violently because he has poor impulse control (or any other character defect) will always have an origin for his shortcomings, whether it be the biological hardware he was born with, the experiences he had in life, or (most likely) both. Do we see someone who was born without legs differently from someone who lost their legs in an accident? Would we expect the first man to walk any more than we would expect the second?

    Examples of situational influence are endless. Studies have shown that victims of childhood sexual abuse are up to 8 times more likely to perpetrate sexual abuse in adulthood, and children raised in violent or chaotic households are many times more likely to repeat those patterns themselves. People who grow up around violence, drug use, theft, and other criminal activity are 12-25 times more likely to become involved in criminal acts. Where one person may be lucky enough to have the intelligence or the guidance to avoid these pitfalls, others will not. There exists no negative or destructive expression in humans that cannot be attributed to circumstance or biology. There is a cause for everything.

    A Premise for Action

    Once we acknowledge that people are a product of biology and environment, it becomes difficult to hold anyone accountable for anything. Over-expressed empathy can lead to excusing all poor behavior, presenting a clear problem for society.

    A solution is to start by understanding that these influences are real, powerful, and deeply unfair. Some of us just got a bad ticket in the lottery of life. That reality deserves compassion, but it must never become an excuse. The only question that matters is this: Is tolerating this behavior good for society? If the answer is no (and it almost always is), then we cannot permit it, no matter how tragic the offender’s backstory.

    While this can feel emotionally cold on the surface, it’s actually the most humane path available, and here’s why: Failing to punish misconduct does not merely allow it to continue; it actively breeds more of it. Every unpunished act exposes new people to the very conditions – trauma, normalization, eroded boundaries – that make them likely to repeat the same harm. Each failure to sanction the original offender quietly manufactures both a new victim and, statistically, a future offender. This is known as a positive feedback loop, in which a system’s output amplifies the input, reinforcing and accelerating the cycle. The only way to stop the cycle is to stop the output.

    As the saying goes, life isn’t fair. It’s often, in fact, tragic. Despite our emotional disgust at some people’s behavior, it’s logically appropriate to find some sympathy for even the worst offenders. We may wish things had unfolded differently, but the past is fixed. Think of the familiar trope in zombie films: the hero watches a friend or loved one become infected, then is forced to grapple with the impossible choice of putting them down or risking becoming a victim themselves. In reality, our choices aren’t so stark. We’re not asked to slay zombies to save the world. We do, however, have a responsibility to preserve and protect what’s good in our world – to stop the infection of destructive cycles, and to respond to harmful behavior in consistent, principled ways.