Brandolini's Law: The Exhausting Asymmetry of Truth and the Violence of Lies
On Brandolini's Famous Tweet from 2013
By Alexander Mills
The Tweet That Named Our Frustration
Late one January night in 2013, Italian programmer Alberto Brandolini sat down at his computer, freshly frustrated from watching yet another political talk show. He'd just finished reading Daniel Kahneman's masterwork Thinking, Fast and Slow, and the collision between the book's insights and what he'd seen on television sparked something. Former Prime Minister Silvio Berlusconi and journalist Marco Travaglio had been engaged in one of their typical heated exchanges—claims flying fast and loose, facts secondary to rhetoric.
Brandolini opened Twitter and typed a single sentence that would resonate across the internet for years to come:
"The bullshit asimmetry: the amount of energy needed to refute bullshit is an order of magnitude bigger than to produce it."
That single tweet, posted at 2:29 AM on January 11, 2013, gave a name to something millions of people had experienced but couldn't quite articulate. It wasn't just an observation—it was a law of the digital information age. Today, we know it as Brandolini's Law, or more formally, the Bullshit Asymmetry Principle.
Understanding the Asymmetry
The principle itself is deceptively simple: creating misinformation is fast and easy, while debunking it is slow and exhausting. You can claim the moon is made of cheese in five seconds. Explaining why it's not—discussing lunar geology, the Apollo missions, spectroscopy, and addressing every tangential conspiracy theory that sprouts from the original claim—takes considerably longer.
This isn't just about time. The asymmetry runs deeper. When someone makes an outlandish claim, they face no burden of proof. They don't need citations, peer review, or coherent logic. They just need to sound confident. But the person refuting that claim? They need evidence, context, expertise, patience, and often the rhetorical skill to make truth compelling against fiction's native appeal.
As Brandolini's principle gained traction, it struck a chord particularly with software developers, scientists, and educators—people who dealt professionally with precision and accuracy, and who found themselves increasingly exhausted by the Sisyphean task of correcting misinformation online.
But there's a darker dimension to this asymmetry that Brandolini's tweet only hints at: when lies cannot be easily refuted, they don't just spread—they destroy. And when the legal system itself becomes a bottleneck for truth, the asymmetry can turn violent.
When Lies Meet the Courtroom: The Cost of Truth
The legal system should theoretically correct for Brandolini's Law. Courts are designed to sort truth from fiction through rigorous processes. But in practice, the courtroom often amplifies the asymmetry rather than resolving it.
Consider the economics alone. A false accusation costs nothing to make. Speaking the words takes seconds. Filing papers costs a few hundred dollars in court fees. But defending yourself? That's where Brandolini's Law becomes financially ruinous.
Defamation lawsuits typically cost between fifteen thousand and twenty-five thousand dollars, and complex cases can reach hundreds of thousands. Initial filing fees range from two hundred to five hundred dollars, but expert testimony alone can run several thousand dollars. If your case crosses state lines, you'll need attorneys licensed in multiple jurisdictions. If the false statements are still being published, you might need emergency motions—expensive, urgent legal work to stop ongoing harm.
The burden of proof falls on the victim. You must demonstrate that the statement was false, that it was communicated to others, that it caused you measurable harm, and that the person making it acted with negligence or malice. For public figures, the standard is even higher—you must prove "actual malice," meaning the person knew the statement was false or showed reckless disregard for the truth.
Meanwhile, the person who lied? They can simply keep lying. Each new false statement resets the clock, requiring another round of expensive legal work. The asymmetry is built into the system: creating lies is cheap and fast; proving them false is expensive and slow.
And that's assuming you can afford to fight at all. Many victims of false accusations never sue, not because they lack a case, but because they lack the resources. The lie spreads unchallenged. Reputations crumble. Lives are destroyed. All while the legal system, theoretically designed to provide redress, sits out of reach behind a paywall most people can't afford.
Richard Jewell: When the Asymmetry Turns Deadly
Sometimes the stakes go beyond money. Sometimes, Brandolini's Law intersects with the darkest impulses of human nature, and the asymmetry becomes a matter of life and death.
Richard Jewell was working as a security guard during the 1996 Summer Olympics in Atlanta. Shortly after midnight on July 27, he discovered a suspicious backpack beneath a bench in Centennial Olympic Park. Inside were pipe bombs—at the time, the largest device of its kind in FBI and ATF history. Jewell alerted authorities and helped evacuate the area. The bomb exploded anyway, killing two people and injuring 111. Thanks to Jewell's actions, hundreds more were likely saved.
For about 72 hours, Jewell was a hero. Then he became a suspect.
The FBI theorized that Jewell might have planted the bomb himself to gain recognition—a twisted attempt at manufactured heroism. This theory was based almost entirely on psychological profiling: Jewell was a "wannabe cop," overweight, living with his mother at age 33, with what some characterized as an overzealous approach to security work. The FBI had no physical evidence linking him to the crime. No witnesses placed him planting the bomb. No forensics supported the theory.
But the leak happened anyway. On July 30, 1996, the Atlanta Journal-Constitution published a special edition with the headline: "FBI Suspects 'Hero' Guard May Have Planted Bomb." The media feeding frenzy began immediately.
The New York Post called Jewell a "Village Rambo" and "a fat, failed former sheriff's deputy." Late-night host Jay Leno joked on national television about "big, fat stupid guys" at the Olympics. News crews camped outside Jewell's home. Reporters dug through his trash. His past employers were contacted. Every detail of his life was scrutinized, distorted, and broadcast to millions.
Creating the suspicion took moments—a leak from an FBI source, a headline, a broadcast. Refuting it consumed Jewell's life for 88 days. He was interrogated under false pretenses—FBI agents brought him to their headquarters claiming they needed him for a training video, then proceeded to question him as a suspect without counsel. His apartment was searched. His mother was harassed. He couldn't work. He couldn't go outside without being mobbed.
On October 26, 1996, the U.S. Attorney finally sent Jewell a letter stating he was no longer a target of the investigation. The letter offered no apology. By then, the damage was comprehensive. Jewell filed lawsuits against NBC, CNN, the Atlanta Journal-Constitution, and others. NBC settled for half a million dollars; after legal fees and taxes, Jewell kept roughly a third. Other settlements followed, but none of them gave him back his reputation or the months of hell he'd endured.
The real bomber, Eric Rudolph, wasn't captured until 2003. He pleaded guilty in 2005 to the Olympic bombing and three other attacks. Jewell attended the plea hearing but made no comment. He worked as a sheriff's deputy in Georgia for a few years, finally achieving his dream of wearing a real police uniform. He died of heart failure in 2007 at age 44, his health compromised by diabetes and the stress of his ordeal.
The asymmetry? A few federal agents and journalists created a narrative in hours that destroyed a man's life. Correcting that narrative required years of litigation, millions of dollars in legal costs, and still never fully restored what was lost. One false theory, leaked to the press. Nearly a decade of consequences. And even after Jewell's death, his name remained associated with suspicion in the minds of those who never followed the story to its conclusion.
The Duke Lacrosse Case: When a Lie Becomes a Weapon
If Richard Jewell's case showed how law enforcement and media can amplify Brandolini's Law, the Duke lacrosse case demonstrated how false accusations can weaponize it—and how the legal system's asymmetry can be exploited with devastating effect.
On March 13, 2006, Crystal Mangum, working as an exotic dancer, was hired to perform at a party hosted by Duke University's lacrosse team. She later accused three players—David Evans, Collin Finnerty, and Reade Seligmann—of raping her in a bathroom at the residence.
The accusation took minutes to make. Refuting it would consume 394 days and cost the accused students, their families, and Duke University millions of dollars. Just this month, in December 2024, Mangum publicly admitted for the first time that she fabricated the entire story.
Durham County District Attorney Mike Nifong was in the middle of a difficult Democratic primary when the accusations surfaced. The case became his political strategy. He made inflammatory public statements about the case, declared that a rape had definitely occurred, and promised aggressive prosecution—all before the evidence was fully examined.
DNA tests on Mangum's body found genetic material from multiple males. None of it matched any lacrosse player. Nifong withheld this exculpatory evidence from defense attorneys, instead telling the court and public that only Mangum's boyfriend's DNA had been found. When the private lab results showing DNA from multiple unidentified males came to light, it was revealed that Nifong had conspired with the lab director to hide this information.
Mangum's story changed repeatedly. She gave multiple inconsistent accounts of what happened, where it happened, and who was involved. Alibi evidence placed one defendant, Seligmann, at an ATM and in a taxi at the time of the alleged assault. The taxi driver, Moezeldin Elmostafa, signed a sworn statement. Shortly after, he was arrested on a two-and-a-half-year-old shoplifting charge—not for shoplifting himself, but for driving the actual shoplifter. Many saw this as intimidation. He was later acquitted.
Duke President Richard Brodhead suspended and then canceled the lacrosse season before any charges had been filed, a move widely criticized as premature, and coach Mike Pressler was forced to resign. The three accused students faced death threats. Photos of lacrosse players were posted around Durham with captions soliciting information. Faculty members published an advertisement invoking a campus "social disaster," reinforcing a perception of guilt.
The asymmetry played out in stark form. Mangum's accusation required no evidence, no consistency, no proof. The defendants had to produce timestamped photographs, ATM receipts, taxi records, cell phone data, witness statements, and undergo intensive DNA testing—all while their faces were plastered across national media as alleged rapists. Legal costs mounted into the millions.
In April 2007, North Carolina Attorney General Roy Cooper dismissed all charges and declared the players innocent. He spoke of a "tragic rush to accuse and a failure to verify serious allegations." Nifong was disbarred in June 2007 for withholding evidence and making misleading statements. He served one day in jail for contempt of court.
The three players reached a confidential settlement with Duke University. They filed civil lawsuits against the city of Durham, Nifong, and others. In their statement, they said: "It is impossible to fully describe what we, our families and team endured. As we said from day one, we are innocent. But it took three hundred and ninety-four days, and the intervention of the North Carolina Attorney General, before our innocence was formally declared."
Prosecutors declined to charge Mangum with filing false reports or perjury. She faced no legal consequences for the false accusations. Years later, in 2013, she was convicted of second-degree murder for stabbing her boyfriend. She's currently serving her sentence in a North Carolina prison.
In December 2024, during a podcast interview from prison, Mangum admitted: "I testified falsely against them by saying that they raped me when they didn't, and that was wrong. I made up a story that wasn't true because I wanted validation from people and not from God."
Eighteen years. That's how long it took for the full truth to be publicly acknowledged. One false accusation, made in minutes. Three young men's reputations destroyed, millions of dollars in legal costs, a coach's career ended, a university's reputation tarnished, a prosecutor disbarred, and countless hours of investigative work—all to refute what should never have been given credibility in the first place.
Pizzagate: When Online Fiction Becomes Real-World Danger
If the Duke case showed how false accusations exploit legal asymmetries, Pizzagate demonstrated that Brandolini's Law can directly incite violence. The misinformation doesn't just spread faster than truth—it spreads faster than law enforcement can respond.
In late 2016, a bizarre conspiracy theory emerged from the depths of anonymous internet forums. Hackers had released emails from Hillary Clinton's campaign chairman John Podesta via WikiLeaks. Users on 4chan began scrutinizing the messages, convinced they contained hidden meanings. Someone noticed mentions of "pizza" and "cheese." The leap was swift: these must be code words for something sinister.
The conspiracy metastasized around Comet Ping Pong, a Washington D.C. pizzeria owned by James Alefantis, a Democratic donor mentioned in the emails. Within days, elaborate theories spread claiming the restaurant was a front for a child trafficking operation involving prominent Democrats. Social media posts from the restaurant showing children at birthday parties were reinterpreted as "evidence." Completely unrelated incidents—missing children from Portugal, decades-old rumors—were woven into the narrative.
Creating this conspiracy took hours. Anonymous users on 4chan connected imaginary dots. A Missouri woman named Carmen Katz shared the story on Facebook, which probably took her minutes. Douglas Hagmann brought it to InfoWars, a site reaching nearly eight million unique visitors monthly. The whole apparatus of conspiracy distribution activated with terrifying speed.
Meanwhile, Comet Ping Pong faced a deluge of death threats. Staff received calls threatening violence. The restaurant owner had to hire security. Journalists began the exhausting work of debunking: investigating Alefantis's background, verifying that the restaurant had no basement (despite claims of underground dungeons), tracing the conspiracy's origins through layers of anonymous forums, interviewing employees, explaining why correlation doesn't equal causation.
And then, on December 4, 2016, Edgar Maddison Welch drove six hours from North Carolina to Washington with an AR-15 rifle. Consumed by the conspiracy theory and believing he was on a rescue mission, he entered Comet Ping Pong and fired shots inside the restaurant, searching for tunnels and imprisoned children. He found ping pong tables and a kitchen. No one was hurt, but they easily could have been.
Welch received a four-year prison sentence. But the conspiracy didn't die. When Elon Musk posted references to Pizzagate in 2023, mentions spiked by over 9,500 percent. Each revival requires the same exhausting cycle of debunking.
The asymmetry here is particularly violent. Someone fires a gun in a restaurant because of lies that took minutes to fabricate. Law enforcement must investigate. Victims must testify. Media must fact-check. The restaurant must defend itself in the court of public opinion. Years of consequences for seconds of fiction.
This is where Brandolini's Law reveals its most disturbing implication: when lies spread faster than truth, and when refuting them is exponentially harder than creating them, violence can fill the gap. When people believe false accusations that authorities are slow to debunk, they sometimes take matters into their own hands. The shooter didn't wait for investigations or evidence. He believed the lie was so urgent that it justified immediate action.
Why the Asymmetry Exists
Several factors conspire to make Brandolini's Law so consistently and devastatingly true:
Cognitive shortcuts favor speed over accuracy. Kahneman's "fast thinking" loves a catchy lie. Our brains are wired for pattern recognition and narrative, not rigorous fact-checking. An appealing story that confirms what we already suspect? That spreads. A careful deconstruction of that story? That requires "slow thinking"—deliberate, effortful, and exhausting.
Specialization creates knowledge barriers. When someone makes a false accusation of criminal behavior, debunking it requires legal expertise, investigative resources, forensic analysis, and courtroom time. The false claim requires no expertise at all. The asymmetry is built into the structure of knowledge itself.
Proving negatives is harder than making claims. "This person didn't commit this crime" is infinitely more difficult to demonstrate than making the accusation. The accuser can cherry-pick any suspicious behavior, ignore exculpatory evidence, and wave away alibis. The accused must account for every minute of their time, produce documentation for movements, and explain why absence of evidence matters. One is a sprint; the other is a marathon.
The legal system amplifies the asymmetry. In theory, courts should correct for misinformation. In practice, they make it worse. Defense costs tens of thousands or hundreds of thousands of dollars. The accuser faces no financial penalty for lying, especially in criminal cases where prosecutors handle the case. Even if you win a defamation lawsuit, you often can't recover your full legal costs. The system is stacked against truth.
Confirmation bias amplifies misinformation. People share things that align with their worldview, not things that challenge it. A false accusation that resonates emotionally spreads like wildfire. A correction? That's friction. That's cognitive dissonance. Many people won't even read it, and those who do often double down on their original belief—a phenomenon psychologists call the "backfire effect."
Real-World Consequences in the Workplace
Brandolini's Law doesn't just plague criminal justice and social media—it infiltrates professional environments too. Picture this: A team member reads an article claiming blockchain technology is a security panacea and proposes rewriting your entire codebase using it. Several colleagues get excited. Now you, as a technical leader, must invest hours explaining the nuances of cybersecurity, the specific vulnerabilities blockchain doesn't address, and why wholesale rewrites are almost never a good idea. You might need to arrange training sessions. Create presentations. Have one-on-one conversations.
The original claim? Five minutes of enthusiastic Slack messages.
Your response? Weeks of effort, political capital, and exhausting technical discussions.
Or consider workplace accusations. Someone claims a colleague engaged in misconduct. Even if completely false, refuting it requires HR investigations, witness interviews, documentation reviews, and potential legal counsel. The accusation takes seconds. The investigation takes weeks or months. Even if cleared, the accused often faces lingering suspicion—the shadow of Brandolini's Law stretching across their career.
The pattern repeats across industries: finance, education, medicine, engineering. Wherever expertise and reputation matter, Brandolini's Law lurks, and false claims can destroy faster than truth can rebuild.
When to Fight, When to Walk Away
Perhaps the most important insight about Brandolini's Law is knowing when the battle is worth fighting. Not every piece of misinformation deserves your time—but some lies are too dangerous to ignore.
Consider the impact versus the effort. Will this misinformation cause real harm? Is the audience large or influential? Are people making decisions based on false information? If a colleague incorrectly claims one programming language is superior to another at lunch, maybe let it slide. If they're falsely accusing someone of misconduct, you must speak up.
Look for shortcuts. Sometimes you don't need to build a comprehensive rebuttal from first principles. Point to a credible source. Highlight the speaker's conflicts of interest. Ask probing questions that expose logical inconsistencies. Make them do some of the work.
Focus on the audience, not the source. Trying to convince someone deeply invested in misinformation is usually futile. They have psychological and sometimes financial stakes in their position. But the people listening? They might be persuadable. Frame your response for them.
Know when you need legal help. If someone is making false accusations that could damage your reputation, career, or freedom, document everything immediately. Consult an attorney, even if you can't afford to sue—they can advise on protective measures. Don't assume the truth will automatically prevail.
Know when to write someone off. There's a useful aphorism: debating some people is like playing chess with a pigeon—no matter how good you are, the pigeon will knock over the pieces, defecate on the board, and strut around like it won. Some battles aren't worth fighting. But distinguish between pointless arguments and dangerous lies.
Strategies for Managing the Asymmetry
While Brandolini's Law describes an inherent imbalance, that doesn't mean we're powerless. Several strategies can help:
Apply skepticism systematically. Use frameworks like CRAAP (Currency, Relevance, Authority, Accuracy, Purpose) to evaluate information before accepting or sharing it. Insist on reputable evidence. Check sources. Question motives. This is especially critical with accusations—extraordinary claims require extraordinary evidence.
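For readers who think in code, here is a minimal sketch of how the CRAAP checklist could be turned into a routine rather than a vague intention. It is an illustration only: the field names, the five-year freshness window, and the scoring are assumptions for this example, not part of any standard tool or of the framework itself.

from dataclasses import dataclass
from datetime import date

@dataclass
class Source:
    """One piece of information to evaluate before accepting or sharing it."""
    published: date            # Currency: when was it published or last updated?
    on_topic: bool             # Relevance: does it actually address the claim at hand?
    credentialed_author: bool  # Authority: identifiable author with relevant expertise?
    cites_evidence: bool       # Accuracy: does it point to verifiable evidence?
    neutral_purpose: bool      # Purpose: written to inform rather than to sell or enrage?

def craap_score(source: Source, today: date, max_age_days: int = 5 * 365) -> int:
    """Count how many of the five CRAAP criteria the source satisfies (0 to 5)."""
    checks = [
        (today - source.published).days <= max_age_days,  # Currency
        source.on_topic,
        source.credentialed_author,
        source.cites_evidence,
        source.neutral_purpose,
    ]
    return sum(checks)

# Example: an anonymous forum post making an extraordinary accusation.
post = Source(
    published=date(2016, 11, 1),
    on_topic=True,
    credentialed_author=False,
    cites_evidence=False,
    neutral_purpose=False,
)
print(f"CRAAP score: {craap_score(post, today=date(2016, 12, 4))}/5")  # prints 2/5

A low score doesn't prove a claim false; it simply flags where the burden of verification should fall before you repeat or share it.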
Document obsessively when facing false accusations. If you're accused of something you didn't do, document everything. Save communications. Preserve evidence. Get witness statements. The burden of proof may theoretically be on the accuser, but in practice, you'll need overwhelming documentation to clear your name.
Embrace the scientific method. Form hypotheses, test them, and actively try to falsify your own beliefs. This approach naturally filters out weak claims while strengthening valid ones.
Inoculate rather than react. Teaching people how to identify misinformation before they encounter it—a technique called "prebunking"—can be more effective than debunking after the fact. Media literacy is preventative medicine.
Create authoritative resources. When misinformation recurs, pointing to existing comprehensive fact-checks is more efficient than starting from scratch each time. Building and maintaining these repositories takes effort upfront but saves energy in the long run.
Respond with curiosity, not combat. When someone shares misinformation, especially someone you care about, approaching with genuine questions rather than accusations can be more effective. "That's interesting—where did you hear that?" opens dialogue. "That's completely wrong" shuts it down. But be ready to escalate if the misinformation is genuinely dangerous.
Support legal reforms. Advocate for systems that make it easier to clear your name when falsely accused. Support stronger penalties for knowingly false accusations. Push for mechanisms that help victims of defamation access justice without being bankrupted by legal fees.
The Broader Implications
Brandolini's Law reveals something troubling about modern information ecosystems and justice systems alike: they're structurally biased toward misinformation. Platforms optimize for engagement, not accuracy. Algorithms amplify content that triggers strong emotions—anger, fear, outrage—because that content generates clicks, shares, and ad revenue. Truth is often boring. Lies are frequently compelling.
The legal system, meant to correct for this, instead amplifies it. False accusations are cheap to make and expensive to refute. Even when truth prevails, it rarely arrives fast enough or completely enough to undo the damage. Richard Jewell was exonerated, but he died with his name still attached to suspicion in many minds. The Duke lacrosse players were declared innocent, but it took 18 years for their accuser to publicly admit she lied—and even then, many never heard the admission.
This isn't just an individual problem; it's systemic. When innocent people must spend hundreds of thousands defending themselves from lies, society suffers. When journalists must spend days refuting conspiracies that took minutes to create, investigative reporting suffers. When law enforcement chases false leads while real criminals remain free, justice suffers.
The asymmetry becomes even more pronounced when it intersects with violence. Edgar Maddison Welch fired a rifle in a pizzeria based on lies that took hours to fabricate and days to spread. By the time authorities could thoroughly debunk the conspiracy, he'd already driven six hours with a loaded weapon. Brandolini's Law doesn't just describe an information problem—it describes a violence problem.
Living with the Law
Alberto Brandolini's observation wasn't meant to demoralize us, though it can feel that way. It was meant to illuminate a problem so we can approach it more strategically. Understanding the asymmetry helps us allocate our energy wisely, set realistic expectations, and develop better tools for navigating an information landscape increasingly hostile to truth.
The law also reminds us to extend compassion—to those falsely accused who must spend fortunes clearing their names, to fact-checkers grinding through endless claims, to journalists investigating complex stories while competing with viral sensationalism. Their work is Sisyphean not because they're doing it wrong, but because the game is rigged.
More than a decade after Brandolini posted that late-night tweet, his principle has only grown more relevant. We're drowning in information and starving for wisdom. The asymmetry persists, perhaps even worsens as AI tools make generating convincing misinformation easier than ever. False accusations can spread globally in hours. Court systems work on timelines of months or years.
But knowing the law exists—giving it a name, understanding its mechanisms—is the first step toward changing the game. We can't eliminate the asymmetry, but we can stop being surprised by it. We can build better systems. We can teach better skills. We can choose our battles more carefully. We can demand reforms that make truth less expensive and lies more costly.
And maybe, just maybe, we can make the truth a little less exhausting to defend.
Because the alternative—a world where lies travel faster than truth, where false accusations destroy lives before justice can intervene, where the asymmetry can turn violent—is too grim to accept without a fight.