Community Check
The Noisy Room: A Story About Common Confusion

It began with a study. In December 2025, Stanford researchers analyzed 2.2 billion social media posts looking for a pattern. They wanted to know what percentage of users posted severely toxic content. Not rudeness, not sarcasm, but speech so hateful that 90% of the world would flag it as problematic.1 With this data in hand, they then asked thousands of people a simple question: take a guess. What percentage of social media users do you think post severely toxic content?

The results surprised them. They had discovered an enormous reservoir of misperception hidden in plain view.

The Bar

Here is the simplest version of the problem. Imagine walking into a bar with a hundred people inside. Three of them are shouting: about politics, about each other, about whatever gets a reaction. The other ninety-seven are talking at a normal volume. But there's a bouncer at the door, and he gets paid for every minute you spend staring. So he has wired the three loudest people into the sound system and turned it all the way up. You walk in, hear the roar, and conclude that this place is full of lunatics, never hearing the ninety-seven people having normal conversations a few feet away. You could leave, but all your friends are inside. You're stuck.

This is how social media handles contentious topics. The bouncer is an algorithm. And whether you like it or not, you've been a bystander.

Pick a contentious topic. This is what your feed might look like. Reading this feed, you might reasonably conclude that the country is split between unhinged extremes. It is not. And the gap between what Americans actually believe and what the feed suggests they believe may be the most consequential thing the platforms haven't shown you.

See the Room

Let's visualize this as a single room with 100 people inside: 97 regular users, and 3 who have ever posted severely toxic content. On most platforms, roughly 3% of accounts produce a third of all content, and engagement ranking amplifies high-reaction content from that prolific few into your feed.

This pattern repeats across platforms. On Twitter/X, toxic tweets receive ~86% more retweets and ~27% more visibility than non-toxic ones, 0.3% of users shared 80% of all contested news,14 and just 6% of users produce roughly 73% of all political tweets.16 On TikTok, 25% of users produce 98% of all public videos.15 The specific numbers vary. The dynamic is the same: a small minority of highly active users overwhelms the majority.

After some time consuming content in this room, your brain performs a kind of ambient demography. The feed becomes a sort of census. You conclude, logically, that the behavior must be widespread. The room might just be full of extreme people! Maybe most people do believe these crazy things.
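The amplification dynamic can be sketched in a few lines of code. The simulation below is purely illustrative: the posting volumes and engagement numbers are assumptions chosen to mimic the pattern described above, not figures from the cited studies.

```python
import random

random.seed(0)

# 100 users: 3 "loud" users post far more often and draw far more
# engagement per post; 97 regular users post occasionally. All of
# these numbers are assumptions for illustration.
posts = []
for user in range(100):
    loud = user < 3
    for _ in range(50 if loud else 2):  # assumed posting volume
        engagement = random.gauss(500, 100) if loud else random.gauss(50, 20)
        posts.append({"user": user, "loud": loud, "engagement": engagement})

# Engagement ranking: the feed shows only the top 20 posts by engagement.
feed = sorted(posts, key=lambda p: p["engagement"], reverse=True)[:20]

loud_share_of_users = 3 / 100
loud_share_of_feed = sum(p["loud"] for p in feed) / len(feed)
print(f"Loud users: {loud_share_of_users:.0%} of the room")
print(f"Loud users' posts: {loud_share_of_feed:.0%} of the top-ranked feed")
```

With these assumed numbers, the 3% of loud users fill essentially the entire ranked feed, which is the distortion the room metaphor describes: the feed is a census of engagement, not of people.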

This Is Not Just About What We See on Social Media

If this were just about the tone of our social posts, it wouldn't matter very much. But this distortion ends up causing some seriously bad patterns of behavior.

Pattern 1: The Majority Goes Silent

When the majority looks at the feed and assumes it is outnumbered, people often self-censor.3 The dynamic replicates on social media:17 fear of social isolation suppresses opinion expression on platforms where it is perceived to be unwelcome. People go quiet, or they leave a platform entirely. They cede the space to users with more extreme politics.

Pattern 2: The Loud Minority Thinks It's the Majority

The minority who post aggressively end up with a distortion of their own: believing they are part of the majority.5 A study of 17 extremist forums found the same pattern: the more someone posted, the more they believed the public agreed with them. More engaged participation bred false consensus.

Pattern 3: Everyone Gets Each Other Wrong

Both sides develop wildly inaccurate beliefs about who the other side actually is.6 See how some of your own beliefs line up: What percentage of Democratic supporters do you think are LGBTQ? What percentage of Republican supporters do you think earn over $250,000 a year?

The distortion extends to policy beliefs. On the issue of immigration, there is a sizeable perception gap between where Democrats place Republicans and where Republicans actually stand. Source: More in Common (2019) & Moore-Berg et al., PNAS 2020. Illustrative.

Pattern 4: Politicians Follow the Perceived Room, Not the Real One

Elected officials are very good at sensing political sentiment. It is literally their job. (They are not elected to correct people's beliefs.) Politicians who can build a coalition around a perceived belief are more likely to win. They position themselves against an opponent that doesn't exist, but that their supporters think exists.
And remember: most of our politics now happens on social media. Candidates often read the same distorted feed. They are unlikely to change their minds. The window of discourse shifts, not because opinions changed, but because perceptions of opinions did.

Pattern 5: Misperception Turns into Hostility

When you believe the other side is extreme, you become more willing to treat them as a threat.7 Both Democrats and Republicans vastly overestimate how many on the other side support political violence. The result is a populace primed to assume the other side is ready to do horrible things.

"What percentage of the other side supports political violence?" Democrats estimate that 35.5% of Republicans support political violence; Republicans estimate that 37.1% of Democrats do. Both sides were wrong by 3 to 4 times (3.4× and 4.0× the real numbers, respectively). When researchers corrected these beliefs, partisan hostility dropped. (Mernyk et al., PNAS 2022, n=4,741.)

Each step feeds the next. The distortion is self-reinforcing.
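The "times off" multiplier above is just the estimated share divided by the actual share, so the actual shares can be backed out from the figures already given. The sketch below does only that arithmetic; the implied values are derived from this summary, not quoted from the paper itself.

```python
# multiplier = estimated / actual, so actual = estimated / multiplier.
# Figures are the ones quoted in the text above.
surveys = {
    "Democrats estimating Republicans": (35.5, 3.4),
    "Republicans estimating Democrats": (37.1, 4.0),
}

for label, (estimated_pct, multiplier) in surveys.items():
    implied_actual = estimated_pct / multiplier
    print(f"{label}: estimated {estimated_pct}%, "
          f"implied actual ~{implied_actual:.1f}%")
```

Run the arithmetic and the implied actual shares land around one in ten, a fraction of what either side believes.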

Knowing Isn't Enough

Okay. So now you know that a small minority dominates the feed. You know that Republicans and Democrats actually hold far more nuanced opinions about contested issues. Does that fix it? Not really. You also know that everyone else doesn't know it. And if the world keeps operating as if the distortion is real, you should probably act the same way, even though you know it's wrong. The room hasn't changed, even if you know the people inside it are confused.

This is called a common knowledge problem. You've read the stat. But you have no idea who else has. The feed still looks the same. You still assume you're outnumbered. You stay quiet. Private knowledge is not common knowledge.

Steven Pinker lays this out cleanly in his excellent recent book When Everyone Knows That Everyone Knows.8 Learning a fact changes what you know. Seeing it displayed publicly, where you know others can also see it, changes what everyone knows, and subsequently how everyone acts. Social media has no public square. It has 300 million private windows, each showing a different distortion of the same room. Illuminating the thoughts we hold in common has the potential to radically change it.

The Idea

So what can we do about this? Fortunately, there is good evidence showing how it can be fixed. Multiple studies show that when misperceptions are corrected in a public way, hostility drops. Mernyk et al. found that a single correction reduced partisan hostility for a full month.7 Lee et al. found that correcting overestimates of toxic users improved how people felt about their country and each other.1

We can do this today. Imagine every post on a contested topic had a quiet link beneath it. Not a fact check, a label, or a warning. Instead, what if it had a Community Check? A small link asks: "How do people actually feel about this?" Clicking it reveals where people on the platform stand on the issue, for example: 72% support, 19% support with conditions, 9% oppose.

A Community Check is an open-source design layer that could be deployed across social media, beneath contentious posts, to help users understand how other people on the platform (or the nation) actually feel about an issue. It is a way of quickly adding context to the most hot-button viral issues, giving people more visibility into the opinions of the public.

The Idea in Action

Let's explore this intervention with a topic that cuts across political identity: money in politics.

On the surface, this seems contentious. But it is actually a supermajority issue: 81% are concerned about the influence of money on elections, including 78% of Republicans and 90% of Democrats. 75% say unlimited spending weakens democracy. Only 15% believe unlimited political spending is protected free speech. And yet very little changes, largely because everyone assumes the other side is fine with it. The feed is full of people defending their team's donors and attacking the other team's. It might look like a 50/50 partisan battle, but it's not. It's a majority consensus that cannot see itself. What if you could see this consensus?

@real_talk_politics · 2h
Everyone complains about money in politics but the second their candidate gets a massive donation they shut up real fast. You don't hate money in politics. You hate when the OTHER side has more of it.
♡ 11,847 · 💬 6,203 · ↻ 2,891
How do people actually feel about this? →

Community Check draws from a random sample of platform users plus robust national polls, surveyed independently of the content. The sample is statistically representative. The results update continuously. And critically: everyone sees the same numbers.
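Why is a random sample enough? The standard margin-of-error formula for a sampled proportion shows that even a modest sample pins down a figure like the 72% above quite tightly. A minimal sketch; the sample size of 1,000 is an assumption for illustration, not a number from the spec:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a sampled proportion p with sample size n."""
    return z * math.sqrt(p * (1 - p) / n)

# A random sample of ~1,000 users pins down a 72%-support figure
# to within about +/-2.8 percentage points at 95% confidence.
moe = margin_of_error(0.72, 1000)
print(f"+/-{moe * 100:.1f} percentage points")
```

This is the same math behind national polls: accuracy depends on the sample being random and on its absolute size, not on polling everyone, which is why a platform-wide check is cheap to run continuously.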

Why This Isn't Fact Checking or Audience Polling

Traditional fact-checking is a top-down approach that often feels like it is dictating from above, which is hard for people to stomach. Content moderation has for many years been perceived as removing speech. Community Check simply adds context, much like the crowdsourced feature Community Notes (an inspiration for this project). Nor is it just a user poll under a post: it draws from all platform users, coupled with statistically representative national surveys. It is an actual window into the views of the majority, not just the views of those looking at the post.

It Works for Video Too

Short-form video is the fastest-growing vector for political distortion. The same dynamic applies (a small minority of creators produce the vast majority of political content), but video bypasses the pause that text gives you. Community Check can adapt. The flow works in four steps:

1. The video plays. A political video ("Money IS free speech. Deal with it. Citizens United was CORRECT," from @liberty_caucus_tv, #FreeSpeech #CitizensUnited, ♡ 5,247 · 💬 612 · ↗ 1,742) crosses the engagement threshold: 51K views, 612 comments (1.2%), 1.7K shares (3.4%).
2. A banner appears: "See what platform users and Americans think →"
3. Tap to expand the Community Check. What do platform users and Americans think? 83% favor a constitutional amendment to allow spending limits; 81% say they are concerned about money's influence in politics; 15% say unlimited spending should be treated as protected free speech. (American Promise/Ipsos 2026 · Issue One/YouGov 2025 · Pew 2023)
4. A pinned Community Check comment summarizes: 83% of Americans favor allowing limits on political spending; 15% say unlimited spending is free speech.

The feed shows outrage. But what do people actually think? See the technical specs below for how it works. ↓
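The "engagement threshold" step could be expressed as a simple rule. A hypothetical sketch follows; the threshold values and the `should_show_check` function are assumptions for illustration, not part of any published spec.

```python
from dataclasses import dataclass

# Illustrative trigger values (assumptions, not from the spec):
VIEW_THRESHOLD = 50_000
COMMENT_RATE_THRESHOLD = 0.01   # comments per view
SHARE_RATE_THRESHOLD = 0.03    # shares per view

@dataclass
class VideoStats:
    views: int
    comments: int
    shares: int

def should_show_check(v: VideoStats) -> bool:
    """Attach a Community Check once a political video is both widely
    seen and drawing unusually heavy engagement."""
    if v.views < VIEW_THRESHOLD:
        return False
    return (v.comments / v.views >= COMMENT_RATE_THRESHOLD
            or v.shares / v.views >= SHARE_RATE_THRESHOLD)

# The example from the article: 51K views, 612 comments, 1.7K shares.
print(should_show_check(VideoStats(views=51_000, comments=612, shares=1_742)))
```

Using the article's example numbers, the comment rate (1.2%) and share rate (3.4%) both clear these assumed thresholds, so the banner would trigger.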

We Could Do This Now

Platforms already have most of these capabilities. They already survey users. They already know how to run sophisticated polls. There are a few technical details to work out (spec here), but this is not a hard problem to solve.

The unseen majority is the public. And the public deserves to know itself. A tiny minority, dominating the feed. That's all it ever was. The rest of us were here the whole time, quiet and decent and waiting to be seen.

Follow my other work here · See FAQ ↓ · See technical specifications ↓

Your Turn: A Community Check on Community Check

Two quick questions. Then see how other readers answered, live, right here.

Question 1 of 2: Would seeing representative polling data beneath social media posts change how you interpret them? (Yes, definitely / Probably / Not sure / Probably not / No)

Question 2 of 2: Should platforms show representative polling data alongside contentious posts? (Yes / Yes, with conditions / Not sure / No)

Help Surface the Quiet Majority

If this changed how you see the feed, share it with someone who might need to see it too. You can also subscribe to get notified when Community Check launches or gets platform adoption.

Community Check is a free and open specification. The complete technical spec, research base, and open questions are published for researchers, engineers, and platform designers to stress-test and build on. Please steal it with attribution. View on GitHub.

Common Questions