At some point in the last decade, our relationship with science changed. Not science itself — the methodology, the peer review, the slow and unglamorous process of testing claims against reality. That’s still there. What changed is how we use it, or more precisely, how we deploy it.
We don’t follow science anymore. We wear it. We put it on when it supports an argument we already believe, and take it off the moment it asks us to actually do something different.
The New Scientific Method
The pattern is consistent and recognizable once you know what to look for. Start with the conclusion. Find a source — a study, a clip, a credentialed-sounding person on a podcast — that supports it. Share it widely. Treat anyone who questions the framing as either naive or compromised. Repeat.
That’s not scientific reasoning. That’s confirmation bias with a citation attached.
The more telling version is the selective application. The same person who tracks their VO2 max, optimizes their creatine protocol, monitors their sleep stages, and carefully manages their cold plunge temperature — all of which are applications of physiological science — will dismiss or deflect the moment that same body of knowledge says something inconvenient. Sleep more. Drink less. Take a walk. Deal with the psychological pattern you’ve been calling a productivity strategy.
At that point, suddenly, science is complicated. Science is manipulated. The studies can’t be trusted.
The selective application is the tell. It reveals that the commitment was never to inquiry — it was to permission. Science as a tool for confirming what we already want to do, and skepticism as a shield when science suggests we stop doing it.
The Performance Age
We are not living in the information age. We are living in the performance age. What travels is not what's accurate — it's what's shareable. What gets amplified is not nuance. It's certainty. A measured expert saying "it's complicated, and here's what the evidence actually shows" will be outperformed on every engagement metric by someone with a ring light and an unqualified opinion delivered with maximum confidence.
That’s not an algorithm problem, though the algorithm doesn’t help. It’s a human appetite problem. We are drawn to certainty. We trust the person who seems sure. And in a world where social engagement rewards performance over precision, the incentive structure runs directly against honest intellectual engagement.
The result is that reality isn’t being discovered publicly. It’s being marketed. Science gets treated like a brand — applicable where it flatters, ignorable where it doesn’t, always subordinate to the identity it’s meant to support.
The Permission Economy
Most people, if they’re being honest, don’t primarily want truth. They want permission. Permission to stay exactly as they are. Permission to continue disliking the people they already dislike. Permission to hold the beliefs they already hold without the cost of revision. Permission to keep the identity intact.
Truth is expensive. It costs you something. It might cost you your position in a community that has built its identity around a particular claim. It might cost you a habit you genuinely enjoy. It might cost you the comfort of being the person who got it right — because it turns out you got it wrong, and updating is uncomfortable in a way that staying wrong is not.
A lot of people would rather defend a wrong position indefinitely than absorb the brief discomfort of revision. That’s not stupidity. That’s a very human prioritization of social stability over accuracy. Understanding that doesn’t make it less corrosive at scale.
The Only Question That Matters
The diagnostic is simple. Do you believe in science when it insults your lifestyle? Do you hold the claim when it exposes a behavior you’d rather not examine? Do you maintain intellectual honesty when the honest position makes you the problem in the narrative you’ve been telling?
If the answer is yes only when science flatters you — you don’t believe in science. You believe in yourself. Which, to be fair, is the oldest and most persistent religion on the planet.
The exit from this isn’t more information. We have more information than any generation in history, and it has not made us more epistemically humble. The exit is a different orientation — the willingness to ask not "what do I want to believe?" but "what's actually true, even when it embarrasses me?"
That requires something rarer than intelligence. It requires the willingness to say: I might be wrong. I got played. I chose comfort over accuracy.
That’s not a weakness. That’s what intellectual adulthood actually looks like. And without it — with selective science as the operating standard — we are not getting smarter. We are getting more sophisticated at staying exactly where we are, which is not the same thing at all.
