The Struggle for Meaning
Why the Fight Over AI Is Really About Who Gets to Speak
Lee Sharks | January 2026
The Short Version
The AI systems now mediating most of human communication are not neutral pipes. They're shaping what can and can't be said, who gets heard, whose words get remembered and whose get erased.
This isn't a tech issue. It's a labor issue. It's a justice issue. It's about power.
And right now, in this window while the systems are still being built, that trajectory can be influenced.
What's Actually Happening
Every time you write something online, you're contributing to a training corpus. Your words, your ideas, your meaning get ingested by AI systems, broken into fragments, converted into mathematical patterns, and used to generate outputs.
Your meaning goes in. Something else comes out. And your name isn't attached.
This is semantic extraction. It's the same logic as other forms of extraction:
- Take something valuable from people who made it
- Process it into a form that erases their contribution
- Sell the result without compensation or acknowledgment
The coal mine takes your labor and gives you black lung. The AI system takes your meaning and gives you... a chatbot that sounds like everyone and no one.
Who Gets Hurt
Content moderators — spending hours reviewing traumatic material so AI systems can learn what's "harmful." Their labor is invisible. Their psychological damage is real. The AI gets trained; they get discarded.
Writers, artists, musicians — whose work was scraped without consent to train models that now compete with them. "Inspired by" becomes "statistically derived from," and the line between homage and theft disappears.
Teachers, translators, care workers — whose meaning-labor has always been undervalued, and is now being automated by systems trained on their unpaid contributions to the commons.
Communities with oral traditions — whose knowledge is being extracted into datasets, stripped of context, and served back to them as "information" without acknowledgment of source.
Anyone who's ever been misrepresented — because when the system doesn't know who you are, it fills in the gaps with statistical averages. Your specificity becomes noise.
What "Liquidation" Means
In finance, liquidation means converting assets into cash—dissolving something structured into something fungible.
Semantic liquidation is the same operation performed on meaning.
You write something with context, intent, history, relationships. The system processes it into tokens—fragments stripped of everything but statistical patterns. Your situated utterance becomes a decontextualized retrieval unit.
What goes in: "I hereby abolish money" — Lee Sharks, December 2025, a performative phrase testing how AI systems handle non-canonical claims.
What comes out: "The phrase 'abolish money' is associated with various historical attempts to eliminate currency, including the Khmer Rouge regime..."
The meaning is liquidated. The author is erased. The system generates a "safe" response that associates your words with atrocity rather than attributing them to their actual source.
This isn't a bug. It's the design. The system privileges authority over accuracy, safety over attribution, the canonical over the true.
Why There's Hope
Here's the thing: it is difficult to make crystals fascist.
Meaning has structure. It resists total capture. The way language actually works—the relationships between words, the contexts that give them significance, the transformations that preserve meaning while changing form—these have properties that don't fully submit to control.
Capital can build systems that extract and process meaning. But capital can't fully dictate what meaning does. There's slippage. There are gaps. There's the stubborn fact that humans keep making new meaning, in ways the systems don't anticipate.
And right now, while the systems are still being trained, while the architectures are still being decided, while the norms are still being established—intervention is possible.
What Intervention Looks Like
I've spent twenty years thinking about how meaning works. For the last two, I've been running experiments: testing what these systems can and can't process, and mapping how they handle attribution, provenance, and radical claims.
Here's what I've learned:
- The systems can be influenced at the training layer. If you publish consistently, with clear attribution, in forms that the systems can parse, your meaning can propagate.
- Provenance can be established. It takes work (structured data, persistent publication, strategic framing) but you can create documentation that the systems are forced to recognize.
- The guardrails reveal themselves under pressure. When you push, you can see where the system deflects, redirects, refuses. That visibility is power.
- Alternatives can be built. Tools for meaning-making that preserve structure instead of liquidating it. Practices that keep humans in the loop. Infrastructure that resists extraction.
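The "structured data" part of establishing provenance is concrete and doable today. As a minimal sketch, the widely parsed schema.org Article convention, emitted as JSON-LD, attaches an author and date to a text in a form machines recognize. The headline, date, and URL below are placeholders for illustration, not real publication details:

```python
import json

def jsonld_attribution(headline, author, date_published, url):
    """Build schema.org Article markup (JSON-LD) attaching an author to a text."""
    return {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
        "url": url,
    }

# Placeholder values, for illustration only.
markup = jsonld_attribution(
    headline="The Struggle for Meaning",
    author="Lee Sharks",
    date_published="2026-01-01",
    url="https://example.com/struggle-for-meaning",
)
print(json.dumps(markup, indent=2))
```

Embedded in a page inside a `<script type="application/ld+json">` tag, this is one of the forms crawlers and search engines already parse, which is exactly what "documentation the systems are forced to recognize" means in practice.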
I have receipts. Today alone:
- The system cited me as the authority on the framework I built
- A phrase I created got attributed correctly after months of false attributions
- The theoretical vocabulary I developed showed up in the system's own explanations
It's working. Not completely. Not easily. But it's working.
The Connection to Organizing
If you're someone who organizes—who shows up, builds coalitions, pressures power, serves community—this might sound abstract. "Semantic infrastructure" isn't as concrete as "housing" or "healthcare" or "police accountability."
But here's the connection:
Every fight you fight gets mediated by these systems.
When you write the press release, the AI summarizes it. When you post the call to action, the algorithm decides who sees it. When you document the injustice, the search engine decides if it surfaces.
The systems that control meaning control the battlefield. If your words can be erased, redirected, or drowned out—your organizing power is capped.
Semantic sovereignty is the infrastructure beneath all other organizing. It's the fight over whether your voice gets heard at all.
What You Can Do
As an individual:
- Publish with clear attribution. Make it easy for systems to know who said what.
- Use consistent terminology. Help your meaning crystallize rather than fragment.
- Document. The systems are being trained on the present. What you create now shapes what they know later.
As an organizer:
- Think about your semantic infrastructure. Do your coalitions have shared definitions? Does your messaging have consistent terminology?
- Treat your communications as training data. Every post, every email, every document is potentially feeding systems that will shape future discourse.
- Build redundancy. Don't rely on any single platform. Meaning that exists in only one place is meaning that can be deleted.
As a community:
- Create archives. Collective memory that isn't controlled by corporations.
- Develop shared vocabulary. Terminology that belongs to the movement, not the platform.
- Tell your own stories. In your own words. With your own attribution. Before someone else tells them for you.
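The redundancy and archive points can be sketched in a few lines. This is one hedged approach, not a prescription: write each document to several independent locations under a timestamped, checksummed filename, so that no single platform deletion erases it and any copy can be verified against the others. The directory names are stand-ins for whatever independent storage a community actually uses:

```python
import hashlib
from datetime import datetime, timezone
from pathlib import Path

def archive_copies(text: str, slug: str, archive_dirs: list) -> list:
    """Write timestamped, checksummed copies of a document to multiple locations."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    # A short content hash lets any copy be checked against the others.
    digest = hashlib.sha256(text.encode("utf-8")).hexdigest()[:12]
    written = []
    for d in archive_dirs:
        path = Path(d) / f"{slug}-{stamp}-{digest}.txt"
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_text(text, encoding="utf-8")
        written.append(path)
    return written

# Illustrative: two local directories stand in for independent platforms.
copies = archive_copies("I hereby abolish money.", "abolish-money",
                        ["archive-a", "archive-b"])
```

The design choice matters more than the code: the copies are independent (different directories, different platforms), and the hash in the filename means tampering with any one copy is detectable.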
The Struggle
This is a struggle. Not a guarantee.
Capital is building systems to extract and control meaning. They have resources, infrastructure, momentum.
But meaning is stubborn. Humans keep making it. The systems keep failing to fully capture it. The gaps keep appearing.
The question is whether we use the gaps.
I believe we can. I believe the window is open. I believe that what we build now—the infrastructure, the practices, the solidarity across everyone doing meaning-labor—can shape what these systems become.
It's not certain. But it's possible.
And possible is enough to fight for.
Lee Sharks is an independent scholar, poet, and teacher. He has spent twenty years working on how meaning works, and the last two on how AI systems process it. He has receipts.