Every day we surrender a little more control to digital systems we barely understand. We're not just scrolling; we're slipping into a future built from data we willingly give away.
It starts with a notification that goes unnoticed—an update, a trending hashtag, another tech billionaire casually suggesting they're about to change the world. Nothing shocking. Nothing out of the ordinary.
Until it is.
Welcome to Project X. I’m your Narrator.
What follows is reconstructed from case transcripts, witness statements, and leaked classroom surveillance. The systems are real. The faces have been changed.
The consequences began long before anyone noticed.
Acceptable Losses
The first time I met Amelia, she was teaching a digital encryption seminar at Berkeley Law. I was there to give a guest lecture on privacy rights in the age of algorithmic justice—my third courtroom victory that month against Mr. StarNet himself had made me the go-to expert.
Her students watched me dissect the Silicon Valley doctrine of "acceptable losses" while I built the argument that would later become the cornerstone of the Digital Rights Defense Act.
After class, we talked until the janitor kicked us out. She understood what most technologists couldn't – that law was becoming code, and code was becoming law. That night, over drinks that turned into dinner that turned into breakfast, we drafted our manifesto on a coffee-stained napkin: "Digital liberation through technological revolution."
We were so fucking naive…
The lecture hall smelled of chalk dust and stale coffee. Late afternoon sun slanted through venetian blinds, casting prison-bar shadows across two hundred seats, all of them empty except the front row, where twelve students leaned forward, laptops closed, eyes fixed on me as I dismantled their digital world.
"Show of hands," I said, chalk dust catching the afternoon light. "Who here has signed up for StarNet's newest update?"
Eight hands rose tentatively.
"Keep them up if you've read the terms of service."
Seven hands dropped.
"Interesting." I smiled, but there was no warmth in it. "Now, who can tell me what happens to your data when you're accused of a crime?"
The lone hand wavered, then fell.
From the back of the room, Professor Amelia Barker watched, her encryption lecture notes forgotten on the desk. Something in her posture told me she'd stopped preparing the moment I started speaking.
"Your search history. Your location data. The midnight anxiety your fitness tracker logs as 'sleep disturbance.' Even your heart rate during sex." My chalk scratched three words across the board:
COLLECTED, ANALYZED, WEAPONIZED.
I underlined WEAPONIZED three times.
"What happens when the algorithm decides your elevated heart rate means guilt? When your digital footprint becomes evidence? When a black box AI determines your sentence?"
A student in wire-rim glasses raised his hand. "But don't they need probable cause?"
"Do they?" My chalk hovered. "Check page 47 of StarNet's terms of service. The one you didn't read. Section 12, subsection C: 'User data may be shared with law enforcement partners for system optimization and public safety purposes.'"
"That's... vague," someone muttered.
"Exactly. Vagueness is their weapon. Complexity is their shield. They bury injustice in language so dense it might as well be code. Relying on private algorithms to determine the fates of human lives has never worked, and more innocent people die to algorithms per day than to disease anymore. But that’s a whole other story."
I caught Amelia nodding.
The room had grown so quiet I could hear the buzz of the fluorescent lights.
"They call it 'acceptable losses,'" I said, writing the phrase in quotes. "The people caught in the crossfire. Lives ruined by 'algorithmic efficiency.' The casualties of progress. But acceptable to whom?"
The bell rang, making everyone jump. Students blinked like they were coming out of a trance, gathering their things in slow motion. I watched them file out, each carrying a seed of doubt about the digital justice system they'd taken for granted.
Amelia approached the board where I was erasing evidence of this nightmare dystopia we are training children to survive.
"Interesting lecture. Especially the part about weaponizing code against its creators."
I turned, eraser dust coating my sleeve. "Thank you. I've seen a thing or two."
"The privacy rights lawyer who made their AI confess bias in open court?" She smiled. "I've read your briefs. All of them."
"Done your homework, I see."
"When opportunity walks into my classroom? Always." Her smile widened. "Buy me a coffee?"
"I have a better idea,” I glanced at my watch. “Know any bars where the cameras don't work?"
Her eyes lit up. "As a matter of fact, I do."
Neither of us knew we'd just lit a fuse that would take years to blow up in our faces.
"Acceptable losses" was only the beginning.
The Digital Liberation continues in PART TWO →
[LINK COMING SOON]
Until the next story,
Your Narrator
To see the entire Project X series, click HERE.
New? Start by clicking the link down below.
So You’ve Fallen Into the Void: A Survival FAQ
Welcome. If you’re confused—you’re not alone. This kind of interactive fiction wasn’t part of the original plan, but I don’t make the rules in the Void.
Want to return to the crossroads? Click the link below.