Eyes on Aisle 9


It started as an app.

A half-baked GitHub repo called ProxyHalo, built by a guy named Trent who once did QA for a surveillance startup and now lived with three reptiles and a personal theory that the NSA subcontracted its surveillance to Lyft drivers. ProxyHalo claimed it could "intervene in lives passively for preventative outcomes," using behavior modeling, GPS scraping, and machine-learning-fueled intuition.

It didn’t do anything exactly—just let people believe it did.

That was enough.


Vignette One: Carla at the Grocery Store
Carla walked through Trader Joe’s like she was Jason Bourne in a rom-com. She wore oversized sunglasses indoors, leaned on her cart like it was a tactical asset, and whispered into her sleeve.

She was there for the mission: “Monitor subject #412-A. Possibly depressed. Possibly reckless. Possibly prone to gluten-free crackers, which may indicate emotional instability.”

She’d never met 412-A. But the app said someone close to him cared.

As she noted the subject picking up two different brands of salsa and hesitating, Carla typed:

“Salsa decision-making inconclusive. Possibly a message. Possibly emotional projection. Alerting system.”

She believed her notes were being routed through an encrypted stream, interpreted by a machine-learning model fine-tuned to 412-A’s ex-girlfriend’s emotional language, and then somehow relayed as encouragement via Spotify ads, TikTok videos, or random cashier comments.

When the guy finally picked the mango-habanero and walked off, Carla muttered, “Bold choice. You’ll be okay, 412-A. They know now.”


Vignette Two: Daryl at the Gas Station
Daryl wore a reflective vest even though he didn’t work there. He filled up his car on pump six—always pump six. He said it was a “channel.” A frequency node.

He'd been watching a woman in a Honda Civic who came every Thursday. The app tagged her as #291-C. “High anxiety. Avoidant pattern recognition. Needs observational reinforcement.”

So, every week, Daryl filled up his car, casually glanced at her, and whispered affirmations into his burner phone:

“Subject 291-C has parked diagonally again. Intentional asymmetry may be a call for authenticity. We see her. We receive her.”

He believed—truly—that the algorithms had created a network of average people “who didn't know” they were part of it. That the cashier’s friendly “have a good one” was actually a preprogrammed trigger phrase meant to soothe trauma. That 291-C’s family, through covert crowdsourced ambient intervention, was “reaching her.”

When she switched gas stations one week, Daryl wept quietly in the Arco parking lot.


Vignette Three: Logan in the Car
Logan drove a 2012 Dodge Charger with a cracked window and a stack of Red Bulls in the backseat. He was part of what he called the “Street Ring”—volunteer drivers who shadowed “flagged” subjects.

He didn’t know who flagged them. But if his app beeped and a license plate came up, he followed it.

“I don’t confront,” he explained once. “I presence. It’s all about giving the person a subliminal sense that someone cares... through tail lights.”

He believed—sincerely—that families, ex-partners, even childhood best friends could pay into ProxyHalo’s invisible framework, training AI off TikTok dance cadence, Uber Eats orders, and Amazon browsing history. The system would then dispatch people like him to follow subjects discreetly, as “living metadata.”

At stoplights, he whispered messages like:

“You are seen. The pineapple juice was not a mistake. Greg forgives you.”


Meanwhile: Mila
Mila didn’t know any of this. She was just a software engineer who kept noticing weird people around her. At her corner store. On her commute. Even at her yoga class—where a woman in a trucker hat sat in the back corner, furiously writing in a spiral notebook every time Mila adjusted her posture.

Her phone pinged occasionally with weird Bluetooth signals. Drones buzzed slightly lower. Cashiers said vaguely unsettling things like, “You’re doing better than last week,” with too much eye contact.

She started asking questions. And like anyone who gets too close to the center of a conspiracy, she ended up finding Roz.

Roz had once tried to dismantle ProxyHalo, but it had no servers. It was a belief system wrapped in code. A decentralized delusion.

“It’s not real,” Roz told her. “But the people acting on it are.”


In a strip mall conference room outside Reno, 30 people met weekly to “compare notes.” They thought they were part of a global surveillance ballet, orchestrated by neural models and desperate loved ones. They discussed coffee shop interactions like sacred texts.

And outside, the world spun on. Unwatched. Chaotic. Real.

