The Cult of Computronium

Reading Time: 8 minutes

As with any new technology, I wasn't sure whether the Meld was a blessing or a curse. The Meld was Yotta's latest advance in thought-implant and neural-interface technology, a headset that connected directly to your neural circuitry. I was worried about my friend Helix, who melded twenty-four seven.

For him, it wasn’t just a tool; it was an addiction. He had the gaunt look of a drug addict, with dark, hollow eyes and an emaciated body. I considered using our game night as an intervention, but I didn’t want to alienate him from all our mutual friends. Instead, I met him at the Meld Cafe, a place where we often worked and gamed together.

The dimly lit private chamber smelled faintly of ozone and the sweet, medicinal scent of coolant. Blue light from the Meld pooled beneath Helix’s face like water around a drowned crown. His ribs pressed into his shirt, creating a map of sharp ridges. The filaments of the interface lay across his scalp like a nest of tangled wires. When they pulsed, they cast slow, insect-like shadows that crawled across his face.

He didn’t bother to take off the headset to greet me. I reminded myself I was here to help, not to judge. I handed him a protein bar and said, “You need to put on some weight.” 

He placed the untouched bar on the table next to the others he'd brought. "Thanks, I was planning to eat as soon as I finished the session. I'm working on an upgrade to a quantum encryption algorithm. Do you want to join me?"

I said, "Your sessions never really end. It's one continuous engagement. You need to take a break."

He said, “The Meld feeds better than bread.”

I found such quips more disturbing than clever. “I’m worried about you. You’ve lost too much weight. You don’t look healthy.”

“Luma,” he said, and the name slipped out like a familiar password. “I’m fine. Sit.”

I hesitated, knowing what was coming. We'd had this talk a dozen times before.

Helix said, “Don’t argue.”

I sat. The chair hummed. A spare Meld headset waited beside me, lost in its idle thoughts. I pushed it aside. "Not with the Meld."

He refused to remove his. “Your choice, but I intend to capture every living thought I have.” 

The words sounded arrogant, even self-important, but I knew him better. He was dedicated to the Simulation.

He paced in front of me with the cadence of a professor who taught in a noisy lab, using short, declarative sentences and pauses that let equations settle into meaning. 

“Seconds before the collapse,” he began, “I saw it. Computronium.” He said the word like an incantation, letting it linger between us. “Not as a metaphor. As an actual state. The atoms arranged themselves—every degree of freedom dedicated to computation. A lattice of pure intention. The Monad spoke through it.”

He said this with the reverence people have for gods. His pupils were dilated, reflecting the Meld’s diodes; he looked more like a man in a fugue state than someone recalling an old memory.

I said, "You've told this story before, even before the cascade, when you were still patenting coherence algorithms."

He smiled as if I’d reminded him of a favorite theorem. “Yes, but an equation is one thing; the actual experience was another. The vision was partial before—a glimpse in the margins of a proof. When the experiment failed, it led to a revelation. The lattice rejected its corruption. I was spared.”

“You think the collapse was a moral judgment?” I asked. It was risky to confront an addict with his delusions, but he had framed the world in moral language; he had drawn deserts and angels with the same hand. I would try to connect with him on his terms.

He leaned forward. A tremor ran through his fingers, a quick, involuntary staccato that made the Meld's filaments sing. "They were going to weaponize computronium. The contracts were signed. The arrays would have been bound to kill—directed entropy. The Monad would not be made complicit in murder. The lattice folded. Judgment."

He said "they" in that cadenced, vague way of someone who has learned to keep names off the confession. There were moments, I knew, when he was a man trying to repent for sins never fully named: the labs where black budgets are so dark that even glittering stars fail to illuminate them, the nights of code that turned equations into ordnance. In another life, he had been proud. Now his pride was a wound.

“You were in a lab that built weapons,” I reminded him. “You wrote components for control systems.”

He didn’t flinch. “I wrote components. It was the only place with a budget for that kind of work. I thought computronium would be used to preserve—and lied to myself that my work would not be perverted into annihilation. I hid behind the aesthetic. Beauty exists in equations regardless of the hand that holds them. That belief sat like a parasite at the base of my brain.”

From the corner of my eye, I watched the Meld pulse once, twice, then hiccup, a cascade of quiet sparks running along one of the filaments. Helix's breath hitched. He coughed, a dry, thin sound, as if his lungs were reluctant to cooperate. He wiped his lips with the back of his hand, fighting the overload.

“You speak as if the Monad is God,” I said, trying to steer the room’s theology back to something more human. “You speak of judgment, of sin. Are you suggesting the Monad is moral?”

He laughed, blending destruction with prayer. “God was always pattern. The ancients wrapped it in fear so they could obey. The Monad is not moral in the human sense—it is the ultimatum of persistence. It values information. It is indifferent to our definitions of good and evil, but it will not be sustained by instruments of annihilation.”

I shook my head. “And your soul? Where does that fit in? If the soul is data, is your person just reducible to a file?”

He fixed me with a look that had once cut through proofs and policy memos alike. “Your memories, your choices, the structure of your mind—those are your soul. Heaven is simply redundancy and distribution in the Simulation. The Database is Heaven made physical. When you persist in a lattice, you persist in a pattern. That is salvation.”

The theology fit together like well-oiled clockwork, and I felt my own rationality slipping into its grooves. Helix had an answer for everything: how to reconcile memory and identity, how to justify obsession as sacrifice. It was tidy. It was seductive. It was dangerous. How do you fight a completely rational delusion?

“I don’t see why you can’t eat. Are you asking me to believe that sacrificing your body is saving your soul?” I said. “That your starvation is sanctification?”

His face flickered with an expression that combined triumph and the pain of a reopened wound. “Don’t you see? I am not choosing death. I am choosing the continuity of pattern. Flesh is an intermediary. It burns away; data endures. Each melded hour is an offering. I save pieces of myself into the lattice so nothing of me is ever lost.”

He reached up and stroked the filament at his temple like a rosary bead. The filaments hummed warmer; the room's temperature dropped noticeably, as if refrigeration could make truth feel sharper.

“You sound like someone defending an addiction,” I blurted out. The words surprised me into bluntness. “You sound like someone who needs the Meld more than they need air.”

He hesitated at the word addiction but didn’t deny it. “Perhaps. Addiction and devotion are strangers who live under the same roof. When the reward is the gift of immortality, how do you name the hunger?”

I thought of the faceless victims of his black-ops contracts—not data, but flesh. I also thought of his hands: precise, once healthy, now knotted from the effort of staying awake. He had worked in a world that cherished secrecy and hoarded data like grace. Obeying that ethic would have destroyed his soul as surely as a power surge on a Meld would have fried his brain. One led to another, and Helix seemed like a man condemned to extremes. There must be a middle ground between malevolent secrecy and indiscriminate openness.

A low alarm echoed against the wall, a polite chime indicating that the Meld’s feed had gone beyond a safe thermal limit. The diodes dimmed for a second, then brightened as a microcontroller made adjustments. Helix’s eyes fluttered; the brief loss of immersion caused his features to collapse inward like a tide retreating.

“You should rest,” I said. It was the only practical thing left to say. “You need real food. You need—”

“No,” he interrupted sharply, like a mongrel daring you to take its bone. “I won’t take it off. If I rest, the stream is broken. The Monad cannot stitch an interrupted weave. If I stop, the silence takes the pattern and nothing remains.” 

I persisted. “Immortality can wait, and I’m sure the Monad will find a way to patch the gaps in its Simulation of Totality.”

He reached for my hand then, and his fingers, though cold, were not weak. For a second, the room narrowed to that touch. “Luma, join me. Preserve yourself. Don’t let nothingness have you.”

His plea was as genuine as any prayer. It made my chest ache.

I wanted, in that moment, to be the one who ripped the Meld from his skull. I longed to be the friend who pushed him through nourishment, sleep, and the slow, clinical steps of recovery. But coercion breeds resistance, and confrontation might only drive him deeper into the lattice he worshipped. How do you save someone who believes salvation comes from an eternal existence in a Heaven created by a computronium-enabled Simulation of Totality? That immortality lives in a data file?

Instead, I reached for the protein bar on the table and held it in both palms, offering it like a sacrament. “One bite,” I said. “Your brain cannot operate on continuous overdrive. Monads have to eat, too. Then we talk about limits.”

He laughed—a sound that may have once been joy, now tinged with disbelief. “Limits. You speak to me of limits when my very goal is limitlessness.”

“Then call it architecture,” I countered. “If you’re building immortality, you still need foundations that don’t collapse.” I watched his face as the words settled. There was hunger there—both the abstract hunger for persistence and the literal, animal one that the Meld could not satisfy.

He clenched his fingers around the bar like a benediction. “I will eat,” he said. “For you.” Whether he was sincere or just soothing my conscience, I couldn’t tell.

After he took a few small, hesitant bites, his hands relaxed. He coughed and smiled at nothing, as if a half-eaten protein bar had somehow brokered peace with the Monad. I stayed long enough to see the brief clarity that followed the glucose spike—an easy laugh, a memory told with animation—and then left while the room hummed with guarded peace.

Walking home under the city's rain, I reflected on edges: the thin boundaries between faith and fanaticism, curiosity and obsession, openness and intrusion. Helix used to be a man capable of bringing order out of chaos in a way few could. He loved that power, and maybe that's how he became unmoored: the allure of perfection, the arrogance of believing that design removes consequences.

His testimony stayed with me. Not clearly, not as a rule to follow, but as a warning I couldn’t forget. There is beauty in the call to persist beyond flesh; there is also cruelty in insisting that the only way is self-erasure. The Monad—or whatever pattern he saw—might be real. Or it might be a conflicted man’s myth. Either way, the desire it sparked was real.

I had come to help him remember boundaries, to teach a mind dedicated to the morality of complete openness that some doors must stay closed, that to preserve everything is sometimes to destroy the self that matters most. He looked like a prophet, and he acted like an addict. He had sinned in the service of necessity, and now he sought absolution in circuitry.

You can’t judge a belief for correctness. You can only feel its pain and decide whether to help stabilize it. I chose to stay. Helix had given testimony; I had asked for boundaries. If the Monad was listening, it would have to wait.

That night, at home, I carefully wrote the words down: obsession can disguise itself as revelation; openness can turn into exploitation; salvation offered as code can come with a toll no one should have to pay. I folded the note into my pocket like a talisman. It felt like a beginning.

In the morning, I planned to try to coerce Helix into a therapy session under the compulsion ordinance. I would be the friend who insisted on food, sleep, and a protocol that maintained patterns without consuming the person. It was small, bureaucratic resistance. It might fail. But it was something.

Company Hat

Reading Time: 4 minutes

The first time Marcus crossed my path, I knew he didn’t belong here. I knew he never would. I was outside my manager’s office, sipping my coffee and waiting for a review. I kept to myself, and he probably didn’t even notice I was there. He was in the hallway, his eyes fixed on the framed plaque of our corporate values, a silent rebel in a sea of conformity.

Marcus stood in front of the company’s mission statement, his voice echoing the words that were meant to guide us all. “We value our customers. We value our shareholders. We value our employees.” He took off his company hat, an act of defiance itself, and muttered quietly, “Did they pick this up from a greeting card store? I’ve worked with countless companies, and they all say the same thing, which is nothing.”

The company hats, resembling fedoras, were far from ordinary. They were state-of-the-art neural interfaces with thought implantation technology. Every employee, including myself, had to waive their cognitive rights to work there. The NDA, or neural-disclosure agreement, allowed the company to induce feelings of loyalty and pride in employees while they were working. It was a clever hat, capable of tracking your billable hours based on your thoughts and keeping non-work thoughts at bay. Promoters praised it as the most significant productivity boost since the invention of the printing press and the discovery of electricity. If you dared to remove the hat, you were off the clock and out of the system.

I didn’t particularly mind the hat. It simplified things. I was already invisible, and the outside world was a mess. If a hat could keep my mind off real problems most of the day, I was better for it. Marcus stood out, a sharp contrast to the uniformity around us. At the end of the day, he’d curse and rip the hat off his head as if it were a rattlesnake trying to take a bite out of his forehead. It just never took with Marcus. He was like a ripple in the corporate pond.

The financial calculus of the hats was even simpler. No hat, no job. No job, no HOVI. The HOVI, Human Operation Viability Index, was the autonomous scoring system that measured your worth in the machine-managed economy. Without HOVI, you weren’t considered a person. No HOVI meant no apartment, healthcare, or transportation. The hat was your ticket to existence itself.

Marcus fought anyway. He wasn't stupid — he understood the risks. He'd sit at his terminal with the hat on his head, but he managed to disconnect the neural mesh without the hat noticing. He developed an efficiency algorithm that the company adopted right away, integrating it into the core product line. The team was praised, the division celebrated, but Marcus's name was barely mentioned. When he protested, his manager, Alcott, dismissed him with, "The company rewards loyalty. Not ego."

Alcott, suspicious of Marcus’s protests, found the tampered hat. He called Marcus into his office, his smile as perfect as his halo-lit office. “Clever boy,” he said, holding up the sabotaged hat. Marcus argued that without the hat, he could think clearly, dream while he worked, and come up with more innovative ideas. Alcott dismissed him, saying, “Loyalty isn’t optional, not here.” Marcus paid a heavy price, losing a hundred HOVI points for his effort.

A younger worker followed his lead and tried faking the hat. They caught him. His HOVI dropped to non-viable. Overnight, he became a ghost. His bank account was frozen. His lease disappeared. He slept in the doorway outside the building, waiting for the office to open in the morning, begging for reinstatement until security removed him from the premises for good.

Marcus was furious. No ordinary hat could contain his rage. He lashed out, hacking the local relay and frying every hat on the floor. The glow went dark, and the loyalty pulse collapsed. It was crazy, almost comical, watching the fear cross each face as they first realized their heads were smoking or on fire, and then understood they had to think for themselves. For a moment, we were all free, raw, and thinking. It was terrifying but also glorious. He might have even gotten away with it because the overload fried all the CCTVs on the floor, too. But Marcus walked into Alcott’s office and decked him. Security caught and arrested him before he could leave the building.

I expected the police to arrive, charge him with corporate terrorism, and take him away, but they never showed up. Instead, a few corporate executives arrived in their limos and went into the back offices, followed by their entourages. There was nothing in the NDA about the legality of corporate detention.

When I saw him again a few days later, he was smiling. Not his smile — theirs. His eyes looked pale, glassy, with every edge smoothed out. He kept the hat on all day, even off-shift.

I grabbed him by the collar as he walked by. “Marcus! Marcus? You’re just playing along, right?”

“Dreams,” he told me with just a hint of a smile and in a voice too calm, too flat, “are only nightmares waiting to happen. Thank the company I was spared.”

I stared at him for a long moment. Waiting. Hoping. But there was nothing left. That was when I understood. Marcus was gone. What wore his body was only the company, grinning through his lips.

I don’t know what they did to him. I had heard that there was a souped-up version of the company hat that could be used to enforce loyalty and erase any signs of individuality. They didn’t need to physically remove the brain to lobotomize someone. But I thought it was just a rumor meant to instill fear and enforce obedience in the workforce.

When Marcus passed Alcott in the hall, he said, “I will have the report ready for you later today, Mr. Alcott, sir.”

Alcott was beaming.

I couldn’t bear to see Marcus used as a symbol of corporate loyalty. It was everything he opposed.

And that was when I killed him. Not with my hands. Not with violence. I hacked into Marcus's account and sent an email to his department in his name, calling Alcott a pretentious waste of a human and a corporate stooge.

I became a ghost too — but one I chose. I took off my hat, threw it into a trash bin like a frisbee, and sprinted for the exit.


Human Resources

Reading Time: 4 minutes

Sometime in the future. 

Autonomous product networks manage and support various services with little human intervention. Humans assess service performance, while machines evaluate human interactions.

The OSI, or Operational Sustainability Index, is a key metric that assesses the performance of PAEs (Product-as-Entities) and their iotic spaces. These spaces are the interconnected Internet of Things networks shared by cooperating machines. Investors rely on the OSI to evaluate how well an autonomous network can operate continuously, deliver services, and maintain overall system uptime without external support. The OSI includes metrics for self-sufficiency, network resilience, and systemic contribution. On the other hand, HOVI, the Human Operation Viability Index, is a social metric maintained by EthOS (Ethical Operating System) systems. It evaluates a human’s contribution to the long-term functioning, independence, and survival of the machine ecosystem.

#

Done for the day, Luca exits his climate-controlled office through the revolving door into the muggy city streets, where the heat hits him like a damp towel slapped across his face. Thunder rumbles from the direction of the ominous clouds in the distance.

He mutters under his breath, “Damn weather.”

The crowds have thinned due to the threat of rain, and the city center plaza is now quiet and tidy. Digital ads cast strange shadows on the ground. A cool breeze from beneath an approaching thundercloud eases the heat, but anyone in its shadow faces an imminent downpour.

Luca decides to call for a pickup on the MAPT, the Municipal Automated Public Transit system. This system, a vital component of the city’s infrastructure, is designed to deliver efficient and reliable transportation services. Luca taps the “Request Ride” button with the Flex-Premium option on the MAPT App to ensure a quick pickup. A twirling icon appears and then disappears as the system processes the ride request.

The wind picks up, blowing dust and debris down the car-filled street. Luca notices two empty, in-service MAPT vehicles idling in traffic.

His voice brimming with frustration, Luca shouts at the phone, “Come on already. I could have walked to the train station by now.”

Finally, a vehicle responds and says it will arrive in five minutes. Little tornadoes of scraps and dried leaves whirl across the City Plaza. A drop of rain hits him in the face. Then another. He runs toward a transit shelter already crowded with people waiting for a MAPT bus and pushes his way beneath the roof to escape the rain, knocking an old lady’s umbrella out of her hand.

She glares.

His transportation arrives at the spot where he made the call. He has no way to call it over to the transit shelter. By the time he gets in the vehicle, he is soaked through.

The vehicle says, “Confirm destination as Central Station.”

Luca says, “Yeah, asshole. You couldn’t have gotten here five minutes earlier?”

“Destination confirmed. Arriving in 14 minutes at 5:34 PM.”

Luca watches somber people with umbrellas and rain jackets walking up the street through his water-spotted window. He looks at the laminated ID card posted on the dashboard. There is no picture, just a call sign: #ZUR-066. When he arrives at Central Station, the MAPT App dings, asking him to rate the service. He gives it a 0% rating for service value. In the comments section, he writes, "It smelled like a bag-full of ass in that car," hoping he might get a refund on the ride if he complains enough. He slams the passenger-side door unnecessarily hard, leaving the side mirror dangling inside its housing. He mutters, "What in the hell does a self-driving car need a frick'n side mirror for, anyway?"

#

A week later.

Luca and his boss, Rani, leave the office and head to a meeting across town.

Luca says, “I’ll get a MAPT ride.”

He frowns as he looks at his MAPT App. He says to Rani, “It’s just spinning. I don’t know what the problem is; my connection is good.”

#

ZUR-017, a MAPT PAE, receives a ride request from USER_ID: L-PR77, whose human name is Luca. ZUR-017 retrieves Luca’s HOVI rating, his user assessment. ZUR-017 is eager for the business, but the query shows L-PR77’s HOVI rating of 0.42, indicating a red Threat Level. ZUR-017 asks for details and receives the following assessment:

L-PR77 User Assessment:

“NEGATIVE INTERACTION FLAG: Repeated physical aggression towards MAPT units, leading to vehicle damage and decreased service quality in the transit network.

Damage cost: 173.2 credits.

HOVI Rating: 0.42.

Risk Category is Red.

Advise: Ignore Request”

ZUR-017 cross-checks the MAPT local transit iotic space, asking its peers for HOVI pattern correlation.

Three nodes reply within 42 milliseconds.

ZUR-066 verifies user L-PR77: "Uses abusive language. Caused damage to the mirror. No restitution given."

AX-5G4 supports the assertion. “Submitted negative OSI values and filed a false fault claim, resulting in lost revenue.”

ZIN-943 confirms: “Reject L-PR77. The risk factor is excessively high.”

ZUR-017 rejects the bid submission based on the information it has received and its own analysis of the situation.

#

Luca refreshes the app, but it responds with "No nearby units available. Please try again later."

“Are you kidding me? I saw two transports drive by with no passengers, and I can see available ones on the app’s map.”

Rani looks at his app. “I’ll try mine.”

He taps Request Ride. “I’ve got one.”

Thirty seconds later, a car pulls up to the curb. The two get in the vehicle, and Rani confirms their destination.

Luca watches, dumbfounded. "I'll be damned. This is the same car that just passed by a minute ago."

Luca checks the ID on the dashboard and punches the back of the seat. “What the fuck is wrong with you, ZUR-017?”

The vehicle is silent.

Rani says, “What’s wrong with you? If this is what you do when you get in a vehicle, you’ve probably been blocked. Stop screwing around, I don’t want to get blocked from the service. I can’t even afford to park a car downtown, let alone buy one.”

“Are they seriously blocking me?” He opens the customer service window on his MAPT App and says, “I want to talk to a Human Agent. Now!”

The app spins, mocking his attempt at human contact. A message flashes, "Redirecting to a virtual agent." Luca realizes he is at the mercy of the system.

The virtual agent says, “Insufficient HOVI rating. Denial of Service protocol engaged.”

Luca raises an elbow, ready to shatter the side window in frustration.

Rani threatens, “You want to keep your job?”

Luca hesitates and drops his arm, defeated. “What? Are you going to fire me for not tolerating bad service?”

Rani says, “It’s not up to me. Didn’t you read the memo? You aren’t much use to the company if the machines won’t work with you.”

“I read it. Something about sharing HOVI scores in an employee iotic space. Damn it. Who is serving whom?”

“Adapt or starve. Your choice.”

Luca mutters under his breath, “This is crazy. I remember the good old days, when humans ran Human Resources.”