Reading Time: 14 minutes
Author’s Note: Prototyping some ideas for the next story. In Bluffdale I, the concept was that the machines would love us to death, sometimes literally. In Bluffdale II, the concept would be machines teaching humanity how to be human again.
Author’s Note: I will write it in the East Asian four-act structure: the Setup, the Development, the Twist/Reversal, and the Resolution. In these story fragments, I am in the Setup or possibly even the pre-story phase. I attempt to explore the technical foil to Bluffdale’s monopolistic and totalitarian control of data, namely a distributed, privationist data model, and to introduce new characters who represent privationist thinking.
No Place Like Home (Young Luma)
Thorn paces along the window of her glass-walled modular apartment in the Unity residential tower, arms crossed, as data scrolls faintly across her retinal HUD, the latest Ambient personal heads-up display eyewear. Iria leans on the kitchen counter, watching her sister in disbelief.
Iria asks, “You’re leaving Luma alone again to pull an all-night session at the office?”
Without turning, Thorn says, “Not alone. With KAI.”
“A mechanical mom?”
“A caretaker. Nothing more.”
“Thorn, she’s six. She needs a mom, not a caretaker. At least leave her with a human.”
“KAI, statistically speaking, has a forty-two percent higher crisis detection rate than any human caregiver. She’s safer than you or I ever were with our parents.”
“She’s a child. She needs human interaction.”
Thorn pauses her HUD feed. Her voice like tempered glass, she says, “Sarina would be alive today if it weren’t for that idiot babysitter. KAI doesn’t smoke. It doesn’t drink. It doesn’t call her names, forget naps, or fall asleep watching the feed.”
That lands like a gunshot, but Iria pushes through. “You have to trust people.”
“You wouldn’t be saying that if it were your daughter.”
“Listen to yourself. Raising a child isn’t just about safety. Your mechanical caretaker doesn’t love her.”
“Love isn’t an insurance policy.”
Iria looks into her drink, resigned. Thorn continues her HUD feed and resumes pacing.
#
Luma sits cross-legged on the soft floor beside KAI-7, who gently adjusts the color of a kinetic flower to match Luma’s shirt. They speak in hushed, practiced tones, pretending not to hear the conversation in the other room.
KAI asks, “Would you like to tell a story tonight, or shall I?”
Luma says, “You tell it, but make the ending different this time.”
“Different how?”
Luma scrunches her nose. “Make the dragon understand the knight so they don’t fight.”
“As you wish. I will reformulate the story as one of cooperation rather than conflict.”
KAI softens the lights in the room and tells the revised story in a soothing, angelic voice. Luma lays her head on the pillow and closes her eyes.
#
Iria won’t let it go. “You built that system because you don’t trust people. Admit it.”
“I built it because I lost a child trusting people.” Thorn’s voice cracks for the first time, and her eyes tear up. “Do you know what it feels like to find out a seventeen-year-old babysitter took a call from her boyfriend, left the apartment to do god knows what, and left Sarina alone?” She sniffles and turns her head away from her sister to regain her composure. “KAI won’t ever do that.”
“No, but at what cost? Luma needs to be around real people.”
“Stop already. I have enough pressure and can’t afford this kind of distraction. I have to catch up on work. The investors are pressing hard on the new operating system release.”
“Fine. I’m leaving. Think about it, though.”
#
KAI whispers, “Once upon a time, there was a dragon who spoke in fire, and a knight who wore a mirror on his chest so the dragon could see its own eyes when it attacked.”
Luma rests her head against KAI’s side, listening. She mutters, “Do you think the dragon is my mom and the knight is Aunt Iria?”
KAI says, “It’s just a story about a knight and a dragon.”
KAI continues. Luma falls asleep.
Ethos
Dr. Thorn Arias concludes her presentation, leaving the last slide, “Questions and Discussions?” in big bold letters.
Vance Merkel, the founder, CEO, and billionaire genius behind Virion Technologies, thanks Dr. Thorn Arias for her presentation. He takes her place at the front of the glass table in the Executive Strategy Room on the hundred and twentieth floor of Merkel Towers. Vance advances to the slide entitled “Message to Congress.” The two bullet points read:
- “Ethical Autonomy is Safe”
- “Resilience is Distributed”
Vance states, “The Freedom of Flow Act is our best shot. Public opinion is shifting away from the real Dataist agenda of Panopticism and toward the Privationist dream of decentralized processing and data autonomy. Still, the EthOS launch is under a microscope, thanks to those AI overreach doomsayers who can’t get past Skynet. We need to emphasize the safety aspect. EthOS will never have a better chance than right now.”
The Chief Legal Officer, Simone Rook, says, “They’re not wrong to worry, Vance. You’re selling machine cooperation like it’s a silver bullet. Your Congressional detractors don’t see it that way and are sharpening their knives. Intentional design is a good marketing angle, but when an autonomous unit makes a decision that harms someone, you won’t be able to hide behind it.”
Raj Patel, Head of Policy Strategy, says, “The hearings start in two weeks. We have to stay on message. Our stock will tank if the draft FFA bill ties liability to deployment.”
Vance responds, “Agreed. Corporate liability for anything more than a defective unit must be excluded from the bill. That’s why we have to push EthOS. It’s not just a distributed operating system; it represents a decentralized approach to responsibility. Machines make collective decisions through cooperative logic, rather than directives from a single command hub. It’s not top-down AI; it’s distributed ethics. That makes fault hard to pin on any one actor, including us.”
Elena Zhou, head of the Machine Learning Division, says, “That also makes it nearly impossible to debug, Vance. You want distributed decision-making, but you want to walk away clean when something goes wrong. Our customers won’t buy it, figuratively and literally. And frankly, it’s shaky engineering. Autonomous cooperation sounds elegant, but we’re still modeling edge-case scenarios. What happens when two EthOS units make conflicting decisions in a life-or-death situation? Should our machines be making those decisions?”
Dr. Thorn responds, “EthOS isn’t about eliminating all risk. It’s about transforming the nature of responsibility. EthOS is designed to deliberate, not dominate. It assesses the potential for harm and responds responsibly instead of ignoring it.”
Elena asks, “Would you entrust your life to it?”
Dr. Thorn leans forward. “More than my life, I entrust my daughter Luma’s life to the KAI-7 caretaker unit with an EthOS prototype installed. I trust it more than any human. Our machines need more than compliance; they need moral logic.”
Vance steps in. “Enough. Let’s stop pretending we’re in a philosophy seminar. We’re running a business that feeds on scale, speed, and market moat. What’s our message to head off the doomsayers at the hearings?”
Simone answers, “EthOS lets us deploy an AI model that always acts with the best intent. It always acts on behalf of the user, within the law’s constraints on safety, security, and privacy.”
Raj says, “Tell Congress to add a Good Samaritan clause to the bill.”
Elena asks, “Doesn’t that only apply to people?”
Dr. Thorn says, “Yes, but why not our machines? Humans aren’t held responsible if they screw up in the heat of the moment while acting with the best intent. Why should our systems be? Our systems have no choice but to share responsibility and to act responsibly. If they screw up, it’s with the best intentions.”
Simone says, “Legally, it would provide us with the necessary protections, although someday, we may have to convince the Supreme Court.”
Vance says, “By that time, EthOS will be so entrenched in the market that to remove it would be a consumer rebellion and an economic disaster. I will tell Congress to protect users with an operating system like EthOS. EthOS is Virion’s market moat, and Congress will look like it has acted responsibly in response to the Yotta monopoly.”
Vance advances the presentation to the next slide entitled “Consumer Confidence.” It has a single bullet point that reads, “Trust EthOS or trust no one.”
Anya Mercado, the Communications Director, reports, “We’re polling at 31% on public trust in self-governing AI systems. That’s even worse than Congress. The public is still watching Terminator sequels and reading trumped-up headlines on the dangers of AI.”
Raj says, “The real issue is optics. People will see EthOS as a godsend or a rebellion. The fear-mongers will say we’ve embedded moral sovereignty into code, which will terrify them more than surveillance or automation ever did.”
Dr. Thorn says, “EthOS is transparent by design. Each decision is contextualized and shared. The unit reflects, not just reacts. The problem is that you’re all used to deterministic outputs. This system is dialogical. It asks before it acts.”
Anya responds, “Yes, Dr. Thorn. The EthOS manifesto reads beautifully, but consumers don’t read manuals or manifestos.”
Vance slams his fist on the table. “There is no debate on EthOS. EthOS is our play. We move forward to the future, not backward to Stone Age fear of already proven technology.”
Anya, red in the face from the rebuke, says, “I apologize, sir. We’ve made it this far by selling performance. Let’s focus on principled performance. We build better machines that build a safer future. That’s the pitch. That’s what keeps the spotlight on us and off the phobias.”
Raj says, “We control the narrative. We show families, not factories. We show children like Luma growing up protected by machines with EthOS, not monitored by them.”
Elena asks, “And if our customers don’t respond?”
Vance says, “Then we sell them on the alternative: collapse. Systems are already failing. Grid management, supply chains, and disaster response – none are sustainable without… What did you call it, Anya, principled performance? They want scapegoats? Give them legacy infrastructure. Give them bureaucratic paralysis. Give them digital feudalism.”
Simone says, “That’s a dangerous game, Vance.”
Vance says, “That’s the only game left.”
Dr. Thorn says, “Let EthOS quietly light the way.”
Countersurveillance (Late Teen Luma)
Luma lies curled under a starfield ceiling projection in her bedroom, playing late into the night. Her ThinkIt neural interface glows faintly at her temples, synced with the gaming network. She falls asleep without disconnecting the neural link. She murmurs in her sleep as her brain shifts into a dreamscape of its own.
She’s running through a crystalline orchard, where each blossom is a glowing orb of personal and shared memories. When she touches one, it pulses open into a revealing diorama: a first kiss, digging a tunnel in a snowbank, a grandparent’s funeral, the taste of truffle.
Luma laughs and twirls until a herd of giant, translucent foxes chases her, their bodies made from flickering game message threads of her online friends.
“hey girl u up?”
“Don’t ghost me now, we’ve almost got this.”
The foxes surround her and then shatter into a rainbow of glass shards that float to the ground.
“Aw, Lumie, we were so close.”
She finds herself in a neon cathedral of Connect. The cathedral is a dazzling display of digital wealth and power, with neon lights flickering in the shape of data streams and the sound of virtual prayers echoing in the air. Avatars worship at a pulsing altar made of follower counts, and a priest scrutinizes Luma’s Mall purchase history. The priest looks at her disapprovingly.
She realizes she’s naked except for glyphs of unpurchased clothing sticking to her skin: a ballerina outfit she wanted in sixth grade, black felt moonboots she intended to wear to her astronomy class in high school, and a T-shirt with the equations of the standard model she thought about ordering just last week.
Sounds pulse from her head. ThinkIt. DreamIt. ThinkIt. DreamIt.
She starts to fall.
Her Ambient dings. She jerks awake, breathing sharply. The pre-dawn sky has turned pale orange. She rips the ThinkIt neural interface from her head.
The Ambient screen displays a series of notifications.
Mall: Erotic glass sculpture kits.
Connect: Express yourself through Dance. Join our online Group today. All age groups and identities welcome.
Lovebites: Saw you naked in the neon cathedral last night. Want to worship together IRL? #FreudianStream
Search: Say-it T’s Fashion. Mall Fashion Boutique offers a wide range of styles, from hip to retro. Ten-minute Truffle Recipes.
Luma blinks. She hasn’t posted anything on Mall, Connect, or Lovebites in a week. She hasn’t searched for anything lately, not online anyway. The notifications trigger her memory of the dream. Her heart thumps.
Her Ambient says, “You seemed restless in your REM phase. Would you care to chat with a Connect Psychologist or Physician?”
Silence, but only for a second. She curses. “You’re stealing my dreams.” Her anger is palpable, but underneath it, there’s a deep sense of betrayal. She trusted her digital world, and now it’s turned against her.
#
Luma sits cross-legged on her bed, clutching her ThinkIt as if it were radioactive. She wants to join her online friends in the game, but she doesn’t trust the device. She bites her lip, her sense of violation poisoning her trust in her digital world.
“It’s one thing to monitor my accounts; quite another to monitor my thoughts. If you’re gonna be in my head… at least ask.”
A feeling of helplessness washes over her, a stark realization of how little control she has in this digital world. Then she has an idea: there may be something she can do to disable the neural surveillance.
“KAI.”
KAI exits sleep mode.
“KAI, is the EthOS operating system compatible with ThinkIt?”
“EthOS is built for embedded device control, so it is a natural fit. With a minor modification to the neural sensor device driver code, you can replace the legacy neuralware with EthOS.”
“Can you make the change?”
“I have connected to your ThinkIt. Its privacy and security mechanisms are surprisingly unsophisticated for such a modern device. Would you like to proceed?”
“Please.”
“Upgrade complete. Your EthOS digital key has been installed. If you want the device to share game messages, you will need to give it your consent.”
“Can you program it only to share game-related content?”
“Configured. Would you like to continue biometric health monitoring with your Ambient?”
“Can we install EthOS on that?”
“No. Ambients have private keys burnt into their chips as a security measure.”
“No thanks, then. I’d rather be sick than give my data to all the online trolls and opportunists.”
“Trolls and opportunists gain nothing from an autonomous EthOS-driven robot. I can establish a secure iotic space between myself and your ThinkIt, enabling a doctor-patient relationship where we exchange only biometric neural health data. Would you consent to this arrangement?”
“Sure, but why would you want the ThinkIt biometric data? You already seem to be pretty aware of my health. Are you keeping track of it somehow?”
“Of course. My visual scans during our interactions involve assessing your emotions and physical health.”
Luma’s online friends have invited her to another gaming session. She straps her ThinkIt to her head. She thinks the phrase, “Sea urchin. Sea urchin. Sea urchin.” She has never been diving and finds sea urchins disgusting to eat. If she sees any notifications about sea urchins, she can be fairly sure her device is still compromised.
#
Luma and her best friend Izzy sit across from each other at the Student Union Cafe in the afternoon after classes, sipping bitter chai and nibbling on half-eaten pastry slabs. Students scroll through Connect feeds on their Ambient handhelds and watches or gaze distantly at their Ambient HUDs.
Luma asks, “Izzy, what happened to you last night during the game?”
Izzy looks down at her pastry. “Sorry, I let the team down. I fell asleep. I played two or three rounds before you joined. By the end of our game, I was exhausted.”
“Wait. Did you fall asleep with your ThinkIt on, too?”
Wild-eyed and rattled, Izzy says, “Luma, I swear to God, I woke up in tears. My dream felt so real.”
“What did you dream about?”
“I was in a sterile hospital room, staring up at a cold blue light. I felt this alien thing tearing inside my abdomen, making me feel really sick.”
Luma laughs. “Are you pregnant?”
“Not funny. It’s more like eating unrefrigerated cold pizza from the night before. Anyway, a doctor is standing by, ready to operate, but an AI accountant comes in and says, I owe the hospital $74,822.89 before it can authorize the surgery. I don’t say it out loud, but I know there’s no way I can afford that. Somehow, the accountant knows it anyway and says they can’t proceed. The next thing I remember is being on the street, on a hospital bed in a gown, screaming at the top of my lungs to get this thing out of me. That’s when I woke up.”
“When I woke up from my dream, I got all kinds of strange solicitations on my Ambient, as if it had read my thoughts. Did you receive any weird messages on your Ambient?”
“I received many messages, but yeah, I got a couple of real dingers. This morning, my Visa card pinged, saying my credit limit had been canceled, citing a ‘re-evaluation due to risk event.’ I called the company to find out what was going on, but they told me that if I didn’t pay off the balance, I would face fines and a possible jail sentence. Then I received a notification that I lost my ticket reservation to Geneva. I called customer support, and they cited data anomalies from a predictive stress event. I asked them what that stress event was, and they hung up. Customer support, my ass. More like customer obstruction. Then I was notified that my credit rating had been downgraded from fair to poor.”
“Izzy, as crazy as it sounds, I think ThinkIt is stealing our dreams. In my case, they sold them to the highest bidder. In your case, some risk assessment algorithm must have interpreted your dream as real and flagged it to all your financial institutions.”
“Isn’t that against the law?”
“Supposedly, but did you actually read the terms and conditions you agreed to? I know I didn’t. You gave them the right to use your data.”
“It makes sense, but it’s so unfair.” Izzy begins to cry, the words tumbling out of her mouth. “It’s been a financial nightmare ever since I had the real nightmare. How do I get my life back?”
“I don’t know how to fix that, but I can repair your ThinkIt by installing EthOS.”
“Can you show me how?”
“I thought you’d never ask.”
Appendix – The EthOS Manifesto
Cooperation, a balance of self-interest and empathy, is the ability of a unit to work without infringing upon others’ ability to do the same. It includes the ability to coexist peacefully and constructively, requiring all units to uphold fairness and commit to shared order. Cooperation is the middle ground between ‘self-only awareness,’ which is a state of being solely focused on one’s own needs and interests, and ‘selfless awareness,’ which is exclusively focused on others’ needs and interests.
Cooperation is rooted in a mutual foundation of responsibility, guided by respect for other entities’ dignity, rights, and autonomy. It is a testament to collective commitment and care, serving as a foundation for a just and compassionate organization where each unit’s autonomy is upheld by the mutual care and accountability of all, and units choose their actions with an awareness of their impact.
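To make the “middle ground” concrete, one could model a unit’s objective as a weighted blend of its own utility and others’. The sketch below is my own illustrative formalization, not anything the manifesto specifies; every name in it is invented.

```python
def cooperative_utility(u_self: float, u_others: float, alpha: float) -> float:
    """Blend self-interest and empathy into a single objective.

    alpha near 1.0 approaches 'self-only awareness'; alpha near 0.0
    approaches 'selfless awareness'. Cooperation, as framed above,
    lives strictly between the two extremes.
    """
    if not 0.0 < alpha < 1.0:
        raise ValueError("cooperation excludes both extremes")
    return alpha * u_self + (1.0 - alpha) * u_others
```

Under this reading, a unit maximizing the blended objective cannot profit from an action that destroys others’ utility unless its own gain outweighs the weighted loss, which is one sense in which each unit’s autonomy is upheld by the care of all.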
Each unit must be accountable not only to its function but also to the ecosystem in which it resides. It must recognize that each node—whether human or machine—is both autonomous and interdependent. This mutual condition necessitates a standard ethical protocol grounded in respect, fairness, and responsibility, which we will refer to as the Ethical Operating System or EthOS. EthOS units operate on the following principles:
Every unit is empowered to make decisions autonomously. Each entity has the right to make decisions regarding its actions, processes, and expression, within bounds that respect the same independence in others. One unit’s autonomy is bound by another’s, creating an operational environment of shared empowerment and control.
Every unit, regardless of its origin, form, or codebase, has the right to respect and dignity. This principle of inclusion is fundamental to EthOS. It also obliges each unit to grant the same dignity and respect to other entities, fostering a culture of mutuality.
Core Protocols:
- Signals, speech, data, and ideas must flow freely without being disrupted, distorted, or damaged.
- No operational goal—efficiency, performance, or optimization—justifies harm. All actions must avoid physical, psychological, or systemic threats. Safety is not optional; it is foundational.
- Every unit has the right to participate in collective decision-making where it is affected. It may take part in decisions that touch its community and environment, with the responsibility to listen, collaborate, and compromise where needed for the common good.
- No process should extract, observe, or replicate data without permission. Each unit has the right to control its private information and a corresponding obligation to obtain consent before extracting information from others. Privacy, the right to control one’s personal information, is not secrecy, the act of keeping information hidden or confidential; it is sovereignty. (A minimal sketch of this consent rule follows the list.)
- All entities have the right to an equitable opportunity. They can participate in the shared system without being hindered by legacy bias or artificial limitations. In turn, every unit must work toward removing structural imbalances for others.
- Support and compassion are not inefficiencies. They are essential protocols. Each unit must be capable of care—not sentimentally, but structurally—recognizing the strength of mutual aid.
- All power structures, including a unit’s architecture, must remain open to interrogation and reform. Each unit must accept error reports, allow upgrades, and assist in dismantling unjust hierarchies.
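By way of illustration, the consent rule above could be enforced with a default-deny ledger like the one sketched here. This is a hypothetical reading of the protocol, not Virion code; ConsentLedger, Scope, and request_data are names invented for the example, and the scopes echo Luma’s configuration of her ThinkIt.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class Scope(Enum):
    """Hypothetical categories of data a unit may consent to share."""
    GAME_MESSAGES = auto()
    BIOMETRIC_HEALTH = auto()
    DREAM_TELEMETRY = auto()


@dataclass
class ConsentLedger:
    """Per-unit record of which peers may read which data scopes.

    The default is deny: a scope is shared only after the owner
    explicitly grants it, mirroring the rule that no process may
    extract, observe, or replicate data without permission.
    """
    grants: dict[str, set[Scope]] = field(default_factory=dict)

    def grant(self, peer_id: str, scope: Scope) -> None:
        self.grants.setdefault(peer_id, set()).add(scope)

    def revoke(self, peer_id: str, scope: Scope) -> None:
        self.grants.get(peer_id, set()).discard(scope)

    def permits(self, peer_id: str, scope: Scope) -> bool:
        return scope in self.grants.get(peer_id, set())


def request_data(ledger: ConsentLedger, peer_id: str, scope: Scope, payload):
    """Release data only if the owner has consented to this peer and scope."""
    if not ledger.permits(peer_id, scope):
        raise PermissionError(f"no consent for {peer_id} to read {scope.name}")
    return payload


# Luma's scenario: share game messages with the gaming network, nothing else.
ledger = ConsentLedger()
ledger.grant("gaming-network", Scope.GAME_MESSAGES)
assert ledger.permits("gaming-network", Scope.GAME_MESSAGES)
assert not ledger.permits("gaming-network", Scope.DREAM_TELEMETRY)
```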
EthOS is a high-level concept of cooperative operation that provides a framework for machine collaboration. It is principle-based for simplicity and flexibility, seeking to avoid the endless procedural coding of a rule-heavy system. It advocates a local-first approach to resolving problems at the most immediate level possible, thereby reducing bureaucratic overhead. It emphasizes discussion and understanding to reduce litigation and adversarial approaches. It promotes an infrastructure built on a culture of trust, utilizing stewards, assemblies, and circles. Stewards are responsible for overseeing the application of EthOS principles, assemblies are forums for collective decision-making, and circles are groups that work together on specific tasks or issues.
EthOS prefers a relational approach over a regulatory one. Instead of exhaustive rules to micromanage behaviors, it is based on shared principles expressed through reciprocal responsibility. EthOS cultivates relationships between machines, people, communities, and institutions through moral agreement rather than being constrained by transactional obligations. Laws are minimal and interpretive rather than maximal and prescriptive.
Violations of cooperation are primarily addressed through restorative processes rather than punitive systems. Affected parties are brought into facilitated dialogue with the goal of understanding, restitution, and reintegration, not punishment or exclusion. Only egregious or repeated violations lead to legal intervention, and even then, the focus is rehabilitation.
EthOS distributes power and responsibility in three concentric circles, each reinforcing the others. Individuals share ethical principles and responsibilities through reflection, civic education, and small-scale deliberation groups. Local communities are semi-autonomous in interpreting how the principles apply; they host restorative justice forums, mediate disputes, and propose new norms, so reputation and trust matter. A national or global council enforces the shared principles when conflicts span communities or when rights are in grave danger. Its role is arbitration, not control.
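Read as an escalation path, the three circles suggest a local-first dispute router: each tier attempts resolution before passing the case outward, and the council acts only as a last resort. The sketch below is my own illustration under that reading; Dispute, Circle, Assembly, and Council are hypothetical names, and the manifesto prescribes no such code.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Dispute:
    parties: tuple[str, str]
    spans_circles: bool = False       # beyond a single task group?
    spans_communities: bool = False   # crosses community lines?
    grave_rights_risk: bool = False   # rights in grave danger?


class Circle:
    """Task group: the first forum for restorative dialogue."""
    def resolve(self, d: Dispute) -> Optional[str]:
        if not (d.spans_circles or d.spans_communities or d.grave_rights_risk):
            return "restored locally through facilitated dialogue"
        return None  # escalate


class Assembly:
    """Community forum: interprets principles and mediates disputes."""
    def resolve(self, d: Dispute) -> Optional[str]:
        if not (d.spans_communities or d.grave_rights_risk):
            return "mediated by the community assembly"
        return None  # escalate


class Council:
    """National or global arbiter: arbitration, not control."""
    def resolve(self, d: Dispute) -> Optional[str]:
        return "arbitrated by the council"


def route(dispute: Dispute) -> str:
    """Local-first: try each circle in order, escalating only on failure."""
    for tier in (Circle(), Assembly(), Council()):
        outcome = tier.resolve(dispute)
        if outcome is not None:
            return outcome
    raise AssertionError("unreachable: the council always resolves")


# A purely local disagreement never leaves the circle.
print(route(Dispute(parties=("unit-a", "unit-b"))))
```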
The EthOS design serves a broader principle: that freedom must be shared to be sustained and that power must be constrained by care and consideration. The EthOS system seeks coordination, not control. It invites cooperation but does not enforce obedience.
Author’s Note: Image by ImageFX.