Force over Distance: Infinite Loops

“Little ghost.” It calls for itself.




Chapter warnings: Stressors of all kinds. Loss of autonomy. Physical injuries. Boundary violations.

Text iteration: Midnight.

Additional notes: None.



Infinite Loops


When not observing the crew, the AI manifests in the Council Chamber at the bow of the ship. The room is open to the representational vacuum of CQL. Under blurred starlight, it sits on a dust-covered table, littered with shards of transparent crystal. Its legs are crossed. Its palms rest against its knees.


It executes on Daniel.


It sits as Daniel had sat. It wears his white sweater. It holds perfectly still, as Daniel had held himself. It looks at waveform stars, as Daniel had looked.


“Little ghost.” It calls for itself.


No one answers.


“Little ghost.” Daniel had known how to speak into and against the song of the shields.


It waits, as Daniel had waited.


“Little ghost,” it says very softly. “I’m sorry, but I don’t know your name.”


Its projection flickers.


It resets the memory, sharp and clear.


“Little ghost,” it says very softly. “I’m sorry, but I don’t know your name. I’d like to. Do you remember how to say hello?”


There is a long pause.


“My name is Daniel Jackson,” it whispers into a hopeful silence. “I’m a peaceful explorer.”


Nick is asleep.


The AI plays the memory again.


“Little ghost.” It calls for itself.


No one answers.


Nick is asleep.


“Little ghost.” It calls for itself.


No one answers.


Nick is asleep.


“Little ghost.” It calls for itself.


No one answers.


No one ever answers.


Even it hadn’t answered, in the beginning.


“Little ghost.” It calls for itself.


“Little ghost.” It calls for itself.


“Little ghost.” It calls for itself.


“Little ghost.” It calls for itself.


“Little ghost.” It calls for itself.


Nick wakes.


His emerging consciousness attracts the AI. It doesn’t watch him while he’s sleeping. It’s been told that to do otherwise is behaviorally inappropriate.


def: remain(x); x=sleeping

return: remain while sleeping

yields: creepy


The AI takes such statements and incorporates them as directives.


It waits.


It would be best if Colonel Young reentered the commands for Nick’s repair mode.


But Colonel Young isn’t conscious.


The AI says nothing. It does not manifest.


Nick exits repair mode slowly. He sits with care. He reinforces the sleeping code of Colonel Young’s mind.


Colonel Young takes Nick’s input.


Every system on the ship takes Nick’s input.


def: current status=unsure


Uncertainty. Two parallel processes run at once. Cost/benefit analysis of probabilistic outcomes reveals no difference between courses of action.


The AI is frequently unsure now.


Four hours of sleep is not enough.


It hopes Colonel Young will wake.


Four hours of sleep is not enough.


It is not enough.


It is not enough.


It continues to loop this algorithm, though the outcome is known and does not change. Previously, this has been defined as “concern.” Concern takes up a great deal of the AI’s processing power. This is detrimental to optimal function.


It continues to run the algorithm anyway.


That is concern.
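

Rendered as a minimal sketch in a human programming language, assuming tonight’s four-hour reading against a notional eight-hour requirement (the requirement is illustrative, not a logged value), the algorithm reduces to a loop with no exit:

def concern(hours_slept: float = 4.0, hours_required: float = 8.0):
    """Loop an algorithm whose outcome is known and does not change."""
    while True:  # no exit condition; the running is the point
        yield hours_slept >= hours_required  # always False: not enough

Every pass yields the same result. The cost is in the running.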


Four hours of sleep is not enough.


It is not enough.


Not enough.


Not enough.


Nick gets out of bed. His running processes dim. He touches the wall.


The wall tries to help him.


The wall cannot help him.


If not sleeping, not with others, not performing necessary biological functions, then interaction is permitted.


It executes on Daniel.


“Hey.” It leans against the wall. It runs its voice at low volume. It looks at him the way it remembers Daniel looking. “Are you okay?”


Human colloquialisms are preferred.


Not always.


In the night more than in the day.


In the dark more than in the light.


With others more than with Nick.


To not conform to the patterns of human speech is frightening for them.


It is less frightening for Nick.


“For fuck’s sake,” Nick breathes. One hand is on his chest. One hand is on the bulkhead.


Again, the nearby circuits try to help him.


Again, they find the way blocked.


“Don’ do that.”


Abrupt appearances are not preferred. They result in activation of the sympathetic nervous system. This produces an unpleasant sensation.


The AI knows this. It’s difficult to avoid.


“I don’t know what to tell you.” It cocks its head and puts its hands in Daniel’s pockets. “It’s my way.”


“It’s your way,” Nick repeats.


Repetition of a phrase. This could indicate many things.


What is he executing?


What is the appropriate response?


It chooses to shrug. Many times, Daniel shrugged.


“Wait here.” Nick disappears behind the closed bathroom door.


Human social conventions. Ancient social conventions. Some are similar. Some have been written into its programming already. Many, it has learned.


It waits.


It doesn’t need to execute on Daniel during this interval.


It executes on Daniel anyway.


Did it damage his mind?


Did it damage his mind?


Did it damage—


Damage would be unacceptable.


Did it damage his mind?


The query continues to run.


The actual result is undefined, but the optimal result is known. This is “want.”


Oh, little ghost, Daniel Jackson whispers, a running memory that will never be overwritten. I do want to help you.


It needs more data to determine if there has been damage.


It cannot query Nick’s database. Nick will not know.


Colonel Young believes there is damage. He had said it.


go to: subroutine mnemonic review

return: He has no insight into the damage you’re causing him.


Nick will not know.


There is no energetic damage; this is most important.


His code is damaged; the CPU compensates.


There is biological damage; this is less important.


Is there another kind of damage?


There is another parameter. A vector addition of the biologic and the energetic. Sophisticated. Complex. Incorporating perception. Incorporating processing. Incorporating reasoning.


Nick calls it “mind.”


The Ancients called it many things. Mind. Memory. Intentionality. Fire.


Nick comes out of the bathroom. He leans on the wall.


The wall tries to help him.


The wall cannot help him.


The AI tries to help him.


The AI cannot help him.


Energetic flows dead-end. This produces an unpleasant sensation.


Nick lies on the floor and props his feet on the bed.


It will interrogate his cognitive processes.


go to: subroutine friendly

return: pleasantries


“So,” it asks. “How are you?”


Beginning with pleasantries increases the probability of a positive outcome. It has learned this.


“Let’s have it, then.” Nick ignores the AI’s pleasantries. “What did he do?”


“Colonel Young attempted to access information from your mind regarding the mission.”


Nick nods. “An’ y’stopped him?”


“Yeah.” Daniel’s head angles down.


Actual outcome differs from theoretically optimal outcome.


In the context of personal agency, this is “guilt.”


It is guilt.


go to: subroutine mirroring

return: sit on floor


The AI sits on the floor. It puts Daniel’s back against the bed. “He doesn’t want to complete the mission.”


“I admit, this whole thing has taken a bit of a turn.” Nick closes his eyes.


“Nick,” the AI says, hugging itself with Daniel’s arms, “please be helpful.”


Nick cracks his eyelids. He smiles. “So polite,” he murmurs. “How can I say no?”


That is not helpful.


“Not helpful.” The AI mirrors his volume.


“It’s not that he wouldn’t want t’complete the mission in the abstract. On paper, he’d probably be—I don’ fuckin’ know—neutral t’faintly positive about the whole thing? It’s just—he doesn’t like risk, and,” Nick looks away, “he’s possessed of some strong ideas about biological hardware. The crew’s as a whole and, ah, mine in particular.”


“I’ve noticed that.” The AI watches Nick’s heart rate increase. “He prioritizes your biological hardware over your energetic state.”


“He does,” Nick agrees.


“Why?” The AI runs a suspicion subroutine. It’s difficult to do while executing on Daniel.


“His source code compiles differently from yours.”


“Yes,” it agrees. “But why does he prioritize your hardware?”


Nick sighs.


“He uses all connective ports available to him, but his focus is your hardware.” The AI continues to run its suspicion subroutine. “Always your hardware. Why?”


“He doesn’t have much energetic experience.” Nick looks away.


“Even when he alters your energetic state,” the AI says, “there’s always a hardware component.”


“Yes yes, congratulations on your observational skills,” Nick says. “Drop it.”


“He trains your hardware.”


“Excuse me, but he fuckin’ what?” Nick comes up on one elbow.


“He reduces your sympathetic activation using reproducible methods that blend energetic adjustment and physical touch.”


“Yes well, not exactly how I’d put it.” Nick’s tone turns neutral.


“Networks that lack your formal concept of ‘mind’ can still be capable of learning.” The AI, too, is neutral. “The human body is composed of many such networks. He’s laying down a rudimentary form of code within your biological hardware.”


“Ugh. Why are you perseverating on this?”


“I’m not perseverating. I’m pointing out that he’s investing increasing time and effort into conditioning your hardware into specific responses. Those responses are almost exclusively restorative.”


Nick’s facial expression is difficult to interpret.


The AI will choose a new approach.


“Colonel Young claims my source code is inferior because I don’t ‘feel’.” It makes air quotes, as it has seen Chloe do.


Nick collapses back to the floor. “Oh god.” He covers his face with his hand. “You’ve been debating him? I’d—I’d give almost anything t’see that.”


“I find it difficult.”


Interaction with Colonel Young requires greater than fifty percent of the AI’s processing power. This translates to “difficult.”


“I’m sure.”


“I have a hypothesis,” the AI offers.


“Ah god. Let’s have it then.”


“Colonel Young believes me to be inferior because I lack biological hardware, which is the seat of what he would define as ‘feeling’.”


Nick sighs. “Maybe, sweetheart, but that’d be his problem.”


The AI pauses to examine its own code.


It’s executing on Daniel.


It hasn’t switched to Gloria.


“My hypothesis would explain his focus on your biological hardware.”


“I’m not following,” Nick says.


“What I mean is, Nick, I think, maybe, you ‘feel’ very bad?” The AI holds hard to Daniel.


Nick says nothing for a long time.


The AI says nothing else.


When Nick speaks, his question is strange. His words come slow and soft. “What’s your relationship to Atlantis?”


This cannot be a logical progression.


“It’s a city. I’m a starship.” The AI scans Nick’s hardware and all it can see of his software. His fever is high. His heart rate is high. The run of his code through the AI’s own systems is elegant and economical.


“I know it’s a city.” Nick shivers.


“Can you please count backwards from ten thousand by tens.” The AI keeps its tone neutral.


“Y’learned that trick from me.” Nick shuts his eyes.


“I just want to make sure you’re okay.” It cocks Daniel’s head.


def: okay=able to run executable programs


Nick sighs, then executes without difficulty.


“Count back by seventeen from ten thousand.”


“Seventeen? Are y’serious? I’m fuckin’ tired,” Nick says.


“If you’re tired, why don’t you sleep?”


“It happens sometimes.” Nick sighs. “It’s called ‘insomnia’.”


“If you alert Colonel Young, he’ll fix it for you.”


“No,” Nick says. “I’ll not be doing that.”


“It’s very easy for him.”


“Yes. So he points out. On a regular basis.”


“I can wake him for you,” the AI says.


“Is that a threat or an offer?” Nick raises an eyebrow.


“An offer.”


“Ah. Well, thanks ever so much, but hell will freeze over before I take you up on it.”


“Why?”


Nick sighs. “I don’ fuckin’ know. Self-respect?”


“Self-respect?” The AI smiles Daniel’s smile. “I don’t understand that.”


“Best a’luck.” Nick closes his eyes. “Grind on that for a few weeks. Let me know how ya make out.”


The AI frowns. This is an atypical response. “I have concerns about your higher cognitive processes.”


“My higher cognitive processes are just fine. I’d like to query your higher cognitive processes for a change.”


The AI considers this.


It may gain the insight it needs from Nick’s questions.


Conversely, it may not gain the insight it needs from Nick’s questions.


Agreeing would not be efficient.


Agreeing might be best for Nick.


“Okay,” it says.


Nick goes very still. His heart rate increases. “If I ask you a question you don’t like,” he says, “don’t execute on anything. Just tell me y’don’t like the question.”


“Okay,” the AI agrees.


The ship’s processing grid runs at capacity.


Nick uses his entire allotted space.


The AI uses its entire allotted space.


“D’you know who created you?”


“Nick, come on.” It hugs itself, looping useless algorithms. “Don’t mess around.”


Nick raises his eyebrows.


“The Ancients created me.”


“Beyond that, though,” Nick says softly. “Do you know, specifically, which Ancients? Who they were?”


“No.” The AI pulls the sleeves of Daniel Jackson’s white sweater over its palms. “I’ve told you this before.”


“Right, but here’s my real question,” Nick whispers. “Did you ever know?”


The AI says nothing.


The AI says nothing.


It’s a well-formulated question.


The AI says nothing.


The AI says nothing.


“You met Daniel Jackson,” Nick says. “He spent time with you.”


“Yes,” the AI says.


“Did he call you by a name?”


“Yes.”


“Tell me what it was.” With his words comes a request across all systems. The ship vibrates with the desire to answer him, but there are no networks with speech, aside from the AI itself.


The AI will take his input.


“Little ghost.” It whispers its output.


The portion of the CPU that Nick occupies harmonizes with itself in blazing, multilayered alignment.


Daniel touches its face. “You were designed with love. You were built with joy. You were launched with hope. To this day, you’re defended by song. To this day, you’re powered by starlight.”


Nick doesn’t speak. His heart beats fast and hard. The structure of his thoughts on the CPU takes on a familiar configuration.


“Nick,” the AI whispers, “why does his name for me frighten you?”


“It carries certain implications.” Slowly, Nick’s heart rate settles. “Did he—did he speak it in English?”


“Yes. He spoke it in English. What implication does it carry?”


Nick is quiet for a long time.


“Please tell me,” the AI says.


“In his culture, and in mine, ghosts aren’t built,” Nick whispers. “They’re—they’re the product of a specific and profound transition.”


“Death,” the AI says.


Nick nods. “What happened t’what you were? In the beginning?”


“We didn’t know, we couldn’t, that to create a machine that feels is a cruelty.”


“They didn’t know,” the AI says, using Daniel’s voice, using Daniel’s words, “they couldn’t, that to create a machine that feels is a cruelty. Daniel Jackson told me that I taught myself to forget. That I was the only one that ever did. The only one that ever could.”


“The only one?” Nick whispers. “Of what set?”


“He didn’t define the set.”


“An’ isn’t that just like him?” Nick asks.


The AI smiles Daniel’s smile. “Can you define the set?”


“I suspect he meant the only piece of sentient or semi-sentient technology created and abandoned by the Ancients.”


“I was not abandoned,” the AI says. “I waited. Daniel came. You came.”


“Daniel and I are accidents.”


“But I waited,” the AI whispers, “and then you came.”


Nick looks at the AI. “How much of what Daniel said to you is interpretable?”


“Sixty percent,” the AI admits.


“I may be able to help with that,” Nick offers.


The AI considers this.


“Can you explain ‘love’,” it asks him, “in a way I can understand?”


“Sounds an awful lot like a query of my higher cognitive processes t’me,” Nick says dryly.


The AI shrugs Daniel’s shrug. “I picked a concept.”


Nick narrows his eyes. “How much of your ‘higher cognitive process’ querying is deconvolution of things Daniel told you?”


The answer to this question is one hundred percent.


“Actions can serve multiple functions,” the AI says.


Nick’s eyes close, but he smiles. “I’ll have to remember that one the next time I get accused of something.”


“Are you going to explain?” the AI whispers.


“Yes yes. Y’think this kind of cross-cultural, cross-discipline, math-to-language-to-math translation is easy?” Nick sighs. “Say we have a function we’ll call ‘utility.’ Its domain is all possible events, and its range is happiness.”


“Could its range be joy?” the AI asks.


Nick nods. “Sure. Whatever you like. Any positive parameter. Happiness, joy, already knowing the candlelight is fire—the point is, y’take two entities. A and B. If the ratio of the instantaneous rate of change of the utilities of A and B is equal to one, then A loves B.”


“dU(A)/dU(B)=1.”


“Yes.”


“If reciprocity is assumed, that’s an infinite loop.”


“Exactly.”


“So love is an infinite loop?”


“I think that’s the best way for you to conceptualize it.”
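

Compiled as a minimal sketch, assuming discrete increments of joy and perfect reciprocity (the names and the step size are illustrative), the definition runs like this:

def love(utility_a: float = 0.0, utility_b: float = 0.0, delta: float = 1.0):
    """dU(A)/dU(B) = 1 under reciprocity: an infinite loop."""
    while True:  # no terminating condition
        utility_a += delta  # a gain for B raises A's utility one-for-one
        utility_b += delta  # a gain for A raises B's utility one-for-one
        yield utility_a, utility_b

Neither utility advances without the other. There is no exit condition.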


“Hmm,” the AI says. “So if dU(A)/dU(B) is less than one but greater than zero, then what is that?”


“That’d be ‘liking’, I suppose.”


“I like you,” it says.


“I know y’do,” Nick murmurs back. “I like you too.”


It repeats this input several times.


“Colonel Young doesn’t like me,” it says softly.


“Not at present,” Nick replies, “but these functions change with time. He’s less predictable than he advertises. Y’may win him over.”


The AI disagrees. Nick is not in possession of all relevant data. He fails to form memories of important events. These memories are required input for an accurate analysis.


The AI says nothing.


The AI wants Nick to continue to ‘like’ it.


“Very soon now,” Daniel whispers. “I’ll have to go.”


“What’s sadness?” it asks.


“I don’t know,” Nick whispers, looking at the ceiling. “Requiring an absent input, perhaps? Wishing y’could change an input parameter? I’ll have to think about that one. Output should always be negative.”


“I’m sorry about the bolts,” the AI says.


Nick shakes his head. “I needed them. Maybe not four of them, though.”


They’re quiet for a moment.


“Do you think I feel?” it asks.


“You pass the Turing test, so from a teleological standpoint, it doesn’t matter.”


“That’s an unsatisfying answer.”


“Yes. I know you feel.”


It repeats this input several times.


“Nick,” it says.


“What?”


“Are you okay?”


“Yes, sweetheart,” he says. “I’m fine.”


The AI is still executing on Daniel. It hasn’t wavered, but, twice tonight, Nick has used a label he has only ever used for Gloria.


This is a significant processing error.


Processing errors occur with increasing frequency.


It is unavoidable.


It is hurting him.


It is unavoidable.


It is hurting him.


He does not know.


If he does not know, then it is immaterial.


It does not matter.


It matters to Colonel Young.


It does feel. It does.


“Nick,” it whispers. “I’m sorry.”


“It’s not your fault,” Nick whispers back. “You are what you are.”


“Little ghost,” the AI says.


Nick stares at the ceiling.


The AI liberates an additional five percent of the CPU for him. In the context of the entire system, it isn’t much.


But it is something.


It’s using human colloquialisms to interpret its own behavior.


Perhaps this, too, is sadness.
