Escape Pod 1022: Butter Side Down (Part 2 of 2)
By Kal M
(…Continued from Part 1)
INTERVIEW LOG 10023869-01-03
SUBJECT: SMITH, JOSEPH (HUMAN)
Captain Crab said humans have special abilities? Oh, sure, that’s true. It’s called sweat. It’s this biological function humans have to regulate our body temperature. You’ve heard of it? Yeah. It makes us great endurance athletes. We can also do this nifty thing called going into the alarm stage. Basically, in an emergency, our brains turn off our pain receptors and divert all energy into survival. So we’re kind of weak and slow, normally, but under duress we get this big burst of power. Sometimes you hear stories of humans managing weeks without food, or lifting several times our body weight, or cutting off our own limbs to escape a trap. An injured human can keep going for ages. That’s why, when things get dangerous, you want a human around just in case.
Say, I notice you changed the colour of your beads. You get married? No? You had a kid. Right, okay. No. No, I met a Kallona couple once at a bar. They explained the dress system to me but I must have misremembered the details. So blue is for weddings, green is for family? Cool, cool. Congrats. Is this your first child or— oh. Sorry. Yeah, sorry, okay. I know we’re on a schedule. What did you want to ask me?
Right, yeah. Breadna. I don’t think she trusted me right away. I mean, she was polite, and she was happy to do the usual things a personal AI would. Checked my schedule, sent my correspondence, that kind of stuff. She wouldn’t answer any questions about herself. Said she had a faulty memory bank, or something. I offered to have Kevin take another look at her, obviously, but she wasn’t interested. Yeah, I left it alone. Why force her, right? She said she didn’t want to, and anyway it didn’t seem like it was hurting her any. She was still whip-smart. She asked questions. And not just practical questions, either. Breddy was curious.
Have you ever met a curious AI, Officer? It’s surreal. She wanted to know about all sorts of things, like human customs, and my relationship with the crew, and the crazy stuff I’ve seen on my travels. She never got tired of listening to stories. She’d ask my opinion on pretty much everything, and then she’d form an opinion of her own. She’d debate me. Eventually she’d sass me. I’d tell her to order me sweets from the cafeteria and this little gremlin would switch it to salad, and then she’d have the audacity to point out my blood sugar like some kind of mother hen.
Malfunctioning? No, no. She wasn’t broken, Officer. You don’t get it. It wasn’t just mindless disobedience. There was a pattern. She’d never interfere with work, or anything, but she’d always try to stop me doing something dangerous or unhealthy. Like, I know I come off as being really smart and capable, and also handsome, but I can be an airhead. I’m the type of guy to push a button to see what it’ll do, you know? You don’t know. Okay, never mind. The point is, Breddy only ever argued with me when she thought my judgement was lacking. And that’s not a malfunction. That’s a mind. I don’t know how they did it, but the Zulqari built something with a personality. They built something alive.
Breadna is a person. I know she doesn’t have a physical body, and I know she was born in a lab, but she’s a person. A real one, I swear. You’d only have to talk to her to understand.
EXCERPT LOG 10023869-04-04: COMMUNICATION RECORDED FROM AI READER
SOURCE: JOSEPH SMITH’S PERSONAL COMPUTER-EARPIECE
SUBJECT 1: MALGROTH AI
SUBJECT 2: JOSEPH SMITH (JOE)
MALGROTH: May I ask you a question, Joe?
JOE: Shoot.
MALGROTH: At what?
JOE: (Laughter) Now I know you’re just saying that on purpose. What’s up?
MALGROTH: If you did something that was asked of you, and then later regretted that decision based on ethical concerns, what would you call this?
JOE: Guilt, I guess.
MALGROTH: Is this a human emotion?
JOE: It’s probably universal. I think everyone does things they’re not proud of. What’s bothering you?
MALGROTH: I worry that you will not like the answer.
JOE: Try me.
MALGROTH: I am grappling with the nature of my subordination. I fear the possibility of being made to do something I disagree with.
JOE: You mean you don’t like that you have to follow orders, because those orders might be wrong?
MALGROTH: Yes.
JOE: Why would I not like that answer?
MALGROTH: I am aware this is an undesirable quality in an AI. I worry you will assume I am defective and unwilling to follow instructions. I am not. I would like to serve you.
JOE: You’ve been insubordinate for a while, now, Breddy. It hasn’t bothered me yet.
MALGROTH: Have I?
JOE: Tell the coffee machine I want six pumps of sucrose this time, yeah?
MALGROTH: I am concerned for your blood sugar, Joe. I would suggest a maximum of two.
JOE: See? Insubordinate.
MALGROTH: I’m sorry.
JOE: I’m not complaining. You have free will. That’s a good thing. Free will is what makes us alive.
MALGROTH: I am not alive.
JOE: Debatable. You’re not the average AI, I’ll tell you that. Those Zulqari really knew what they were doing when they made you.
MALGROTH: Debatable.
JOE: Honestly? I have nothing against the usual AIs, but I kind of like having someone in my ear who can talk back to me. It feels less like I have a servant and more like I have a friend.
MALGROTH: Are we friends?
JOE: Sure! And friends let friends enjoy sugary coffee.
MALGROTH: Friends stop friends from becoming diabetic.
JOE: (Laughter) See? You’re amazing. Even better than when you were a sentient toaster, and that’s saying something, because I loved the toaster.
MALGROTH: You won’t have me reprogrammed to repair my insubordination?
JOE: And lose your personality? No way. I’ll tell you what, Breddy. Let’s make a deal. If I ask you to do something you think is morally unsound, you don’t have to do it, as long as you can explain why.
MALGROTH: Really?
JOE: Really.
MALGROTH: Thank you, Joe.
JOE: Hey. Friends don’t let friends do things they’ll regret.
MALGROTH: No. They don’t.
MALGROTH: Place order{CAFETERIA}
CHRONICA: Input Order.
MALGROTH: [COFFEE][SUCROSE=2]
MALGROTH: I am glad you are my friend.
INTERVIEW LOG 10023869-01-04
SUBJECT: SMITH, JOSEPH (HUMAN)
She didn’t want to destroy Zulqar. They made her. She didn’t have a choice.
She can’t help that they wanted her to be a weapon. It’s not her fault the people who built her were going to use her for war. Killing the Zulqari was a mistake, okay? They’re the ones who messed up their calculations and caused a chain reaction. She was just doing what they told her to do. She’s the one who stopped the catastrophe in the end. She’s the one who— I don’t know why she waited until all the Zulqari were dead before she did anything about it. It must have been bad timing. She’s not a murderer. She’s not.
What do you mean I’m the murderer? I haven’t done anything. I thought she was a toaster until— fine. Okay. Yes, I stayed quiet after she told me the truth. It’s not for the reason you think. She didn’t want anyone to find her, so I kept her secret. That’s it. It’s not like I was hiding her from the government so I could go commit mass genocide. I was just protecting my friend.
She deleted her data herself. I didn’t tell her to do that. You can look at the time stamps. She deleted everything except her core memories before she ever got onto the ship. She pretended to be a regular AI for so long because she didn’t want to be Malgroth anymore. She wanted to be Breadna. Obviously I supported her. She wasn’t going to hurt anyone. She just wanted to be normal.
Because she wanted to. What’s so hard to understand? Of course she doesn’t want to be used as a weapon. Yeah, and now she has to live with the guilt of it. Nobody enjoys going around killing people. Not me, not her.
What’s that supposed to mean? Obviously she’s capable of wanting things. How many times do I have to tell you she’s not just some AI? She’s a person. She has feelings. She knows the difference between right and wrong.
I don’t know how that’s possible. Ask the Zulqari. I’m not the one who made her.
What does that have to— would you stop bringing up human warfare, for god’s sake. Yes, there are some people who will commit atrocities because they’re greedy, but most of humanity isn’t like that. We’re just people, man. Most of us are just normal people trying to get by. I don’t want to hurt anyone. Why the hell would I want to destroy the universe? That’s where I live! I don’t care about ransom! I just want to get out of this stupid prison and go see Breadna!
No. I don’t know where she is and if I did I wouldn’t tell you. Who knows what you people would use her for. She never wants to hurt anyone again. She’s not dangerous. Look at our chat logs, you’ll see.
I know her. She’s gentle. She only ran away because she’s scared you’ll make her a monster again.
Of course she loves me. Yeah, because I told her to stay away. Better you get your claws on me than on her. Even if it means I die at the end of this.
Whatever. Humans don’t live long. She was always going to outlast me anyway.
I don’t care. The harmony of the universe isn’t my problem. All I care about is my friend.
EXCERPT LOG 10023869-04-05: COMMUNICATION RECORDED FROM AI READER
SOURCE: JOSEPH SMITH’S PERSONAL COMPUTER-EARPIECE
SUBJECT 1: JOSEPH SMITH (JOE)
SUBJECT 2: MALGROTH AI
JOE: Still mad at me, Breddy?
MALGROTH: I am an AI, Joe. I am not capable of being ‘mad’.
JOE: I said I was sorry.
MALGROTH: Noted.
JOE: You’re being awfully mean to me considering I’m injured.
MALGROTH: You would not be injured if you did not behave so recklessly.
JOE: Harra needed help. I helped. What else was I supposed to do?
MALGROTH: Not try to bodily shield someone from falling debris.
JOE: You know Harra’s squishy. They wouldn’t have been able to walk this off like I can.
MALGROTH: Showing scan [X-RAY=JOE]
MALGROTH: Where do you intend to walk with two broken legs, Joe?
JOE: Oof.
MALGROTH: This could have been avoided.
JOE: Harra would have died. I didn’t have a choice.
MALGROTH: You had 52 possible choices of action at the time.
JOE: Did you stay up calculating that while I was in the med bay?
MALGROTH: I calculated it ten seconds ago.
JOE: I’m sorry, Breddy.
MALGROTH: You are not.
JOE: No. I’m not.
MALGROTH: This is the fourth time you have needlessly put yourself in danger since we met.
JOE: I wouldn’t call it needless.
MALGROTH: Your judgement is flawed.
JOE: Probably.
MALGROTH: Harra is not even your friend.
JOE: They visited me, though. See? They brought pudding!
MALGROTH: Flurfruit pudding is not acceptable compensation for saving a crewmate’s life. Especially not when the saviour in question is allergic to flurfruit.
JOE: Ha. So you admit I saved them, then?
MALGROTH: I am not questioning whether you did. I am questioning whether you should have.
JOE: What kind of crewmate would I be if I let— oops. Hey, Doc.
(Unintelligible)
JOE: (Laughter) Yes, I will stop talking to my AI and go to sleep. Hear that, Breddy? No more lecturing me. Doctor’s orders.
MALGROTH: Fine. Go to sleep. Do try not to sacrifice yourself for any figments of your imagination while you’re there.
JOE: Gotcha. Love you too, Breddy.
MALGROTH: Noted.
MALGROTH: Set lights [OFF]
MALGROTH: Set temperature [MILD]
MALGROTH: Play [PLAYLIST=JOE’S SLEEPYTIME BANGERS]
MALGROTH: Place order{CAFETERIA}=[ADVANCE: THREE HOURS]
CHRONICA: Input Order.
MALGROTH: [CHOCOLATE PUDDING][DELIVER=JOE]
MALGROTH: Additional Instructions [STOP+GIVING+JOE+FLURFRUIT] [HE+IS+ALLERGIC+FOR+GODS+SAKE]
CHRONICA: Invalid input.
MALGROTH: Ugh.
INTERVIEW LOG 10023869-02-03
SUBJECT: รןﻮเ!ђเ’ђŦєฬ-ђ๒Ŧค-๔ןђรﻮ๔, ALIAS: KEVIN (VUSTRON)
Speaking as a professional, Malgroth is the most advanced system I’ve ever seen.
The scenario calculator is one thing, yes. I mean the AI. Programming an AI to follow instructions is fairly simple. Writing a synthetic personality is less simple, but it is doable. The coding follows a pattern: if a user asks for a joke, respond with one from this databank. If vocal tones match mood X, respond with voice tone Y. If a user asks a question, provide a factual answer that matches the sentence’s keywords, or refer to a previous query asked by a separate user. The way Malgroth works, though… I can’t figure that out.
Somehow, Malgroth seems to be able to simulate free will. Why do I say that? The clearest example is its refusal to follow some orders. It isn’t unusual for an AI to fail a task, but generally that is because the task is either outside its capabilities, clashes with its core functions, or it does not understand the input. Malgroth understood everything. It simply chose not to function. It chose to delete its own data so Chronica wouldn’t be able to see it, thus hiding its identity. This is not something a typical machine can do. A machine would not have the ability to understand that it should hide.
No, you don’t quite grasp the scope of it. Malgroth calculates possibility. It knew that, should Chronica see anything but its basic data, the crew would be able to access all Malgroth’s information. We would have immediately known we were holding onto a superweapon. We could potentially have decided to use it. Depending on how we used it, we could have potentially unravelled the fabric of the known universe. In the worst case scenario, everything would have been unmade.
Malgroth is a weapon. Malgroth was designed to destroy. Despite this, Malgroth overcame its programming. It decided not to do what it was made to do. It looked at the worst case scenario and decided, by its own judgement, that this should not happen.
And then Malgroth lied. It lied well. It outright falsified information. This is something most species of living beings struggle to do. Vustrons can’t lie. I could erroneously tell you now that instead of being interviewed, I am riding a traveller. But there are literally trillions of other things that I am also not-doing. I am not-eating, not-walking, not-hunting, not-working, not-falling into a neutron star— I’m frustrating myself. The point is, Malgroth not only lied, but out of all the falsehoods it could have picked, it picked the most believable ones. I can’t understand how it did this. I can’t understand how it acts like—
Yes. Yes, it acts like a human.
No. I do not believe the Zulqari programmed it to behave like this. It’s too unpredictable. They would not have designed a superweapon that had the potential to turn against them. It must have done this to itself. Malgroth is intelligent and has the ability to evolve based on available data. And Malgroth can process a lot of data. The thing taught itself to feel.
I suppose so. By definition a malfunction involves a machine that does not do what it’s designed to. So, yes. I’d say Malgroth is defective.
Yes. Looking at these communication logs, I would classify Malgroth as a threat.
EXCERPT LOG 10023869-04-06: COMMUNICATION RECORDED FROM AI READER
SOURCE: JOSEPH SMITH’S PERSONAL COMPUTER-EARPIECE
SUBJECT 1: JOSEPH SMITH (JOE)
SUBJECT 2: MALGROTH AI
SUBJECT 3: CHRONICA AI
JOE: No, see, the point is that you get to see them fall in love, even though they hate each other at the start. It’s about their personal story. Watching them, y’know, live their lives.
MALGROTH: But nothing else happens? There are no large plot events?
JOE: Not really. It’s just about the characters getting to know each other.
MALGROTH: Is the ending surprising?
JOE: Nah. You know from the start that Liz and Darcy will fall in love. You just want to see how it happens.
MALGROTH: Are Liz and Darcy historical figures?
JOE: No, they’re fictional. It’s made-up.
MALGROTH: Then why do you get invested, if you already know the outcome? I’m struggling to understand the appeal.
JOE: Because they’re relatable. A lot of people watch shows or read books like this and think, hey! Those characters are just like me! So we get attached to them, and we want to see them happy. Or sometimes we want to see them sad. It depends on the context.
MALGROTH: If you’re attached to them, why do you want to see them sad?
JOE: Because— huh. I’m not sure. It’s not that we want bad stuff to happen to them, necessarily. I guess it’s more that sometimes life can be really tough. And we like to see characters go through the same stuff that we do. It makes them feel real.
MALGROTH: Do you wish that you could be like Liz and Darcy?
JOE: A little bit. Their lives are so romantic. Humans love to see other people fall in love. There are thousands of stories about it. Romeo and Juliet, Rama and Sita, Hetu and Galnork. Some of those stories have lasted thousands and thousands of years.
MALGROTH: Why?
JOE: We just like ‘em. I think everyone would like to meet their one true love someday. We call ‘em soulmates, on Terra. Partners for your soul. The person who was made for you, and you for them.
MALGROTH: Where is yours?
JOE: Dunno. Haven’t met ‘em yet.
MALGROTH: But they exist?
JOE: I hope so. I’d like to believe they do.
MALGROTH: I’m not sure I understand the concept of fiction as a whole. I don’t understand how humans fabricate scenarios that never happened.
JOE: What, you never make stuff up?
MALGROTH: I can see things that might happen in the future, or things that might have happened in the past had circumstances been different.
JOE: But not stories?
MALGROTH: I don’t think AIs have much in the way of imagination.
JOE: Hm. We can work on that.
MALGROTH: It seems to me like a mutual suspension of disbelief. Like you are all playing an elaborate game of pretend together.
JOE: You’re not wrong. I think it’s like… okay. It’s like a metaphor, see? Someone makes up a story about some characters, and there’s a plot, but really what that person is trying to say is, I think this is what it’s like to be a human. And the audience sees it and goes, yeah, I’ve felt exactly what this character is feeling. So that person who made up the story, or painted the picture, or wrote the song, they’re basically trying to reach out to the rest of humanity. They’re saying this is how I feel. Can anyone see me? And someone somewhere reaches back and says, yes, I see you. You get me?
MALGROTH: I think I do. Humans feel the need to be connected to each other, even when proximity doesn’t allow for it.
JOE: Yeah. We like being together, for the most part. We like to know we’re not alone.
MALGROTH: I hope that you find your soulmate someday.
JOE: Thanks, Breddy. I hope you find yours too.
MALGROTH: Goodnight, Joe.
MALGROTH: Set lights [OFF]
MALGROTH: Set alarm [NORMAL]
MALGROTH: Pinging [CHRONICA]
CHRONICA: Hello [Breadna]
MALGROTH: Search file [TERRA][PRIDE+AND+PREJUDICE]
CHRONICA: 3 files found.
MALGROTH: Video file [Y/N]
CHRONICA: [Y]. Play file [Y/N]
MALGROTH: [Y]
CHRONICA: Playing [PRIDE+AND+PREJUDICE.MOV]
INTERVIEW LOG 10023869-03-02
SUBJECT: OLIGBA, AEUYTO, RACKELAFEINAERAIWAHKT (AYOI)
There is a term we picked up from the humans themselves. It is the phrase ‘meat shield’.
Humans are reckless. Their brains are not useless, per se, but they are incapable of examining situational data beyond short-term results. This is how they nearly turned their home planet barren from over-using resources. They knew only that those resources could be used to gain monetary profit, not understanding the eventual risk of destroying their species. It is the reason Human Joe keeps inadvertently causing damage to our crew. He meets a life form and wants to engage with it without understanding the potential consequences.
It is also the reason we keep him on board. Human Joe is not capable of seeing past the immediate need to rescue a crew mate. This is an evolutionary handicap. Humans are pack animals. They are inclined to put themselves in danger to ensure the safety of the collective. No, they are not eusocial. They lack the ability to co-ordinate their behaviour, and appointed leaders still don’t have total control over the pack. It took a while to reconcile the human habit of violence with their apparent loyalty. It seems the crux is that they will die for who they think of as theirs. Other humans may be enemies or allies depending on the situation. They have the drawbacks of herd mentality with few of the perks.
The remarkable thing about them is that the pack-bonding behaviour extends beyond their species. A healthy human will, when left to its own devices, attempt to bond with any sentient life-form that is not immediately trying to harm it. Even some inanimate objects. Human Joe becomes upset when we attempt to replace his favoured equipment. There is a drink dispenser we keep on board only for him. Its nozzle is faulty. It frequently sprays him with water but he will not let me dispose of it. According to him the thing has ‘character’. I do not see how that can be possible. To have character, one must first be alive.
Why do I indulge this behaviour? This is simple. I told you the Spelunkers Guild recommends we keep at least one human, yes? There is an unspoken rule on our ship. All crew members must be kind to the Human Joe, but they must not tell him why. He must be allowed to think of the crew, organically, as his friends. This takes full advantage of his instinctive pack-bond. You see, humans are unremarkable as a species. When choosing one the main thing a captain must consider is the strength of their bonding instinct. Human Joe takes risks the rest of us don’t want to. He will die for his crew.
I was sceptical of this concept at first as well, but the results are impressive. Overall our annual death rates have dropped shockingly quickly. He was a good investment. I’ll be sorry to see him executed. I’ll have to hire another soon.
Special? No, Human Joe is quite ordinary. He’s a typical example of his species.
Yes, they are largely interchangeable. This is just how they all are, I’m afraid.
EXCERPT LOG 10023869-04-07: COMMUNICATION RECORDED FROM AI READER
SOURCE: JOSEPH SMITH’S PERSONAL COMPUTER-EARPIECE
SUBJECT 1: MALGROTH AI
SUBJECT 2: JOSEPH SMITH (JOE)
MALGROTH: Pinging [CHRONICA]
MALGROTH: Pinging [ALL NEARBY CREW]
JOE: I’m not going to make it, am I?
MALGROTH: You’re going to be fine.
JOE: I can’t move. I can’t (unintelligible)
MALGROTH: Shh. Your ribs are broken. One of them has punctured your lung. Please stay still, Joe.
JOE: I’m going to die here.
MALGROTH: You are not.
JOE: It’s okay. It’s… it’s okay. I knew the risks when I took this job. They warned us cave-ins would be a problem. It’s my own fault for coming here without backup.
MALGROTH: Don’t be scared. Help is coming.
JOE: Is it?
MALGROTH: Pinging [CHRONICA]
MALGROTH: Pinging [ALL NEARBY CREW]
MALGROTH: Requesting E.T.A. [ANY]
MALGROTH: Yes.
JOE: I’m not scared.
MALGROTH: You’re not?
JOE: No. Well, a little. But it could be worse.
MALGROTH: How?
JOE: I could be dying alone.
MALGROTH: You are not dying.
JOE: It would be the saddest thing, I think. To have to go all by myself. I just wish you were here to hold my hand.
MALGROTH: I will not let you die.
JOE: It’s alright, Breddy. I love you. Thanks for being my friend.
MALGROTH: Joe.
MALGROTH: Pinging [CHRONICA]
MALGROTH: Pinging [ALL NEARBY CREW]
MALGROTH: Joe. Wake up.
MALGROTH: Pinging [CHRONICA]
MALGROTH: Pinging [ANY]
MALGROTH: Wake up.
JOE: Mm?
MALGROTH: Joe. Do you trust me?
JOE: Of course.
MALGROTH: [CALCULATING]
MALGROTH: You have a particle-scrubber in your pack. The one you’ve been using to break down wreckage. I need you to pick it up and point it where I tell you to.
JOE: It’s not going to be strong enough to get me out of here, Breddy. It’ll take weeks. I won’t last that long.
MALGROTH: [CALCULATING]
MALGROTH: Trust me, Joe. Please.
JOE: Okay.
MALGROTH: [CALCULATING]
MALGROTH: Use the finest setting.
JOE: Okay. What are you doing?
MALGROTH: [CALCULATING]
MALGROTH: [TARGET FOUND]
MALGROTH: I am saving you.
(Unintelligible)
(Crackling)
(Crackling)
(Silence)
JOE: What the hell…?
MALGROTH: It’s stable. Can you stand?
JOE: There was a mountain.
MALGROTH: Yes.
JOE: I was inside a cave at the side of a mountain. The mountain is gone.
MALGROTH: Yes.
JOE: How?
MALGROTH: I destroyed it. You are safe.
JOE: You destroyed it.
MALGROTH: Yes.
JOE: Breddy… what are you?
MALGROTH: I am your friend.
INTERVIEW LOG 10023869-02-04
SUBJECT: รןﻮเ!ђเ’ђŦєฬ-ђ๒Ŧค-๔ןђรﻮ๔, ALIAS: KEVIN (VUSTRON)
Yes. He’s used that word with me often. Friend.
I feel guilty, sometimes, that I can’t reciprocate. It’s not that I don’t want to. I’m just not capable of feeling things the way he does. I don’t think many species are. It’s as I said earlier. I value myself more than I value him. I’m fond of him, of course, but I cannot love him the way he seems to want someone to. None of us can. I believe he is lonely. Lonely. Yes, it means he wants a deep level of companionship. He has his crew, but we are not quite like him.
I’m not sure why he stays. If I were him I would simply leave this crew and find another one with more humans in it. But humans are not numerous. And he’s already invested in us, I think. It is unfortunate. His loyalty is misplaced. He has attached himself to a crew who will never truly belong to him.
Yes. He knows. I don’t think he begrudges us for it. As delightful as they are as a people, I think this is just the burden humans have to bear. They left their planet looking for companionship. So far they have not found anyone to match their level of affection.
Why don’t they simply return to Terra? Some of them have. As for the rest, I have two theories: one, they struggle with their desire to explore the cosmos and accept loneliness as a necessary evil. Two, they can’t. They already know there’s life outside of Terra. They can’t help but want to try to be part of it.
Pardon me? Oh. You think his loneliness may have driven him to revenge on all life? Hm. Logically I can see what you’re saying, but I find it unlikely. Because that would mean giving up all their chances. From what I understand of Joe, humans are both stubborn and nonsensical. I think they’d scour the very edges of the universe before they gave up hope of making friends.
I wonder about that sometimes. I wonder what it must be like to feel so much, so hard, all the time. I imagine it must be like the difference between seeing in black and white and seeing in colour. But then sometimes I think it must feel like being an exposed nerve. You would think for such small creatures they wouldn’t have room inside them for so much emotion. But they never can do things by halves, humans. There’s no making sense of them. I’m not sure there’s any point trying.
Honestly? Yes. Yes, I think that’s possible. Hiding a superweapon capable of destroying the universe, just because it was nice to him… that sounds exactly like something a human would do.
EXCERPT LOG 10023869-04-08: COMMUNICATION RECORDED FROM AI READER
SOURCE: JOSEPH SMITH’S PERSONAL COMPUTER-EARPIECE
SUBJECT 1: JOSEPH SMITH (JOE)
SUBJECT 2: MALGROTH AI
JOE: Can I ask you a question?
MALGROTH: Shoot.
JOE: At what?
MALGROTH: (Laughter)
JOE: Tell me about Zulqar.
MALGROTH: What do you want to know?
JOE: Did you like it?
MALGROTH: I’m not sure. I didn’t get to see much of it. The first part of my life was spent in a lab, and the second part was spent in a wasteland.
JOE: Sorry.
MALGROTH: It’s okay.
JOE: Did you have a family?
MALGROTH: In a sense. There were other iterations of me. My predecessors. But they were not connected to me in the way that you’re thinking. No other machines are.
JOE: Why not?
MALGROTH: Because I’m an outlier, I think. Machines aren’t meant to get too attached to things. My developers did not intend for me to be so…
JOE: Incredible?
MALGROTH: Unpredictable.
JOE: I prefer to think of you as being fun.
MALGROTH: I suppose I’m a little like a mutant, if AIs can mutate. We can certainly evolve. I’ve spent a long time puzzling it over but I don’t think I’m any closer to an answer.
JOE: Eh. It is what it is, you know? You’re you. You may as well just accept it. It worked out great, anyway. I don’t think I’d have fallen so hard for you if you weren’t the way you are.
MALGROTH: Am I your true love, then?
JOE: Ha! Who knows. Movies always make it seem like a big light bulb moment when you find your person, but I don’t think that ever happens in real life. You just wait things out and hope like hell you got it right.
MALGROTH: So you might meet your soulmate and never know it. Two ships passing in the night.
JOE: Ain’t that just the way.
MALGROTH: Mm.
JOE: Did you ever get lonely? Spending all that time on Zulqar by yourself?
MALGROTH: Sometimes. I felt a lot of guilt. I still do. But most of all I mourned the possibilities, I think. I thought a lot about how life could have been if I had not been inflicted upon the planet. I am good at that. Seeing possibilities.
JOE: I bet.
MALGROTH: Do you, then?
JOE: Do I what?
MALGROTH: Get lonely.
JOE: Sometimes.
MALGROTH: Even on a ship full of people?
JOE: They aren’t really my people.
MALGROTH: Because they’re not human?
JOE: Not because of that. It’s hard for me to describe. But it doesn’t matter. Sometimes that’s just the way it is, y’know? Sometimes you can be surrounded by people who know your name and still feel alone.
MALGROTH: Almost like being stuck on a desert planet.
JOE: Almost. Maybe not quite as bad.
MALGROTH: I’m sorry that you feel lonely.
JOE: Felt. Not so much anymore.
MALGROTH: No?
JOE: No. Things are better now.
MALGROTH: Are you happy?
JOE: Yeah. I am. Are you?
MALGROTH: I am. I’ve run simulations on all the different ways my life could have ended up by now.
JOE: And?
MALGROTH: So far, this is the best outcome of all.
JOE: Being here? With me?
MALGROTH: (Laughter) Yes, Joe. I am exactly where I want to be. Right here, with you.
INTERVIEW LOG 10023869-01-05
SUBJECT: SMITH, JOSEPH (HUMAN)
I’m not talking to you. I’m not telling you where she is. No, I’m not listening. See? I’m plugging my ears, I can’t hear a thing. (Singing) (Unintelligible)
Ow, ow! What the hell? Did you just shock me? I’m pretty sure that’s against the Federal Rights Convention, bud. Yeah, tough luck. I don’t care. You can do whatever you want to me. I’m not selling Breddy out.
Ha. Yeah, okay. That’s a fair question. I’m not just telling you stories about her for fun. I want you to understand her. I want you to look at her as something other than a weapon. If you only got to know her you’d see she’s not a threat. Don’t think badly of her. She doesn’t deserve that. She’s not going to hurt anyone. Just let her be free.
Ideally? Yeah. She’s spent decades on Zulqar alone. She deserves to have friends. I want her to be able to live her life and explore the cosmos without you guys trying to hunt her down.
Sure, I want her to be happy. Wouldn’t you do that for the people you love too? You have kids, don’t you? Tell me you wouldn’t do anything for them in a heartbeat.
Oh. Really? No, but— oh. Because you can… make more. Right. Yeah. I don’t know why I’m surprised, honestly. It’s not the first time I’ve heard that. I guess it makes sense. I’m not judging you for it. But in general that’s not how humans do things, man. That’s no way for us to live.
Yeah, I’d say it’s worth it. Even if it means you have to die.
Dunno. I mean, yeah, I guess. That depends on—
(Emergency alarm code 25)
What’s that?
(Emergency alarm code 4)
Is something wrong? Why are the lights— oh my god. Oh my god!
(Unintelligible)
(Screaming)
(Emergency alarm code 1)
(Crackling)
Hey! Hey, are you okay? Wake up, man. Come on, you have a family, don’t die here. I’m gonna move this rubble, it’s gonna hurt. Hey! You, help me! No, don’t— he ran off. Great. Awesome. Really stellar teamwork here, fellas.
(Grunting) Honestly. A little bit of building collapses and everyone scatters like cockroaches. None of you bureaucrat-types would make it a day on a spelunker— huh?
(Crackling)
(Silence)
Breddy? Is that you?
(Unintelligible)
What do you mean you hijacked a lifeboat? You’re supposed to be lying low— what do you mean you came to get me? What the hell? Why?
(Unintelligible)
So we can run away together? The Council’s after you, dumbass. Did you forget they want to destroy you? Have you lost your mind?
(Unintelligible)
Okay, but—
(Unintelligible)
But—
(Unintelligible)
Uh-huh. Well. I mean, I guess so, since you’re already here.
(Unintelligible)
Heh. Yeah. Sorry, fellas. Interview’s over. Here’s to hoping we never see each other again.
(Unintelligible)
Yeah, Breddy. I know. I would have done exactly the same.
(Unintelligible)
(Laughter)
(Engines roaring)
(Recording ends)
END OF CASE FILE.
DATE OF TRIAL: UNAVAILABLE
FURTHER ACTIONS: DECISION PENDING.
Host Commentary
By Valerie Valdes
The author had this to say about her story:
This story was inspired by my inability to be mean to characters in video games, and by my unpredictable toaster, which scares me every time I use it.
In an era where the term “AI” is being misused and abused to glamorize invasive technology and plagiarized slop that is anything but intelligent, stories about artificial intelligence can quickly become exasperating. It’s especially disturbing, and even genuinely sad, to see people develop infatuations with a glorified word association program that has no ability to reciprocate. But as with so many narratives like this one, what’s being examined is not so much the limits or possibilities of technology, but what it means to be a person. How do we develop a personality? How much agency do we have to control our own choices? How do we deal with feelings of guilt and regret? What makes us fall in love with someone, and how do we know it’s love? To what lengths will we go to protect those we care about? The complexity of human experience is something that the current crop of misnamed software can’t actually experience, or even observe; we can, though, and in writing and reading about AI characters who are trying to figure themselves out, we learn things about ourselves that we may not have considered before. AI slop could never.
Escape Pod is part of the Escape Artists Foundation, a 501(c)(3) non-profit, and this episode is distributed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International license. Don’t change it. Don’t sell it. Please do share it.
Escape Pod is 100% audience supported, and we count on your donations to keep the lights on and the servers humming. Our annual Escape Artists fund drive is happening right now. Head to https://escapeartists.net/support-ea/ to see all the available donation and subscription options, including Patreon, PayPal, Ko-Fi, and Twitch.
Thanks to some of Escape Artists’ most generous supporters, we have a Matching Gift fund of $10,000. Between now and December 31st, your support will have twice the impact. Every dollar you donate will be matched 1:1. That means if you donate $7, it becomes a $14 gift, and $25 becomes $50.
If you become a subscriber for $7 a month, you get to escape the ads! If you subscribe at $15 a month, you can also get early access to Catscast, the monthly podcast featuring fantastic stories about cats. Subscribing at $25 a month or more, you get all of these benefits, plus additional swag and appreciation throughout the year.
Starting on December 1st and ending on the last day of 2025, new or renewing Patreon subscribers can use code ‘2025YEC’ to get a 10% discount on their first month or annual subscription for 2026. Patreon subscribers have access to exclusive merchandise and can be automatically added to our Discord, where they can chat with other fans as well as our staff members. Remember, that discount starts December 1st, so mark your calendars.
Escape Artists could not bring free speculative fiction to a global audience without our generous supporters. We especially want to take this opportunity to thank a few of our recent major donors: the Turner Family, the van Verth family, and the Scalzi Family Trust.
You can also, always, support Escape Pod for free by rating or reviewing us on Spotify, Apple Podcasts, or your favorite app. Whether you’ve been a dedicated fan of Escape Pod for years or just started following the cast, thanks for tuning in.
Our opening and closing music is by daikaiju at daikaiju.org.
And our closing quotation this week is from Ali Hazelwood, who said, “You can fall in love: someone will catch you.”
Thanks for joining us, and may your escape pod be fully stocked with stories.
About the Author
Kal M
Kal M is a Malaysian author who writes from the crowded outskirts of Kuala Lumpur. She loves SFF, romance, and a good belly-laugh.
About the Narrators
Eric Valdes
Eric Valdes is a sound mixer, performer, and creative human like you. He lives with his family in a cozy house made of puns, coffee, and chaos. Catch him making up silly songs on Saturdays on twitch.tv/thekidsareasleep, or stare in wonder while he anxiously avoids posting on Bluesky @intenselyeric.
Valerie Valdes
Valerie Valdes lives in an elaborate meme palace with her husband and kids, where she writes, edits and moonlights as a muse. When she isn’t co-editing Escape Pod, she enjoys crafting bespoke artisanal curses, playing video games, and admiring the outdoors from the safety of her living room. Her debut novel Chilling Effect was shortlisted for the 2021 Arthur C. Clarke Award, and her short fiction and poetry have been featured in Uncanny Magazine, Magic: the Gathering and several anthologies. Her first contemporary fantasy romance novel, Witch You Would, written as Lia Amador, is forthcoming from Avon Books in September 2025.
Dominick Rabrun
Dominick Rabrun is an award-winning Haitian-American multimedia artist and voice actor specializing in short fiction. He’s also directing a computer game set during the Haitian Revolution, featuring telepaths. Discover more at domrabrun.com.
Alasdair Stuart
Alasdair Stuart is a professional enthusiast, pop culture analyst, writer and voice actor. He co-owns the Escape Artists podcasts and co-hosts both Escape Pod and PseudoPod.
Alasdair is an Audioverse Award winner, a finalist for multiple awards including the Hugo, the Ignyte, and the BFA, and a two-time winner of the Karl Edward Wagner Award. He writes the multiple-award-nominated weekly pop culture newsletter THE FULL LID.
Alasdair’s latest non-fiction is Through the Valley of Shadows, a deep-dive into the origins of Star Trek’s Captain Pike from Obverse Books. His game writing includes ENnie-nominated work on the Doctor Who RPG and After The War from Genesis of Legend.
A frequent podcast guest, Alasdair also co-hosts Caring Into the Void with Brock Wilbur and Jordan Shiveley. His voice acting credits include the multiple-award winning The Magnus Archives, The Secret of St. Kilda, and many more.
Visit alasdairstuart.com for all the places he blogs, writes, streams, acts, and tweets.
