Escape Pod 354: The Caretaker
By Ken Liu
Motors whining, the machine squats down next to the bed, holding its arms out parallel to the ground. The metal fingers ball up into fist-shaped handholds. The robot has transformed into something like a wheelchair with treads, its lap the seat where my backside is supposed to fit.
A swiveling, flexible metal neck rises over the back of the chair, at the end of which are a pair of camera lenses with lens hood flaps on top like tilted eyebrows. There’s a speaker below the cameras, covered by metal lips. The effect is a cartoonish imitation of a face.
“It’s ugly,” I say. I try to come up with more, but that’s the only thing I can think of.
Lying on the bed with my back and neck propped up by all these pillows reminds me of long-ago Saturday mornings, when I used to sit up like this in bed, trying to catch up on grading while Peggy was still asleep next to me. Suddenly, Tom and Ellen would burst through the bedroom door without knocking and jump into the bed, landing on top of us in a heap, smelling of warm blankets and clamoring for breakfast.
Except now my left leg is a useless weight, anchoring me to the mattress. The space next to me is empty. And Tom and Ellen, standing behind the robot, have children of their own.
“It’s reliable,” Tom says. Then he seems to have run out of things to say, too. My son is like me, awkward with words when the emotions get complicated.
After a few seconds of silence, his sister steps forward and stands next to the robot. Gently, she bends down to put a hand on my shoulder. “Dad, Tom is running out of vacation days. And I can’t take any more time off either because I need to be with my husband and kids. We think this is best. It’s a lot cheaper than a live-in aide.”
It occurs to me that this would make an excellent illustration of the arrow of time: the care that parents devote to children is asymmetrical with the care that their children can reciprocate. Far more vivid than any talk of entropy.
Too bad I no longer have students to explain this to. The high school has already hired a new physics teacher and baseball coach.
I don’t want to get maudlin here and start quoting Lear. Hadn’t Peggy and I each left our parents to the care of strangers in faraway homes? That’s life.
Who wants to weigh their children down the way my body now weighs me down? My guilt should trump theirs. We are a nation built on the promise that there are no roots. Every generation must be free to begin afresh somewhere else, leaving the old behind like fallen leaves.
I wave my right arm – the one arm that still obeys me. “I know.” I would have stopped there, but I keep going because Peggy would have said more, and she’s always right. “You’ve done more than enough. I’ll be fine.”
“It’s pretty intuitive to operate,” Ellen says. She doesn’t look at me. “Just talk to it.”
The robot and I stare at each other. I look into the cameras, caricatures of eyes, and see nothing but a pair of distorted, diminished images of myself.
I understand the aesthetics of its design, the efficient, functional skeleton softened by touches of cuteness and whimsy. Peggy and I once saw a show about caretaker robots for the elderly in Japan, and the show explained how the robots’ kawaii features were intended to entice old people into becoming emotionally invested in and attached to the lifeless algorithm-driven machines.
I guess that’s me now. At sixty, with a stroke, I’m _old_ and an invalid. I need to be taken care of and fooled by a machine.
“Wonderful,” I say. “I’m sure we’ll be such pals.”
“Mr. Church, would you like to read my operation manual?”
The machine’s metal lips flap in sync with the voice, which is pleasant, androgynous, and very “computery.” No doubt that was a decision made after a lot of research to avoid the uncanny valley. Make the voice too human, and you actually diminish the ability to create false empathy.
“No, I don’t want to read your operation manual. Does it look like I want to hold up a book?” I lift my limp left arm with my right and let it drop. “But let me guess, you can lift me, carry me around, give me a restored sense of mobility, and engage me in healthy positive chitchat to maintain my mental health. Does that about cover it?”
My outburst seems to shock the machine into silence. I feel good for a few seconds before the feeling dissipates. Great, the highlight of my day is yelling at a glorified wheelchair.
“Can you help me up?” I feel foolish, trying to be polite to a machine. “I’d like a … bath. Is that something you can help with?”
Its movements are slow and mechanical, nonthreatening. The arms are steady and strong, and it gets me undressed and into the bathtub without any awkwardness. There is an advantage to having a machine take care of you: you don’t feel much self-consciousness or shame being naked in its arms.
The hot bath makes me feel better.
“What should I call you?”
That’s probably some clever acronym that the marketing team came up with after a long lunch. Sunshine Autonomous Nursing Device? I don’t really care. “Sandy” it is.
According to Sandy, for “legal reasons,” I’m required to sit and listen to a recorded presentation from the manufacturer.
“Fine, play it. But keep the volume down and hold the crossword steady, would you?”
Sandy holds the folded-up paper at the edge of the tub with its metal fingers while I wield the pencil in my good hand. After a musical introduction, an oily, rich voice comes out of Sandy’s speaker.
“Hello. I’m Dr. Vincent Lyle, Founder and CEO of Sunshine Homecare Solutions.”
Five seconds in, and I already dislike the man. He takes far too much pleasure in his own voice. I try to tune him out and focus on the puzzle.
“… without the danger of undocumented foreign homecare workers, possible criminal records, and the certain loss of your privacy …”
Ah, yes, the scare to seal the deal. I’m sure Sunshine had a lot to do with those immigration reform bills and that hideous Wall. If this were a few years earlier, Tom and Ellen would have hired a Mexican or Chinese woman, probably an illegal, very likely not speaking much English, to move in here with me. That choice is no longer available.
“… can be with you, 24/7. The caretaker is never off-duty … ”
I don’t have a problem with immigrants, per se. I’d taught plenty of bright Mexican kids in my class – some of them no doubt undocumented – back when the border still leaked like a sieve. Peggy was a lot more sympathetic with the illegals and thought the deportations too harsh. But I don’t think there’s a right to break the law and cross the border whenever you please, taking jobs away from people born and raised here.
Or from American robots. I smirk at my little mental irony.
I look up at Sandy, who lifts the lens hood flaps over its cameras in a questioning gesture, as if trying to guess my thoughts.
“… the product of the hard work and dedication of our one hundred percent American engineering staff, who hold over two hundred patents in artificial intelligence …”
Or from American engineers, I continue musing. Low-skilled workers retard progress. Technology will always offer a better solution. Isn’t that the American way? Make machines with metal fingers and glass eyes to care for you in your twilight, machines in front of which you won’t be ashamed to be weak, naked, a mere animal in need, machines that will hold you while your children are thousands of miles away, absorbed with their careers and their youth. Machines, instead of other people.
I know I’m pitiful, pathetic, feeling sorry for myself. I try to drive the feelings away, but my eyes and nose don’t obey me.
“… You acknowledge that Sunshine has made no representation that its products offer medical care of any kind. You agree that you assume all risks that Sunshine products may …”
Sandy is just a machine, and I’m alone. The idea of the days, weeks, years ahead, my only company this machine and my own thoughts, terrifies me. What would I not give to have Peggy back?
I’m crying like a child, and I don’t care.
“… Please indicate your acceptance of the End User Agreement by clearly stating ‘yes’ into the device’s microphone.”
I don’t realize that I’m shouting until I see Sandy’s face “flinch” away from me. The idea that even a robot finds me frightening or repulsive depresses me further.
I lower my voice. “If your circuits go haywire and you drop me from the top of the stairs, I promise I won’t sue Sunshine. Just let me finish my crossword in peace.”
“Would you drop me out the upstairs window if I order you to?”
“Have a lot of safeguards in those silicon chips, do you? But shouldn’t you prioritize my needs above everything else? If I want you to throw me down the stairs or choke me with your pincers, shouldn’t you do what I want?”
“What if I ask you to leave me in the middle of some train tracks and order you to stay away? You wouldn’t be actively causing my death. Would you obey?”
It’s no fun debating moral philosophy with Sandy. It simply refuses to be goaded. I’ve not succeeded in getting sparks to flow from its head with artfully constructed hypotheticals the way they always seem to do in sci-fi flicks.
I’m not sure if I’m suicidal. I have good days and bad days. I haven’t broken down and cried since that first day in the bathtub, but it would be a stretch to say that I’ve fully adjusted to my new life.
Conversations with Sandy tend to be calming in a light, surreal manner that is likely intentional on the part of Sandy’s programmers. Sandy doesn’t know much about politics or baseball, but just like all the kids these days, it’s very adept at making web searches. When we watch the nightly game on TV, if I make a comment about the batter in the box, Sandy generally stays silent. Then, after a minute or so, it will pop in with an obscure statistic and a non-sequitur comment that’s probably cribbed verbatim from some sabermetrics site it just accessed wirelessly. When we watch the singing competitions, it will offer observations about the contestants that sound like it’s reading from the real-time stream of tweets on the Net.
Sandy’s programming is surprisingly sophisticated. Sunshine apparently put a great deal of care into giving Sandy “weaknesses” that make it seem more alive.
For example, I discovered that Sandy didn’t know how to play chess, and I had to go through the charade of “teaching” it even though I’m sure it could have downloaded a chess program in seconds. I can even get Sandy to make more mistakes during a game by distracting it with conversation. I guess letting the invalid win contributes to psychological well-being.
Late morning, after all the kids have gone to school and the adults are away at work, Sandy carries me out for my daily walk.
It seems as pleased and excited to be outside as I am – swiveling its cameras from side to side to follow the movements of squirrels and hummingbirds, zooming its lenses audibly on herb gardens and lawn ornaments. The simulated wonder is so real that it reminds me of the intense way Tom and Ellen used to look at everything when I pushed them along in a double stroller.
Yet Sandy’s programming also has surprising flaws. It has trouble with crosswalks. The first few times we went on our walks, it did not bother waiting for the WALK signal. It just glanced around and dashed across with me when there was an opening in the traffic, like an impatient teenager.
Since I’m no longer entertaining thoughts of creatively getting Sandy to let me die, I decide that I need to speak up.
“Sunshine is going to get sued if a customer dies because of your jaywalking, you know? That End User Agreement isn’t going to absolve you from such an obvious error.”
Sandy stops. Its “face,” which usually hovers near mine on its slender stalk of a neck when we’re on walks like this, swivels away in a facsimile of embarrassment. I can feel the robot settling lower in its squat.
My heart clenches up. Looking away when admonished was a habit of Ellen’s when she was younger. She would blush furiously when she felt she had disappointed me, and not let me see the tears that threatened to roll down her cheeks.
“It’s all right,” I say to Sandy, my tone an echo of the way I used to speak to my little daughter. “Just be more careful next time. Were your programmers all reckless teenagers who believe that they’re immortal and traffic laws should be optional?”
Sandy shows a lot of curiosity in my books. Unlike a robot from the movies, it doesn’t just flip through the pages in a few seconds in a fluttering flurry. Instead, if I’m dozing or flipping through the channels, Sandy settles down with one of Peggy’s novels and reads for hours, totally absorbed, just like a real person.
I ask Sandy to read to me. I don’t care much for fiction, so I have it read me long-form journalism, and news articles about science discoveries. For years it’s been my habit to read the science news to look for interesting bits to share with my class. Sandy stumbles over technical words and formulas, and I explain them. It’s a little bit like having a student again, and I find myself enjoying “teaching” the robot.
This is probably just the result of some kind of adaptive programming in Sandy intended to make me feel better, given my past profession. But I get suckered into it anyway.
I wake up in the middle of the night. Moonlight falls through the window to form a white rhombus on the floor. I imagine Tom and Ellen in their respective homes, sound asleep next to their spouses. I think about the moon looking in through their windows at their sleeping faces, as though they were suddenly children again. It’s sentimental and foolish. But Peggy would have understood.
Sandy is parked next to my bed, the neck curved around so that the cameras face away from me. It gives the impression of a sleeping cat. So much for being on duty 24/7, I think. Simulating sleep for a robot carries the anthropomorphism game a bit too far.
“Sandy. Hey, Sandy. Wake up.”
No response. This is going to have to be another feedback item for Sunshine. Would the robot “sleep” through a heart attack? Unbelievable.
I reach out and touch the arm of the robot.
It sits up in a whirring of gears and motors, extending its neck around to look at me. A light over the cameras flicks on and shines in my face, and I have to reach up to block the beam with my right hand.
“Are you okay?” I can actually hear a hint of anxiety in the electronic voice.
“I’m fine. I just wanted a drink of water. Can you turn on the bedside lamp and turn off that infernal laser over your head? I’m going blind here.”
Sandy rushes around, its motors whining, and brings me a glass of water.
“What happened there?” I ask. “Did you actually fall asleep? Why is that even part of your programming?”
“I’m sorry,” Sandy says. It really does seem contrite. “It was a mistake. It won’t happen again.”
I’m trying to sign up for an account on this website so I can see the new pictures of the baby posted by Ellen.
The tablet is propped up next to the bed. Filling in all the information with the touch screen keyboard is a chore. Since the stroke, my right hand isn’t at a hundred percent either. Typing feels like poking at elevator buttons with a walking stick.
Sandy offers to help. With a sigh, I lean back and let it. It fills in my personal information without asking. The machine now knows me better than my kids. I’m not sure that either Tom or Ellen remembers the street I grew up on – necessary for the security question.
The next screen asks me to prove I’m a human to prevent spambots from signing up. I hate these puzzles where you have to decipher squiggly letters and numbers in a sea of noise. It’s like going to an eye exam. And my eyes aren’t what they used to be, not after years of trying to read the illegible scribbles of teenagers who prefer texting to writing.
The puzzles they use on this site are a bit different. Three circular images are presented on the page, and I have to rotate them so the images are oriented right-side-up. The first image is a zoomed-in picture of a parrot perched in some leaves, the bird’s plumage a cacophony of colors and abstract shapes. The second shows a heaped jumble of plates and glasses lit with harsh lights from below. The last is a shot of some chairs stacked upside-down on a table in a restaurant. All are rotated to odd angles.
Sandy reaches out with a metal finger and quickly rotates the three images to the correct orientation. It hits the submit button for me.
I get my account and the pictures of little Maggie fill the screen. Sandy and I spend a long time looking at them, flipping from page to page, admiring the new generation.
I ask Sandy to take a break and clean up in the kitchen. “I want to be by myself for a while. Maybe take a nap. I’ll call you if I need anything.”
When Sandy is gone, I pull up the Web search engine on the tablet and type in my query, one shaky letter at a time. I scan through the results.
The seemingly simple task of making an image upright is quite difficult to automate over a wide variety of photographic content … The success of our CAPTCHA rests on the fact that orienting an image is an AI-hard problem.
My God, I think. I’ve found the man in the Mechanical Turk.
“Who’s in there?” I ask, when Sandy comes back. “Who’s really in there?” I point my finger at the robot and stare into its cameras. I picture a remote operator sitting in an office park somewhere, having a laugh at my expense.
Sandy’s lens hoods flutter wide open, as if the robot is shocked. It freezes for a few seconds. The gesture is very human. An hour ago I would have attributed it to yet more clever programming, but not now.
It lifts a finger to its metallic lips and opens and closes the diaphragms in its cameras a few times in rapid succession, as though it were blinking.
Then, very deliberately, it turns the cameras away so that they are pointing into the hallway.
“There’s no one in the hall, Mr. Church. There’s no one there.”
Keeping the camera pointing away, it rolls up closer to the bed. I tense up and am about to say something more when it grabs the pencil and the newspaper (turned to today’s crossword) on the nightstand, and begins to write rapidly without the paper being in the cameras’ field of view. The letters are large, crude, and difficult to read.
PLEASE. I’LL EXPLAIN.
“My eyes seem to be stuck,” it says to the empty air, the voice as artificial as ever. “Give me a second to unjam the motors.” It begins to make a series of whirring and high-pitched whining noises as it shakes the assembly on top of its neck.
WRITE BACK. MOVE MY HAND.
I grab Sandy’s hand, the metal fingers around the pencil cool to the touch, and begin to print laboriously in capital letters. I’m guessing there is some feedback mechanism allowing the operator to feel the motions.
COME CLEAN. OR I CALL POLICE.
With a loud pop, the cameras swivel around. They are pointed at my face, still keeping the paper and the writing out of view.
“I need to make some repairs,” Sandy says. “Can you rest while I deal with this? Maybe you can check your email later if you’re bored.”
I nod. Sandy props up the tablet next to the bed and backs out of the room.
Dear Mr. Church,
My name is Manuela Aida Álvarez Ríos. I apologize for having deceived you. Though the headset disguises my voice, I can hear your real voice, and I believe you are a kind and forgiving man. Perhaps you will be willing to hear the story of how I came to be your caretaker.
I was born in the village of La Gloria, in the southeastern part of Durango, Mexico. I am the youngest of my parents’ three daughters. When I was two, the whole family made its way north into California, where my father picked oranges and my mother helped him and cleaned houses. Later, we moved to Arizona, where my father took what jobs he could find and my mother took care of an elderly woman. We were not rich, but I grew up happy and did well in school. There was hope.
One day, when I was thirteen, the police raided the restaurant where my father worked. There was a TV crew filming. People lined up on the streets and cheered as my father and his friends were led away in cuffs.
I do not wish to argue with you about the immigration laws, or why it is that our fates should be determined by where we were born. I already know how you feel.
We were deported and lost everything we had. I left behind my books, my music, my American childhood. I was sent back to a country I had no memories of, where I had to learn a new way of life.
In La Gloria, there is much love, and family is everything. The land is lush and beautiful. But how you are born there is how you will die, except that the poor can get poorer. I understood why my parents had chosen to risk everything.
My father went back north by himself, and we never heard from him again. My sisters went to Mexico City, and sent money back. We avoided talking about what they did for a living. I stayed to care for my mother. She had become sick and needed expensive care we could not afford.
Then my oldest sister wrote to tell me that in one of the old maquiladoras over in Piedras Negras, they were looking for girls like my sisters and me: women who had grown up in the United States, fluent in its language and customs. The jobs paid well, and we could save up the money my mother needed.
The old factory floor has been divided into rows of cubicles with sleeping pads down the aisles. Each girl has a headset, a monitor, and a set of controls before her like the cockpit of a plane on TV. There’s also a mask for the girl to wear, so that her robot can smile.
Operating the robot remotely is very hard. There is no off-time. I sleep when you do, and an alarm wakes me when you are awake. When I need to use the bathroom, I must wait until one of the other girls with a sleeping client can take over for me for a few minutes.
I do not mean to say that I am unhappy caring for you. I think of my mother, whose work had been very much like mine. She’s in bed back home, cared for by my cousins. I am doing for you what I wish I could be doing for her.
It is bittersweet for me to watch your life in America, seeing those wide streets and quiet neighborhoods through the camera. I enjoy my walks with you.
It is forbidden to let you know of my existence. I will be fined and fired if you choose to report it. I pray that you will keep this our secret and allow me to care for you.
Tom calls and reveals that he has been getting copies of my bank statements. It was a necessary precaution, he explains, back when I was in the hospital.
“I need some privacy,” I say to Manuela. She scoots quickly out of the room.
“Dad, I saw in last month’s statement a transfer to Western Union. Can you explain? Elle and I are concerned.”
The money was sent to a former student of mine, who’s spending the summer traveling in Durango. I asked him to look up La Gloria, and if he can locate Manuela’s family, to give the money to them.
“Who should I say the money is from?” he had asked.
“El Norte,” I had said. “Tell them it’s money that is owed to them.”
I imagine Manuela’s family trying to come up with explanations. Perhaps Manuela’s father sent the money, and is trying to do so without giving himself away to the authorities. Perhaps the American government is returning to us the property that we lost.
“I sent some money to a friend in Mexico,” I tell my son.
“You don’t know her.”
“How did you meet her?”
“Through the Internet.” It’s as close to the truth as anything.
Tom is quiet. He’s trying to figure out if I’ve lost my mind.
“There are a lot of scams on the Internet, Dad,” he says. I can tell he’s working hard to keep his voice calm.
“Yes, that’s true,” I say.
Manuela returns for my bath. Now that I know the truth, I do feel some embarrassment. But I let her undress me and carry me into the tub, her movements as steady and gentle as ever.
“Thank you,” I say.
“You are welcome.” The mechanical voice is silent a while. “Would you like me to read to you?”
I look into the cameras. The diaphragms open and close, slowly, like a blink.
[Author’s Note: The image-orientation CAPTCHA Reverse Turing Test (or “Human Interactive Proof”) is described by Rich Gossweiler, Maryam Kamvar, and Shumeet Baluja in “What’s Up CAPTCHA? A CAPTCHA Based On Image Orientation,” first published in Proceedings of the 18th International Conference on World Wide Web (Madrid, Spain, April 20 – 24, 2009). The quote in the story is taken from that paper, a copy of which may be retrieved at: http://www.richgossweiler.com/projects/rotcaptcha/rotcaptcha.pdf]
About the Author
A winner of the Nebula, Hugo, and World Fantasy awards, Ken Liu is the author of The Dandelion Dynasty, a silkpunk epic fantasy series (The Grace of Kings (2015), The Wall of Storms (2016), and a forthcoming third volume) and The Paper Menagerie and Other Stories (2016), a collection. He also wrote the Star Wars novel, The Legends of Luke Skywalker (2017).
About the Narrator
Tom Rockwell, AKA Devo Spice, is a comedy-rapper from New Jersey who has become one of the most popular artists on the Dr. Demento Show. He was the founding member of Sudden Death, the comedy rap group that had the #1, #2, and #4 most requested songs of 2007 on the Dr. Demento Show (“Cellular Degeneration,” “Getting Old Sucks,” and “Pillagers,” respectively).
Devo Spice performs regularly across the country at music clubs, comedy clubs, and science fiction conventions. He has shared the stage with Dr. Demento, MC Lars, MC Frontalot, Jonathan Coulton, Paul and Storm, and many others. The live show features videos and animations synchronized with the music, resulting in a hilarious show that goes over very well with a variety of audiences.