Backstory: Here Comes The Singularity

“Science fiction writers don’t predict the future (except accidentally),” argues novelist Cory Doctorow in an essay called “Radical Presentism.” “But if they’re very good, they may manage to predict the present.” By way of example, he goes on to gloss some of the most famous speculative literature of the 19th and 20th centuries: “Mary Shelley wasn’t worried about reanimated corpses stalking Europe, but by casting a technological innovation in the starring role of Frankenstein, she was able to tap into present-day fears about technology overpowering its masters and the hubris of the inventor. Orwell didn’t worry about a future dominated by the view-screens from 1984, he worried about a present in which technology was changing the balance of power, creating opportunities for the state to enforce its power over individuals at ever-more-granular levels.” 

Playwright Jordan Harrison tends to agree. In a recent interview for American Theatre magazine, he told Madeleine George, “Most science fiction is actually about now. Or at least my favorite science fiction involves memorializing the way things are — which so quickly becomes the past.” In Marjorie Prime, Harrison conceives a technology currently beyond our grasp: namely, “Primes,” advanced holographic replicas of deceased loved ones, who can learn to behave in human-like ways. But its anxieties, insights, and moments of grace feel firmly rooted in a world that audiences will recognize. The Primes, and their owners’ deep need to connect with them, or their resistance to doing so, offer a dramatic occasion to reflect on mortality, loss, grief, the passing of time, what it means to love another person and, even more fundamentally, what it means to be human.

“We’re trying to build something that communicates with humans and doesn’t just wait for the human to tell it what to do.”

A few days ago I sat down with my laptop and a cup of tea and, curious to learn more about the technological landscape in which I’m living (and what Science is cooking up for my grandkids), ran a Google news search for the phrase “artificial intelligence.” That led me down a long and winding rabbit hole, the upshot being something along the lines of We live in interesting times, y’all, but what really seems to be making headlines as I write this is something called “deep learning”: a process in which computer software sifts through large amounts of data and, by identifying patterns, develops a kind of autonomous creative intelligence. This technology is already being harnessed to detect financial fraud and to improve the voice recognition on your smartphone, and will probably lay the groundwork for other major advancements across sectors in the years to come. DARPA, for example, just funded a grant for a University of Arizona School of Music professor to build software that, by “listening” to and learning from patterns in a large library of jazz recordings, will eventually develop the ability to improvise and jam with human players. (Yes, that’s right: a robot that can jam, paid for by the Defense Advanced Research Projects Agency.) “It has its own knowledge base and can make its own decisions,” explained Professor Kelland Thomas of his program, MUSICA, or Music Improvising Collaborative Agent. “We’re trying to build something that communicates with humans and doesn’t just wait for the human to tell it what to do.” Whether you think this is good news or bad could be an interesting psycho-spiritual Rorschach test, especially considering who’s footing the bill.
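For the curious, the core idea of learning from patterns can be seen in miniature. What follows is a toy sketch only, not the MUSICA program or any real deep-learning system: a single artificial “neuron” that, given nothing but labeled examples, adjusts itself until it can tell two clusters of points apart. Real deep learning stacks millions of such units, but the underlying loop is the same: guess, measure the error, nudge, repeat.

```python
def train(examples, steps=1000, rate=0.1):
    """Learn weights (w1, w2, bias) from (x, y, label) examples.

    A classic single-neuron (perceptron) learner: for each example,
    make a guess, compare it to the true label, and nudge the
    weights in the direction that would have made the guess better.
    """
    w1 = w2 = b = 0.0
    for _ in range(steps):
        for x, y, label in examples:
            guess = 1.0 if w1 * x + w2 * y + b > 0 else 0.0
            error = label - guess      # how wrong was the guess?
            w1 += rate * error * x     # adjust each weight a little
            w2 += rate * error * y
            b += rate * error
    return (w1, w2, b)

def predict(model, x, y):
    """Classify a new point with the learned weights."""
    w1, w2, b = model
    return 1 if w1 * x + w2 * y + b > 0 else 0

# Two clusters of points: "low" ones labeled 0, "high" ones labeled 1.
data = [(1, 1, 0), (2, 1, 0), (1, 2, 0),
        (5, 5, 1), (6, 5, 1), (5, 6, 1)]
model = train(data)
```

No one told the program where the boundary between the clusters lies; it found one itself, which is the sense in which such software “develops” its abilities rather than being handed them.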

Though it’s not called out by name, deep learning (albeit far more advanced than anything that exists today) also seems to be the technology behind Marjorie’s Primes, which, through ongoing conversation, become ever-more compelling facsimiles of their human models. “It’s like a child learning to talk, only it does it so quickly,” explains one character of their development. “That’s how we think we’re talking to a human, because it listens so well.”

Futurists like to talk about the “Singularity,” what Doctorow describes as “the moment at which human and machine intelligence merge, creating a break with history beyond which the future cannot be predicted, because the post-humans who live there will be utterly unrecognizable to us in their emotions and motivations.” Whether and when it will come and what it will mean is a subject ripe for discussion by people less freaked out by the whole idea than yours truly (a Luddite who recently went back to using a dumbphone), but by way of circling around it a bit — 

Madeleine George, in that same American Theatre interview, suggests that it’s life’s limits that give it meaning. “Don’t you think the reason we have emotions is because we’re mortal?” she asks. To my fascination, Jordan was a bit more circumspect, not willing to say unequivocally that the Primes feel nothing. 

He also talks about the 1982 television movie The Electric Grandmother, an adaptation of Ray Bradbury’s short story “I Sing the Body Electric,” in which, after their mother dies, three young children go to a robot factory with their dad to order a bespoke grandmother. Intrigued, I found the movie on YouTube, a grainy three-part upload from a disintegrating VHS tape. The electric grandmother is airlifted to their home by helicopter in a container that looks, as it hovers above their lawn, a little like antique SCUBA equipment, but is actually a sarcophagus. She dotes on the children and does their laundry and shoots milk for little Agatha’s breakfast out of a pressurized valve at the tip of her fleshy robot finger, until, one by one, the children grow up and go off to start their lives. The electric grandmother goes back to the factory, where she’ll wait until they need her again.

Sitting in a circle of rocking chairs with the other electric grandmas (has she been there one year, or 40?), she recalls the first time little Agatha said “I love you.” “Sometimes,” she says to the other robot grandmas, “I almost feel that I feel.” And another replies, “So do I, sometimes.”

What would it mean for the next significant evolution of intelligent life to be not biological, but technological? Can a smartphone have a soul? I’m inclined to say no — but that might be very 21st-century of me.

Sarah Lunnie
Literary Manager