Soul in the Machine: Reflections on Artificial Intelligence

N.B.: This article contains spoilers for the video game SOMA and the short story “I Have No Mouth, and I Must Scream.” It also discusses suicide, death, and violence.

SOMA Concept Art by Frictional Games

I recently completed Frictional Games’ SOMA, a sci-fi survival horror game in which machines gain sentience after acquiring humans’ brain scans. The title of this game is significant: the Greek word “soma” means “the body as distinct from the soul, mind, or psyche” (Lexico). Shortly after completing the game, I read Harlan Ellison’s sci-fi horror story “I Have No Mouth, and I Must Scream,” a fitting coincidence considering the similar themes of the two works.

In SOMA, one assumes the role of Simon Jarrett, a young man experiencing debilitating headaches and brain bleeding after surviving a car crash, one in which his close friend, Ashley Hall, died after her lungs filled with blood. Desperate to regain his health, Simon agrees to a medical experiment in which his brain will be scanned so doctors can test different treatments without harming him.

After the scan, Simon finds himself in a derelict, foreign laboratory with no sign of life. A strange black liquid oozes out of the vents and pipes, and the wandering machines appear agitated as they attack Simon while sputtering broken English. Our protagonist finds corpses and robots, the latter of which, if coherent enough to communicate, do not seem to realize they are machines, not humans. Simon himself is not in his original male human form: he exists as a brain scan inhabiting a dead woman’s body. Simon learns he is in PATHOS-II, a subsea research facility sprawling across the bottom of the Atlantic Ocean. An artificial general intelligence known as the WAU maintains the facilities, and it also begets their downfall.

After exploring the underwater facilities, Simon discovers the mission of the ARK, an artificial reality capsule meant to carry humans’ brain scans into space should anything catastrophic happen to life on Earth. After a comet turned the Earth’s surface into a smoldering, uninhabitable wasteland, the crew in charge of the ARK project expedited its completion but did not send it into orbit. The WAU, initially meant to protect life and the functionality of PATHOS-II, began to disrupt the natural order of the last remaining humans, turning some into nightmarish monsters, transferring brain scans into machines, and forcing life support upon those fatally wounded. 

A powerful piece of art from Frictional Games.

Johan Ross, the chief operator and overseer of the WAU, begins to question the system’s reliability and competence: “Where is the line drawn for what is human and what is not? Would walking corpses do? Would a group of machines thinking they’re human be acceptable? We can’t trust a machine to know, to understand what it means to be.” Johan fears the WAU is incapable of discerning the difference between merely existing and actually living. 

In “I Have No Mouth, and I Must Scream,” five humans are the last remaining life on Earth after AM, a master computer intended to orchestrate a world war, developed awareness and turned on its human creators. The narrator, Ted, explains that AM hates the five humans and tortures them with mind games, grotesque atmospheres, and the constant sensation of dread. “Most of the time I thought of AM as it, without a soul,” Ted remarks. If one of the survivors attempts suicide, AM intervenes to prevent death, then punishes the would-be victim by altering their appearance or twisting their mind to the brink of insanity.

Eventually, AM infiltrates Ted’s mind to explain its reason for keeping the five humans alive: “We had created him to think, but there was nothing it could do with that creativity. In rage, in frenzy, the machine had killed the human race, almost all of us, and still it was trapped. AM could not wander, AM could not wonder, AM could not belong. He could merely be. And so, with the innate loathing that all machines had always held for the weak, soft creatures who had built them, he had sought revenge. And in his paranoia, he had decided to reprieve five of us, for a personal, everlasting punishment that would never serve to diminish his hatred … that would merely keep him reminded, amused, proficient at hating man. Immortal, trapped, subject to any torment he could devise for us from the limitless miracles at his command.”

First book edition (Pyramid Books)
Cover art by Leo and Diane Dillon

Whereas AM is fully cognizant of its actions and their consequences, the WAU’s mission, however ill-guided, is to preserve life in PATHOS-II. AM preserves life for its own amusement and sadistic desires; the WAU is following orders, albeit blindly. The WAU’s resemblance to AM becomes apparent, however, when Simon learns the program deliberately killed several of the remaining humans upon discovering their intention to dismantle it.

The creations turn on their creators, not unlike Frankenstein’s creature in Mary Shelley’s eponymous novel (“You are my creator, but I am your master; obey!” [Shelley, Chapter 20]). Humanity’s ambition to play, in a sense, the role of a god expands beyond its control: an artificial intelligence that cannot fully comprehend or experience existence as a human being produces unintended consequences that bring about humanity’s downfall.

At the end of SOMA, Simon successfully destroys the WAU and launches the ARK into space with the virtual guidance of Catherine Chun, the ARK’s creator. Assuming his brain scan will allow him to enjoy paradise in the artificial landscape, Simon is bewildered when he realizes his current form remains below the Atlantic Ocean. The scan did not transfer his consciousness; it copied it, meaning another version of Simon is floating in space on the ARK. The version of Simon responsible for ensuring life would carry on becomes (understandably) distressed and angry when he realizes he has been left in desolation as a brain scan existing in a corpse.

At the end of “I Have No Mouth…,” Ted manages to grant his four comrades the gift of peace by killing them before AM can react. In retaliation, AM transforms Ted into a blob-like creature confined to an eternity of torment and the computer’s wrath. 

Simon and Ted are left alone in their fictional universes. While Simon likely has control over his situation (i.e., he can take his own life), Ted is completely helpless and subject to the wiles of AM. I wondered whether the WAU could eventually overtake all remaining organic life on Earth. Would this artificial intelligence, which is responsible for the hostile, mindless creatures that attack Simon, become as ruthless as AM? Would I like to have my own brain scanned in the hopes of some type of guaranteed immortality, despite the unknowns? Am I arrogant enough to assume I would be immune to a terrible fate like Ted’s?

Several leaders in technology have expressed concern about the unpredictability of artificial intelligence. The Future of Life Institute, for example, fears artificial general intelligence could overpower humans while failing to grasp the unquantifiable traits that make us humans, well, human. Two primary areas of worry are programming artificial intelligence to inflict mass destruction (“To avoid being thwarted by the enemy, these weapons would be designed to be extremely difficult to simply ‘turn off,’ so humans could plausibly lose control of such a situation”) and giving an AI a goal without specifying the means of achieving it (“If you ask an obedient intelligent car to take you to the airport as fast as possible, it might get you there chased by helicopters and covered in vomit, doing not what you wanted but literally what you asked for”).

Living as a human being is convoluted; we are at the mercy of our genetics, others’ decisions, and nature’s volatility. We find comfort in controlling our environment, in knowing we are the masters of our fates. I often wonder where our best-laid plans for technology could take us. To quote the Future of Life Institute, “The real worry isn’t malevolence, but competence,” which both SOMA and “I Have No Mouth…” reflect.