Last month, the New Yorker published a piece by Leslie Jamison about gaslighting, that phenomenon where an abuser manages to convince their victim they’re just imagining things. The term is ubiquitous online, to the point of diluting its meaning, but it’s still useful. It’s especially relevant now that we all suspect we’re being deceived constantly — whether it’s news media, or AI, or spam, or identity theft, or fraud, or literally any interaction whatsoever. And of course it’s a form of abuse that works directly against the key human instincts of trusting and respecting others, and considering that we might be wrong. To live in a society, we must make ourselves vulnerable to being misled, to being abused. How can we consider that we might be wrong, without letting someone else convince us that we’re wrong? How do we know what parts of reality are real, what are made up, what are merely distortions of our perception?
I’ve read two novels recently that grapple with that question in completely different ways. The first is Peter Watts’s 2006 Blindsight, a hard sci-fi tale that goes deep into the sort of truly weird philosophical questions that bedevil undergraduates late at night.
“Imagine that you’re Siri Keeton,” narrator Siri Keeton asks us. Imagine that due to neurosurgery and abuse in childhood, you have no endogenous emotions, and have had to simulate appropriate human behavior your whole life. That, alongside a ton of cybernetic augmentations, has made you an excellent analyst of human and AI behavior. It’s also destroyed all your relationships, because you can’t stop thinking of things like “love” and “sincerity” as evolutionary strategies. You, a few other heavily-augmented humans, and an AI captain are assigned to investigate an alien presence in the Kuiper belt.
The aliens seem to be able to speak, but the linguist on board concludes they’re not capable of understanding; they’re a sort of Chinese room that receives and sends signals without any comprehension. Talking to them is like holding a conversation with ChatGPT: it’s confident, it sounds like meaning, but it’s meaningless.
The aliens are accompanied by ridiculously high levels of radiation and magnetic interference, which distort human perceptions. In their presence, crew members cycle through an entire Oliver Sacks book of neurological symptoms: one crew member becomes briefly convinced they’re dead; another feels the presence of God, and so on. That’s where the blindsight of the title comes in — a character is temporarily convinced they’ve gone blind, but can still guess where things are, because they’ve lost conscious awareness of their perfectly functional eyeballs and optic nerves.
After capturing a few specimens, the humans determine that the aliens don’t have any sort of consciousness at all. They’re a fantastically advanced interstellar presence, but they don’t waste any of their evolutionary capacity on thinking. It’s not at all like the AI captain, because our AI is conscious, isn’t it? And we are conscious as well. Of course. We’re not zombies. We have feelings and self-awareness. Sort of. Most of us, most of the time. Aren’t we?
Eventually, hostilities break out, and everyone except the narrator dies. He’s trapped in stasis in an escape pod, awakened periodically for maintenance, years from Earth, which is dying anyway. Consciousness, in other words, is an evolutionary dead end, as is humanity itself. There is no moral to the story.
Some Desperate Glory, by Emily Tesh, is much more hopeful, although still incredibly disturbing. Instead of imagining that you’re a functional sociopath, Tesh asks you to imagine that you’re one of the last humans to survive a war against an impossibly powerful alien alliance that destroyed the earth. You’ve been raised since birth to be the best cadet in your cohort, a true patriot ready to fight for humanity from a small outpost in a neglected star system. You know better than to waste recreation time on games, and you always push the squad to do their best. You revere your older brother and the outpost leader, and won’t make the mistakes of your sister, who betrayed the station and went to live as a collaborator with the aliens who control the rest of the galaxy. You’re going to be an ace pilot and a hero.
You are assigned to be a broodmare, to pump out children. Your only value to the outpost is your uterus. Your squad has always hated you because you’re such an asshole to them. Your brother is suicidal and you never noticed because you imagined he’d be happy with his high test scores. Your sister defected because the station commander groomed and raped her, and he’s got similar plans for you. The war is long over, and most humans now live perfectly well on a new planet, in alliance with the aliens. The heroic destiny you had imagined would actually be a futile attempt at genocide. You launch a long-shot effort to fix everything, and trillions die.
Tesh uses simulations and alternate universes to give the protagonist second and third chances: if Earth had been saved, she’d be a cadet in the triumphant human fleet, happy and well-nourished. But she’d still be under the command of the same abusive man, now an admiral, preparing to take control of incomprehensible power for his own ends. If Earth had still been destroyed, but she had identified the abuse she’d experienced sooner, she could overthrow the commander and reunite the outpost denizens with the rest of humanity. There’s an almost disappointing deus ex machina right at the end that saves the protagonist from having to make good on a noble sacrifice for the good of the galaxy, but overall it’s a beautifully told tale about the importance of found family and empathy, and about abuses of power.
Elsewhere
Whale on toast: Sure, we remember that we saved the whales. But did you know what whale oil was actually used for in the late 20th century? Probably not.
Staircases rule everything around me: An explanation for why American residential architecture looks the way it does, and a way to improve it with a tweak to the building code.