“Brain Rot” is Oxford University Press’s 2024 Word of the Year. Not a word, as such, but a phrase describing a ubiquitous condition (at least the Oxford people think so) afflicting homo interneticus. Specifically, the impact on cognitive function of consuming too much utter garbage online (you know, the kind served up by the recently Trump-resurrected TikTok). You may laugh, but this rot makes its victims as mushy and pliable as Halloween pumpkins left too long on the stoop. Easy marks for manipulation, whether by China’s Ministry of State Security or the people trying to sell you a new handbag you don’t need. But the kind of rot that worries me most is the decay—indeed, the wasting away—of memory induced by the ease of obtaining information online. Any dictator knows that one of the keys to fostering a servile, clueless populace is to strip its capacity to remember how it got to such a sorry state. If the content of our brains consists of what we have been fed (there’s a reason social media sites call it “your feed”), eventually we lose the ability to make memories and we are all like the character in Christopher Nolan’s Memento. We have, in fact, forgotten how to remember.
“I’m blanking on the name. Dammit. It’ll come to me.”
How frequently over the past year or so have you found yourself saying this? Even if you’re well under sixty and senility isn’t encroaching like crabgrass on the lawn of your mind. Even if you’re not a blackout drinker or a substance abuser. Even if you take regular doses of Ginkgo Biloba. Sure, we’ve been “blanking on things” forever. Our brains are not perfect machines, and our random access memory is not always randomly accessible. Factors like stress, fatigue, and anxiety freeze up synaptic response. And memory takes practice, as Giordano Bruno knew. What’s happening now, even as you read this essay, is different. I think it may be accurate to call it a disease, and over the past twenty years or so, it has become a pandemic. Call me a conspiracy theorist if you like, but I fear that this pandemic has been unleashed by a kind of “lab leak.” If we don’t use certain muscles, as might be the case with a bedridden invalid, they atrophy. But that is a passive process. No evil genius has strapped you to the bed. In the case of our collective memory loss, it’s beginning to feel as if someone wants us to forget who we are.
I place the origin point of this epidemic at 1600 Amphitheatre Parkway, Mountain View, California, corporate headquarters of Alphabet, parent company of Google. The disease has at least one quasi-clinical name: digital dementia, a prominent cause of which is known, naturally, as “the Google effect.” (And they promised not to be evil.) Get ready: advanced AI in the hands of companies like Alphabet, Meta, and OpenAI is going to make things much, much worse. Already, no student has to actually write a term paper. Elon Musk’s Neuralink technology, especially if paired with something like Apple Vision, could ensure that we never need to learn a language, master a musical instrument, or make a shopping list. And what will barons of Big Tech and their new allies in government do with us once we have forgotten everything? Ah, yes. I almost forgot. These new manufactured memories can be monetized.
We don’t use maps anymore, so no one knows where they are or how they’re going to get where they’re going. We are, quite literally, disoriented. When the Google Maps girl says “head east,” we say, “what?” We follow the arrow, sometimes right into the back of a truck. We don’t use libraries, so the set of skills once known as “doing research” has atrophied. To use a library, you had to determine which area of the Dewey Decimal System, and thus, which section of the library, was most likely to hold the information you were seeking, and which authors were authoritative. And as you gathered the books in front of you in the library carrel and made notes, grooves were carved into your gray matter. This does not happen when we Google, or ChatGPT, or Perplexity our research subject. What happens is something called transactive memory, wherein we assign our memory function to an outside storage facility, the way we now assign our music files (and everything else) to the Cloud. It is proxy memory, surrogate recall, and its rapidly spreading replacement of original memory was the stunning discovery of a study you may have read about (if you remember), done in 2011 by Drs. Betsy Sparrow of Columbia U. and Daniel M. Wegner of Harvard. The study found that when we obtain information with a keystroke—without the effort of recall—we retain it about as well as a sieve retains water. If we park it someplace, however—say, in a color-coded desktop folder or on an external drive or as a bookmark in our internet browser—we can probably find our way back to it.
In other words, we don’t remember what we learned. We only remember where we learned it.
You may be tempted to call this efficiency. A way to “work smarter, not harder” and conserve disk space in our brains for more important things. But is it, in any sense, really smarter? Think of what this means.
I don’t know my way home, but I know the Google Maps icon on my smartphone.
I don’t recall your name, but I know you’re in my contacts under “Recently Met.”
I don’t remember where I parked my car, but my phone remembers.
I’ve forgotten how to say “Good Day” in Spanish, but Google Translate knows.
I don’t remember if I took my meds this morning.
I don’t remember my daughter’s birthday (but it’s in my calendar and I’ll get a notification).
No big deal, right? Quotidian stuff. The last two may cause some discomfort, but it’s probably manageable. Why use up valuable mental real estate with such trivia? But as the virus spreads and the muscles atrophy, the problems escalate. Things can get more personal and, perhaps, more dangerous:
I don’t remember why I came here.
I don’t remember my address.
I don’t remember if it’s a work day or a weekend.
I don’t remember who is running the government.
I don’t remember if my mother is still alive.
This last set of lapses, of course, most of us would describe as “functional” dementia. At this stage, some might say we’re ready for the home. But even there, with proper care, a regular routine, and reliable digital notifications on our computer desktop and phone, we couldn’t get into too much trouble, could we? In some manner, all of human society may soon become a kind of retirement home, what with Universal Basic Income (a favorite of Silicon Valley), self-driving cars, robotic maids and lovers (Ah, what’s the difference really?), digital doctors, and the equivalent of Bingo night with slot games on your phone. And even when everything is done for us, when we are attended day and night by a virtual nursing staff of supercharged AI Siris and Alexas, we can still buy things, which is about all we’ll be good for. But what really worries me, what occasionally induces the deepest sort of existential dread, is the possibility that this creeping enfeeblement of the prefrontal cortex might someday lead us to wake up screaming I don’t remember who I am.
I still use the Dewey, thanks to a great librarian, Fitzy. I think that was her name. 😎