I entered graduate school thinking I’d live a “life of the mind.” I just didn’t think it’d be my mind specifically — less the Platonic ideal of intellectual pursuit, and more unruly, unpredictable chaos.
On good days, I ricocheted from one problem to the next with an instinctual creativity, buoyed onwards by curiosity and excitement. On bad days, I was constantly scrambling, unprepared for the simplest of obstacles. Each mistake threatened to undo all my progress. And failure, of course, was inevitable: eventually I’d have more bad days than good ones, and I’d crash. Sometimes I’d spend weeks mired in shame before I could throw myself into the frenzied rush again, but I told myself I could do it. I always had.
This cycle propelled me through to my third year of graduate school in 2020, but the stress and uncertainty of the COVID-19 pandemic destabilized my volatile equilibrium. The crashes got longer. The highs got more intense. I seized every moment of what I called “good brain” time, sporadic periods of clarity and energy that I frantically leveraged to catch up with my responsibilities, no matter when they arrived or how long they lasted. As my body lurched from one deadline to another, I wondered if this was the last time I’d make it work. What if my brain wasn’t “good” ever again?
My notion of a “good brain” was telling. Good brains do what they’re told: they’re productive when needed, they navigate social situations effortlessly, and they don’t have unusual cognitive or sensory needs. I’d spent most of my life wrangling my brain into a “good” one, treating it like an adversary to be beaten. I had conflated “good” with neurotypical, the perceived “normal” or “standard” way of cognitive functioning. In doing so, I pathologized myself, assuming that there was a “right” or “normal” way to function, and my divergence from the norm was therefore wrong. Under the harsh scrutiny of graduate school, I was constantly surveilling and correcting myself, following neurotypical expectations as closely as possible.
Gradually, I began to understand myself as neurodivergent, a broad term encompassing neurological differences such as autism, ADHD, dyslexia and dyscalculia, among many other variations in neurological function. For most of my life, I’d been “masking,” or suppressing neurodivergent behaviors and mimicking neurotypical ones in order to fit into social and professional settings.
Masking is often described as social camouflage, but I think that understates what it truly entails. The prefix “neuro-” is often understood as “brain,” but it means nerve — to be neurodivergent means sensing and processing the world differently. Successful masking is to compel yourself to deny your lived reality, to force a disconnect between how you experience the world and how you react to it. The dissonance permeates every sensation, action, thought — and it exacts a heavy physical and mental toll.
And yet, neurodivergent people continue masking, because we are intimately familiar with the stigma that comes with existing as we are — I’ll never forget the way a professor ended a particularly frustrating conversation:
“Fayth, you’re clearly unwell.”
The judgment dripping from that word evoked a lifetime of instinctual fear and shame of being seen, the consequences of letting the mask slip. When fellow academics — or those in the highest echelons of academic hierarchy — reveal that they see your existence as pathology, the path of least resistance is to quietly put the mask back on, no matter the cost.
Amidst the stereotypes of savants and awkward geniuses, it might seem strange that neurodivergent academics struggle. In prestigious professional sectors, discussions of neurodivergence tend to focus on this disconnect, framing neurodivergence as untapped potential or a competitive advantage.
But I think this framing demonstrates why we struggle in the first place: our public existence is contingent on being valuable. If we’re seen as assets, what happens if we fail to provide the expected return on investment? The neurodiversity movement, which views differences in neurological function as simply part of natural human variation, holds as fundamental that neurodivergent life is inherently valuable. Neurodiversity validates our collective humanity and autonomy. Co-opting it to justify our societal utility — or more likely, the utility of a perceived worthy few — is not progressive. Learning to exploit us more efficiently is not equivalent to supporting us.
It was only at my most exhausted, after passing my qualifying exam in mid-2021, that I found unexpected respite — I simply didn’t have the energy to surveil myself. For the first time, I listened to what my mind and body were telling me. I worked from home to avoid sensory overload. I alternated between excited rambling and too-long pauses to process during conversations. I moved the way I instinctually wanted to, even if it looked strange or awkward. I became acutely aware of how much energy masking took, and in contrast, how joyful it was to exist without fighting myself every step of the way.
I never truly wanted to live a “life of the mind.” What I envisioned was living outside of my self. I wanted to exert perfect control over how I experienced and reacted to the world, while being able to ignore any of my needs. If this sounds unrealistic, or even inhuman — this is what neurotypical expectations demand from neurodivergent people. High-pressure, low-support environments like graduate school can intensify those demands, to the extent that we aspire to a disembodied existence.
I am slowly learning to be present in my own body, to trust how I experience and respond to the world around me. It’s new and nervous and uncertain — but I know that it’s the furthest thing from being unwell.
Feature image adapted from Lichtman and Sanes, 2008. Licensed under Creative Commons Attribution 3.0 Unported.