We’ve finally done it. Pop the champagne — if your smart speaker hasn’t already ordered it for you, sensed your celebratory mood, and scheduled a drone delivery for sometime between now and the heat death of the universe. We, the most cognitively complex species to ever walk this particular rock hurtling through space, have reached the absolute summit of our evolutionary potential. And from up here, the view is magnificent: a civilization of fully grown adults who cannot figure out if they need milk without consulting an appliance.
I watched it happen with my own eyes. A man — late thirties, presumably holding down a job, presumably capable of operating a motor vehicle — standing in front of his refrigerator in a state of low-grade existential panic because the app wasn’t loading. The fridge, for the record, was right there. The door, equipped with a handle specifically designed for human hands, was right there. The milk, or the conspicuous absence of it, was a literal arm’s reach away. And yet. And yet, he stood there refreshing his phone like the answer to life’s great mysteries lived somewhere in his notification settings. I didn’t say anything. What would I say? “Sir, I believe the technology you’re looking for is called opening it?” You can’t tell people things like that anymore. It comes across as aggressive.
This is where we are. This is the destination we chose.
Now, I want to be fair — I genuinely do, even if fairness is not exactly my natural resting state. Convenience itself is not the villain here. The dishwasher is convenient. Penicillin is convenient. Not dying of a tooth infection at thirty-two because you couldn’t get to a barber-surgeon is, by any reasonable metric, a net positive for human civilization. Convenience has pulled billions of people out of unnecessary suffering, freed up hours of human life that were previously spent doing things like churning butter and hand-washing shirts, and given us the cognitive bandwidth to pursue higher endeavors.
The question — the uncomfortable one that nobody really wants to sit with over their algorithmically curated morning playlist — is what exactly we’ve done with all that freed-up bandwidth. Because the promise was always implicit, wasn’t it? We automate the tedious, and in return, humanity soars. We shed the mechanical and ascend to the creative, the philosophical, the deeply human. That was the deal. Technology handles the grunt work; we handle the meaning-making.
Magnificent plan. Shame about the execution, though.
Because what we actually did — and I say “we” generously, as a species, with full awareness that I am also holding a smartphone and a gym membership I use primarily to justify the protein bars — what we actually did was take all that glorious freed-up time and capacity and promptly stuff it with more convenience. We didn’t think deeper. We scrolled further. We didn’t create more. We consumed more. We built apps to track whether we’re drinking enough water because somewhere along the way we outsourced the sensation of thirst.
Let me tell you about the attention economy, since apparently we’re living inside it and it’s considered polite to name the room you’re trapped in. Every notification, every algorithmically timed buzz, every little red badge screaming at you from the corner of an icon is not an accident. It is not a coincidence that Instagram shows you a photo that makes you feel vaguely inadequate and then immediately pivots to something that makes you laugh. That sequence is engineered. The people designing these systems know more about your neurological reward cycles than your own GP does, and they are using that knowledge with the precision of a surgeon and the ethics of a casino.
The slot machine analogy gets thrown around a lot, and I understand if you’re tired of it, but I’d ask you to actually sit with it for a moment rather than dismissing it because you’ve technically heard it before. The variable reward schedule — the mechanism that makes slot machines so brutally effective — works because unpredictability is neurologically irresistible. Your brain doesn’t light up when it gets a guaranteed reward. Your brain lights up in anticipation of a possible reward. So when you pick up your phone and swipe down to refresh your email, you’re not checking your email. You’re pulling the lever. You’re hoping this time there’s something good. A compliment. A sale. A message from someone who thought of you. And sometimes there is! And that occasional, unpredictable hit is precisely what keeps you coming back with the quiet desperation of someone who has convinced themselves they’re just killing five minutes.
This is not weakness. I want to be very clear about that. This is not a character flaw. You are not pathetic for being susceptible to systems specifically engineered by rooms full of behavioral psychologists and data scientists to be maximally irresistible. That’s not weakness — that’s being human in a world that has learned to weaponize humanity. The weakness, if we’re being precise, is in pretending this isn’t happening, or in reading an article like this one and nodding vigorously and then putting your phone down for approximately eleven minutes before the cycle restarts.
Here’s what I find genuinely, darkly funny about the whole situation: we’ve built a world so optimized for frictionlessness that friction itself has become pathological. We’re now at a point where any resistance, any delay, any moment of not-immediately-getting-the-thing causes a level of distress that would have baffled anyone living thirty years ago.
The two-second loading screen is now a legitimate source of suffering. Not metaphorical suffering. Actual, measurable cortisol spikes. Research bears this out — people’s heart rates increase when a page doesn’t load within a couple of seconds. Our nervous systems are calibrating to a standard of immediate gratification that is, by any historical measure, absolutely insane. We waited years for things. We waited months for letters. We waited in lines that didn’t have apps to tell us how long the wait would be, and we survived, and we even occasionally made friends with the person in front of us, though I understand that sounds as plausible as surviving on roots and berries.
But that calibration has consequences. If your baseline for acceptable response time is two seconds, then three seconds feels like neglect. If your baseline for accessible entertainment is infinite content on demand, then a moment of boredom feels like deprivation. If your baseline for navigating a city is a voice in your ear telling you precisely when to turn, then the idea of reading a map — or, God forbid, asking someone — feels genuinely overwhelming. We haven’t just raised our standards. We’ve fundamentally recalibrated what discomfort means. And in doing so, we’ve made ourselves exquisitely fragile in ways that are very difficult to discuss at dinner parties because everyone at the table is doing it.
I should probably mention the power outage, since that’s really where the comedy reaches its peak. Nothing reveals the infrastructure of our dependency quite like the grid going down for a few hours. I’ve seen it. Most of us have seen it at this point. The first ten minutes: novelty, a little exciting, almost romantic. Candles! We’re pioneers! The second ten minutes: mild inconvenience, a certain restlessness, picking up the phone reflexively and remembering. The third ten minutes: the faint but genuine anxiety about what is happening, whether the food in the fridge is okay, how long this will last, whether this is somehow connected to something larger. By the hour mark, you’ve got people who manage entire departments, who have raised children, who hold advanced degrees, genuinely unsure what to do with themselves in a room without a screen.
The irony that cracks me up every single time is this: we have more access to information than any humans who have ever lived. We carry in our pockets a portal to the entire accumulated knowledge of our species. And we’ve used it to become less capable of basic self-sufficiency than at almost any point in recorded history. We know more and can do less. We’ve traded competence for access and called it progress. We’ve outsourced navigation, memory, scheduling, social connection, emotional regulation, and yes, the detection of dairy products, and we’ve done it so gradually and so comfortably that it didn’t feel like loss. It felt like upgrade.
Now, here’s where I’m supposed to pivot, soften a little, acknowledge complexity, introduce nuance, and wrap things up in something approaching a constructive takeaway. And look — I’m not opposed to nuance. Nuance and I are on perfectly good terms. But I want to sit in the discomfort a moment longer, because I think we rush to the solution too quickly. I think we use the existence of a solution to avoid fully absorbing the problem. “Yes, yes, we’re all addicted to our phones, but here’s a mindfulness app that will help!” is not a satisfying response to what is actually a fairly significant civilizational question about what we’re doing to ourselves.
The question I keep circling back to — the one that keeps me up at night, or would if I weren’t using a sleep-tracking app to optimize my circadian rhythm — is about capacity. Specifically, what capacities are we quietly allowing to atrophy, and what does it mean for us, individually and collectively, when they’re gone? Because skills and capacities are not permanent. They are maintained through use, and they diminish through disuse, and they disappear through sustained neglect. This is not controversial. We know this about physical fitness. We know this about language. It is equally, uncomplicatedly true about cognitive and emotional capacities.
Boredom tolerance, for instance. The ability to be unstimulated for a period of time without it feeling like a crisis. This capacity, which once seemed so unremarkable as to be beneath discussion, turns out to be foundational for a fairly long list of important human activities: creative thought, self-reflection, genuine rest, the kind of deep reading where you follow a complex argument across many pages without your attention fragmenting, the ability to be fully present with another person without your brain quietly screaming for its next input. All of these require the ability to tolerate a moment that is not immediately gratifying. And we are systematically destroying that ability in ourselves, in real time, with great enthusiasm and increasingly sophisticated tools.
The children thing is where I try not to get too apocalyptic, because it’s easy to sound like every generation that has ever existed panicking about what the next generation is being corrupted by, and I’m aware of that tradition. Every generation has had its moral panic about the youth. Television was going to rot their brains. Rock and roll was going to corrupt their souls. Before that, novels — actual novels — were considered dangerous to the impressionable minds of young people, particularly young women, who might get ideas.
So I hold that context. I hold it carefully. And then I put it down, because I think this one might actually be different in a way that matters, and I’ll tell you why: the previous technologies that supposedly corrupted the youth were largely passive. Television asks nothing of you except attention. A novel asks quite a lot of you — imagination, sustained focus, empathy — but it doesn’t interrupt you, and it doesn’t optimize itself based on your behavior to keep you engaged, and it doesn’t notify you of things while you’re trying to read it. The smartphone, the social media platform, the algorithmically curated feed — these are not passive. They are responsive. They learn you. They adapt. And they are specifically designed for maximum engagement, which in practice means maximum retention of your attention, which in practice means exploiting every available neurological vulnerability.
Giving a device like that to a thirteen-year-old is not the same as letting them watch too much television. And the data coming out on adolescent mental health over the past decade, while complicated and still being interpreted, is not something I can look at and calmly conclude we’re dealing with the same old moral panic. The trends are too consistent. The timing correlates too precisely. The mechanisms are too clearly understood. Something is happening, and we’re still in the phase where we’re arguing about whether to look at it directly.
What I want — and this is the part where I briefly abandon the ironic detachment, so hold tight and we’ll be back to it shortly — what I actually want is for us to be honest about the trade-offs. Not to smash our phones, not to retreat to some pastoral fantasy of pre-digital life that was, let’s be honest, full of its own miseries and limitations. Not to perform a theatrical rejection of modernity that is really just a more sophisticated form of self-congratulation. I want the honest accounting. I want us to actually reckon with what we’re gaining and what we’re losing and whether the exchange rate makes sense.
Because we’re not having that conversation. We’re having the conversation where one side says technology is miraculous and anyone who questions it is a Luddite dinosaur clinging to nostalgia. And the other side says screens are poison and we’re all doomed, which is equally unhelpful and usually delivered via a podcast you can listen to on your phone. What we are not doing is the boring, careful, unglamorous work of figuring out which specific technologies, used in which specific ways, by which specific populations, at which specific developmental stages, produce which specific outcomes. Because that conversation takes longer than a hot take, and it doesn’t perform well on social media, and it requires holding contradictions simultaneously, and we’ve established that sustained attention is increasingly not our strong suit.
The man with the smart fridge is still in my head. I know I’m being unkind. He’s not a symbol of collapse. He’s a person who had a slightly frustrating morning. The appliance he invested money in didn’t function as advertised, and he was momentarily thrown. That’s not a moral failure. That’s Tuesday.
But here’s the thing. The version of that story that makes me pause is not the individual moment. It’s the trajectory. It’s the fact that we design and enthusiastically purchase a device to tell us whether we need milk, and when that device fails we experience genuine distress, and we do not then think, “Huh, perhaps this is a dependency I should examine.” We think: “They should make the app more reliable.” We respond to evidence of over-reliance by requesting more reliable things to rely on. We solve convenience failures with more convenience. It’s elegant, if you think about it. A closed loop. A system with no exit.
The fridge doesn’t know you need milk. It knows when you last scanned a barcode, and it knows the date, and it makes a guess. You, standing in front of it, using the sensory apparatus evolution spent millions of years developing, can open the door and know. And the fact that one of those options requires a monthly subscription and the other requires bending slightly at the waist but we are genuinely unclear on which one is better — I mean, that’s the bit, isn’t it? That’s the whole bit.
I’ll leave you with this. Somewhere, right now, there’s an algorithm deciding what you’ll read after this. It’s been watching you — what you pause on, what you skip, how long you spend, where your attention drifts. And it’s about to serve you something calibrated precisely to your patterns, your preferences, your particular neurological fingerprint. And it will probably be good. It will probably feel exactly right. And you’ll consume it, and the algorithm will learn a little more, and the next thing will be even more precisely you, and this loop will continue until the content you’re consuming is so perfectly tailored to your existing preferences that it is, in a very real sense, a mirror. A very comfortable, very engaging mirror that never shows you anything you didn’t already think, feel, or believe.
The room is convenient. The door is right there.
Whether we remember how to open it is, I suppose, the question.
Danny Ballan
Editor-in-Chief
English Plus Magazine