In an era of split attention, there is more than one type of ADHD

by Paul Kudlow, Karline Treurnicht Naylor and Elia Abi-Jaoude, psychiatrists

ADHD is typically thought to be wired into the brain early. But many cases may be better seen as products of digital life

Attention has become a currency – traded, spent and depleted in an always-on world of pings, swipes and scrolls. In the past two decades, as the digital tide has risen, diagnoses of attention-deficit/hyperactivity disorder (ADHD) haven’t merely inched up: they have more than doubled in adults and climbed by two-thirds in children, according to large US cohort studies. The coincidence is striking and worth unpacking.

ADHD has long been seen as a neurodevelopmental condition, typically showing up in childhood with hallmarks like distractibility, impulsivity and restlessness. For years, scientists have known that this condition runs in families; genetics play a big role. Stimulant medications like Ritalin (methylphenidate) and Adderall (mixed amphetamine salts) – both of which boost the brain’s dopamine, a chemical messenger tied to focus and reward – are mainstays of treatment. They are often very helpful for those who need them.

This is the ‘classical’ form of ADHD, one that is thought to be wired into the brain from early on. People with this condition might struggle to sit still or to finish a task, but with the right support (structure, medication) they often thrive. Their self-control and focus commonly improve as they mature.

Recently, though, we have been making a case that a broader conceptualisation of ADHD is warranted. In our work as psychiatrists, we began seeing a subset of patients whose ADHD-related symptoms were barely nudged by the usual stimulant-plus-structure playbook. This pattern signalled an attention dysregulation that requires different remedies from the classical approach – and, in turn, a shift in how we understand the disorder.

Rather than being a single, uniform condition, we believe ADHD is best understood through a spectrum model. This spectrum is anchored at one end by a form rooted in biology, and at the other end by patterns that are shaped by modern digital life. We call these poles Type I and Type II ADHD. Most people fall somewhere between these poles, with both biology and environment shaping their attention to some degree. But we can use the two labels to distinguish cases based on what seems to be the predominant factor. Type I ADHD, the classical form, reflects neurodevelopmental traits. But for someone who is more Type II, the dysregulation may emerge later, driven largely by overstimulation in a screen-saturated world.

This framework offers a new way to understand why attention struggles are rising – and how care might evolve in response.

Picture someone who got through childhood without attention issues, only to develop them later, in their teens or 20s. They’re impulsive, scattered, easily pulled off track, but this isn’t how they’ve always been. What’s changed? Often, it’s their digital environment. Hours spent flipping between apps, scrolling through short reels, chasing social media likes or juggling screens might have disrupted their brain’s ability to regulate attention, training it to prioritise rapid stimulation. Over time, this could have weakened their capacity for slower tasks, like reading a book, following a talk or even watching a slow-paced film.

It’s akin to the difference between Type I and Type II diabetes

Epidemiology echoes what we observe in practice: in a cohort of almost 5.3 million adults in the US, ADHD diagnoses rose from about 0.4 per cent to about 1 per cent between 2007 and 2016, a greater increase than the one seen in school-age children. We propose that many of those late-emerging cases reflect Type II ADHD: a cumulative effect of heavy, continuous digital stimulation. Very high amounts of screen use might also trigger similar problems earlier in life – preschoolers logging more than two hours a day showed a six-fold jump in clinically significant inattention compared with those watching less than half an hour per day.

The difference between Type I and Type II ADHD is akin to the difference between Type I and Type II diabetes. Type I ADHD is theorised to reflect early dopamine and brain-wiring differences, much as Type I diabetes stems from an early insulin shortfall. In contrast, Type II ADHD seems to arise after years of digital stimulation that desensitises reward circuits. Both satisfy ADHD’s behavioural criteria – inattention, impulsivity, difficulty with self-regulation – yet travel different biological roads and require different treatments.

For Type II, the best therapeutic target may be the environment: screen time might need to drop well below what most users consider ‘moderate’, and be limited to deliberate windows and offset by offline, focus-building tasks. The logic mirrors the care of Type II diabetes in removing chronic excess and letting the system regain sensitivity.

A long-emerging body of research supports our thinking about a Type II ADHD. Decades ago, studies found a relationship between early television exposure and later attention problems. One tracked more than 1,200 toddlers, finding that more television viewing at ages one and three was associated with greater attention issues by age seven. Another study followed around 1,300 school-age children, finding a link between higher screen media use and worsening focus. These early findings suggested that increased screen use could instil restless habits.

Today’s digital tools go further, offering constant novelty and instant rewards. A two-year study of around 2,600 teens found that heavy digital media use preceded new symptoms of inattention and impulsivity, even in those without any prior signs. A five-year study of around 3,800 adolescents found that when an individual’s social media use went up, their ADHD symptoms tended to be higher in that same year and the next year, with impulsivity explaining much of the link. A smaller follow-up trial in children showed that cutting leisure screen time led to measurable improvements in behaviour within just two weeks.

Social media overuse has been associated with smaller reward and attention areas in the brain

A caveat is needed here: most of these studies are observational, so they cannot prove that screen exposure causes the brain or behaviour changes they document. It is possible that individuals who are already more impulsive or novelty-seeking gravitate toward heavier media use. Even so, the longitudinal evidence – where heavier use comes before later problems – together with early intervention studies strengthens the case for causality. With these limits in mind, researchers have turned to the brain itself, and the neural picture echoes the behavioural one.

Heavy multitasking illustrates the point. People juggling multiple screens – streaming, texting, gaming – often struggle to filter distractions or shift between tasks smoothly. Brain scans of 149 young people showed that those who engaged in heavy screen-switching in their daily life needed higher activity in their prefrontal cortex, a focus centre, during a focused attention task – and they still performed worse on the task. Another scan of 75 adults found that chronic media multitaskers had less grey matter in a brain area related to self-control.

The brain’s reward system provides further insight. Social media’s stream of notifications and likes delivers dopamine boosts, which may disrupt the balance in susceptible brains. A study of 22 adult social media users found that the more they scrolled, the less dopamine-making capacity they had in the putamen, a reward spot in the brain that’s been linked to ADHD. A review of other brain studies found that social media overuse was associated with smaller reward and attention areas in the brain, and with activity differences in the brain’s focus networks – a pattern that resembles what we see in ADHD. Finally, a review of studies found a relationship between frequent screen time and lower cognitive control in adolescents. This evidence is consistent with the idea that screens overfeed the brain’s reward loop, leaving one eager for more and dissatisfied with slowing down.

We’ve described two sides of a theoretical ADHD spectrum, but how do they show up in practice? In our own work, we’ve seen Type I and Type II manifest quite differently.

Take a child we’ll call Sam (a composite of several patients, with details altered for privacy). Sam is a classic Type I case. From early childhood, he struggled to stay still – darting around, interrupting, drifting mid-sentence. His parents and teachers noticed early, and a genetic predisposition was likely; his father had ADHD too. After Sam received structure, support and medication, his focus improved significantly: he completed homework, joined a soccer team, and took pride in his progress.

Sam may benefit from pharmaceutical support, but Alex may need a digital reset

Now consider Alex, a woman in her 20s who excelled in school but struggled at university. Deadlines slipped, lectures felt overwhelming, and her phone was a constant companion – she scrolled TikTok late into the night. She’d never faced such challenges before. Stimulants seemed to help Alex at first, but the effect faded, and off them she felt worse than she had before starting. Her screen habits stood out: hours daily, often while multitasking. For her, reducing digital overload – replacing excessive scrolling with walks or reading – and improving sleep seemed, over time, more effective than medication. A colleague saw similar outcomes in another patient, likely Type II: a month-long reduction in non-essential screen use restored her focus without medication. These aren’t isolated anecdotes. They reflect patterns that many clinicians are now observing.

Crucially, patients like Alex still meet nearly all the official criteria for diagnosing ADHD; the Type II label isn’t about widening the net so much as explaining why the symptoms surface late and respond best to environmental change. Framing it as ADHD is a way to keep their struggles in recognised clinical language and preserve access to care. And when they do receive care, one size no longer fits all: Sam may benefit from pharmaceutical support, but Alex may need a digital reset.

As culture evolves, so must the treatment of mental illness – yet most diagnostic frameworks still ignore digital behaviour. A recent paper argues that clinicians should dig into this, not just tallying a patient’s hours online but exploring the emotional impact of those hours. For people who have difficulties with attention, it makes sense to ask: what’s the emotional toll of time spent on screens? Does scrolling leave you wired or drained? Do you panic without your phone? We’ve started asking patients these things, and the answers are revealing. One teen told us that Instagram helped him stay connected to others, but also fuelled his anxiety. An adult said that gaming had been his escape until it started swallowing his sleep. These stories shape how we see their struggles – and how we help.

If we’re right about there being a Type II ADHD, it could spark a shift – less reliance on medications for many, more focus on taming the digital beast. Perhaps schools could dial back digital noise, workplaces could rethink endless pings, and families could foster calmer environments. Schools could follow the lead of French classrooms that ban smartphones outright, creating ‘attention sanctuaries’ where sustained focus can be learned and practised. Just as a cafeteria might post sugar content, curricula could disclose screen minutes and build in offline study blocks to protect and promote cognitive stamina.

We’re not saying screens are the enemy. They connect us, inform us, entertain us. But like anything, too much can tip the scales. Try this: mute non-essential notifications for a day, or swap an hour of scrolling for a walk. Let your brain idle for once: like fallow ground, it gathers strength for what comes next. You might find that small moves like these steady your focus (or your child’s focus) over time.

Future research will need to follow thousands of people, measuring digital habits, brain changes and treatment responses over time. If the two-type model proves out, it could encourage diagnostic manuals to distinguish attention problems rooted in biology from those largely fuelled by chronic screen overload. That clarity would help clinicians match treatment to cause and give schools and tech platforms a firmer mandate to protect attention. Until those data arrive, each of us can run a small experiment: dial back the pings, watch what happens, and share the results.
