At age six, Sarah Hill was handed her first iPad by her parents, which she used to play games like Angry Birds and Minecraft whenever she was bored. By age 21, the Alabama native had fallen so deep into virtual reality experiences and playing video games that she’d stopped seeing friends, showering, and brushing her teeth. “If you compare video game and tech addiction to drugs,” she says, “VR is the meth of drugs.”
But concerned parents—along with researchers, health organizations, and even some former tech industry leaders—are sounding the alarm, saying that the systems we rely on for modern life are designed in ways that may be fundamentally incompatible with human well-being. They cite a growing body of research in psychology and neuroscience arguing that social media use delivers dopamine jolts similar to those associated with addictive drugs like meth or heroin. And with the rapid acceleration of AI, many are calling for the U.S. government to get serious about regulation and pleading with Big Tech to provide stronger safety features that constrain the algorithms, push notifications, and endless swiping that make it so hard to put your phone down.
“Unfortunately, [tech] is taking mostly young people away from the most important thing in their lives and key to their mental health, and that is relationships with other people,” says New York University professor and podcaster Scott Galloway. For tech companies, he says, it’s all about keeping users’ attention locked in: “I don’t think [Big Tech] set out in their business plans to depress global youth. I think their algorithms discovered that rage, self-esteem, and funny cat videos just keep people online.”
There is, of course, a difference between the kind of low-level “addiction” to our phones that most of us jokingly will cop to—checking email before we’re out of bed, scrolling TikTok in the grocery line—and the rarer, all-consuming dependency that leads people to places like reSTART or into courtrooms as plaintiffs. At the same time, the line between a bad habit of using tech several hours a day and a behavioral addiction can be blurry, especially for teens and young adults whose social lives, homework, and entertainment all run through the same devices.
And that’s the whole point, argues Roger McNamee, a former tech investor and author of Zucked: Waking Up to the Facebook Catastrophe. “These companies are in the business of attention,” he says. “Once they had attention, they were in the business of controlling the choices available to people in order to influence their behavior in ways that were profitable for the platform. That culture and that business model were guaranteed to produce lots of harm.”
With its constitutional and cultural emphasis on the importance of free speech, the U.S., unlike many other countries, has largely declined to tell tech companies how they should interact with users. That has had dire effects, McNamee says: “We went from a culture where we used tech as an empowering tool to viewing tech as a tool for controlling people and extracting value. That’s the culture of the Valley, and the underlying behaviors that that causes are wrecking our democracy, wrecking public health, and wrecking our economy.”
The reluctance to place limits on how tech products engage users, especially in the age of AI, “should disturb everybody,” he says.
Some 25 miles northeast of Seattle, past towering Douglas firs, sits the gray-paneled, split-level reSTART clinic. Motivational posters and pillows with phrases like “Healing is not linear” adorn its common areas. The center can house up to 16 clients, who share rooms and are responsible for household chores. They also are required to participate in 24 to 30 hours of structured group and individual therapy each week. reSTART teaches clients multiple evidence-based coping and recovery strategies, ranging from box breathing to physical grounding exercises. The treatment isn’t cheap. As an out-of-network provider, reSTART’s rate averages about $1,000 per day, though the clinic encourages clients to check with their insurers to see what can be covered. The average length of stay is 12 to 16 weeks, and many continue on via outpatient services for weeks after.
reSTART cofounder Cosette Rae opened the center with therapist Hilarie Cash almost two decades ago. Rae had previously worked as a tech developer and, upon realizing she was overusing technology in unhealthy ways, decided to change careers and train as a social worker.
She vividly recalls a case in 2009, when she was called to assist a young adult who refused to leave their house or go to school. (Rae uses the pronoun “they” here to protect the individual’s identity.) They were not healthy, and had moved their bedroom mattress into the middle of the living room to play World of Warcraft nonstop. Doctors had diagnosed the person with agoraphobia, but Rae suspected that tech addiction was the real problem. She reached out to Cash for advice, and the two realized there was no place to treat people with these types of issues. They decided to open a center themselves.
Rae remembers being both “revered and rejected” in the early days of the center. Then, as now, many didn’t think tech addiction was real. But there was no shortage of clients: She has treated around a thousand in the nearly two decades since the center opened, and spoken to many thousands more, she says.
What her clients struggle with is more difficult than breaking free from substance abuse, Rae says, partly because there’s no getting away from tech; it’s everywhere. “When I go out in the community right now, I do not have a lot of friends that are telling me about meth or heroin,” she says. “I don’t usually go into the store and see people dealing. I don’t go to the restaurant and people are doing a line. But when it comes to technology, it’s everywhere. So you’re constantly being in front of it and having to say no.”
It’s more akin to an eating disorder, Rae says, where a person still has to eat but has a problematic relationship with food. In this day and age, clients aren’t able to drop technology from their lives completely.
It’s not just teenagers who are struggling. Rae mainly works with young and middle-aged adults (reSTART takes clients who are 15 and older), but she has seen clients in their late forties and fifties. The most common addictions Rae sees, besides video games, involve virtual reality, pornography, and more recently, AI chatbots.
One client, a 23-year-old Seattle-area college student who asks to withhold their name and gender, describes their own overuse of video games, YouTube, and communication platform Discord. The student says they wish schools today would teach kids how to use technology mindfully and warn them about addictive behaviors: “Technology is best used when it’s a tool to enhance your life. But what I got trapped in is technology being my life.”
Some scientists, such as Stanford psychiatrist and Dopamine Nation author Anna Lembke, say compulsive tech use taps into the brain’s reward circuitry in strikingly similar ways to substance addiction. When someone scrolls social media or wins a round of a video game, their brain releases dopamine, which trains them to seek that “hit” again and again. Repeated bursts of stimulation can desensitize the pathways and weaken the prefrontal cortex, which is responsible for planning and self-control, making it harder to resist urges even when the habits are causing problems or affecting school, work, or relationships.
Brain imaging studies of people with internet gaming or social media disorders have found structural and functional changes in these regions that mirror what doctors see in other behavioral addictions such as gambling.
Tech addiction is not listed as a condition in the Diagnostic and Statistical Manual of Mental Disorders (DSM), the guide published by the American Psychiatric Association for diagnosing mental health conditions. However, in its most recent edition, the DSM does list “internet gaming disorder” as a condition warranting more clinical study.
Its absence doesn’t faze Rae. “It took 40 years for gambling [disorder] to get into DSM,” she says. “So I don’t give any credence to the fact that it’s not in there yet.”
The science is far from settled, and some studies suggest that tech doesn’t cause users’ unhappiness. A 2023 University of Oxford study of 2 million people from around the globe found that links between internet adoption and psychological well-being were “small and inconsistent.”
And in March, California Institute of Technology researcher Ian Anderson and Wendy Wood, a professor at the University of Southern California, wrote a Washington Post op-ed arguing that calling habitual tech use “addiction” was misleading and harmful. In surveys, they found that when people described their Instagram use as an addiction, “They felt stuck, less confident that they had the ability to change.” Yes, they wrote, companies should “amend their platforms to help users regain control over their habits.” But they concluded, “The truth is: Heavy use is not necessarily an addiction.”
Nir Eyal, a tech investor and author of Hooked: How to Build Habit-Forming Products, says it’s not the tech that’s solely to blame for people’s addictions. “Every generation has a moral panic about whatever new technology, but you don’t fix things by stopping their use,” he says. “You fix things by making them better, by making them safer.”
Eyal argues that there is nothing unethical about making a product that some people get addicted to, and asking social media companies to make their products less sticky is not the answer. Why? “Because any product that’s good, somebody is going to get addicted to,” he says. “Stop making the product interesting? That’s dumb. That’s why we use the product. That’s called ‘entertaining and engaging.’”
The debate is only likely to grow more urgent given the rapid adoption, and daunting potency, of AI. Rae fears AI could create new ways for people to get hooked on tech, or treat AI as a “substitute attachment figure” for real relationships. “I think everybody’s been focused on all the talk around the existential threats like, ‘Can it take our jobs?’” Rae says. “But what about taking our humanity? That’s what’s happening.” As a practitioner working with tech addicts, she says, “I’m standing here looking down at a tsunami coming to people who have no idea what their kids are going to be facing. How this is going to change them; how it’s going to change their relationships with each other; and how it’s going to change their futures.”
If tech addiction is accepted as real, it raises another thorny and divisive question: What can—and should—be done about it? Some states, including New York and California, have enacted laws requiring warning labels on social media apps that highlight the risks for young people. In September, the New York attorney general proposed a rule requiring social media companies to restrict algorithmically personalized feeds and nighttime notifications for users under 18 unless parental consent is granted. And last year, California enacted legislation creating safety restrictions on the development of AI.
In December, Australia became the first country to ban social media for people under 16, and Greece and Britain are considering similar laws.
reSTART’s Rae doesn’t want to get stuck in semantics. Instead of arguing about whether their products are addictive, Big Tech companies should devote some of their profits to resources that can help those “struggling as a result of loving their product,” she says. Many people can’t afford treatment like reSTART, as most health insurers won’t cover problematic tech use—though sometimes clients can get coverage for associated disorders such as depression or anxiety.
Companies could also consider shutting off access to their technology for certain time frames, Rae suggests. Eyal recommends something similar. In addition to implementing a legal minimum age to use social media, he recommends that tech companies adopt a “use and abuse” policy. After a certain number of hours, he says, tech companies should reach out to the user with a message offering resources to prevent or cure addiction.
Sarah Hill recently transitioned out of the center to an apartment owned by reSTART, half an hour away. She still visits the center most days for treatment, but is eyeing a job at a grocery store on her off days—and even got a cell phone. It’s a basic “dumb” Gabb phone, with no apps or games. Even so, Hill recently found herself mindlessly scrolling through new screen backgrounds. “I felt myself losing control again, and it scared me,” she says, tucking the phone underneath her legs on one of reSTART’s oversize chairs.
But Hill says she does have high hopes about managing her addiction in the future and says her phone usage has improved. “After making so many mistakes, I’m finally putting a foot down and saying, ‘I want to get out of this endless cycle,’” she says. “I need to do something to better myself and my life.”
This article appears in the April/May 2026 issue of Fortune with the headline “What is tech addiction? It may well be Big Tech’s next problem.”