Addictive by Design: Bar Talk with Eric Bartosz
In 2024, I wrote about a newly released Harris Poll showing that 80 percent of Americans had taken steps to reduce the amount of time they spent on social media platforms. Even more striking, 50 percent said they wished the platforms had never been invented. We don’t have to be consumer psychology experts to know we have moved beyond mild dissatisfaction to full-blown cultural regret.
Teens, who now average more than five hours per day on social platforms, are the most heavily affected and also the most neurologically vulnerable. While most of us already sense that Meta does not have our best interests in mind, recently released internal research documents, compiled and analyzed by NYU Stern researchers, offer a rare look behind the curtain. They confirm the suspicion that Meta knows it makes a dangerous product and does its best to make that product as difficult as possible to quit (Meta’s Internal Research). Some readers may immediately see similarities to the Big Tobacco business model, in which revenues were directly tied to an addictive product and leadership was well aware of the dangers even while publicly denying them.
What the research documents reveal is not accidental overuse; it is a deliberate design strategy to maximize the amount of time users, especially teens, spend on the platforms each day. Perhaps most illuminating is that there was, and is, full awareness of the dangers and risks of the addictive design. The tobacco parallel only sharpens here: in the era when the industry claimed cigarettes were not addictive, internal documents later revealed it was busy engineering additives to make them as addictive as possible.
This topic resurfaced nationally in February when Mark Zuckerberg appeared in court facing accusations that social media platforms were intentionally engineered as “digital casinos” to hook young users. That analogy was not rhetorical exaggeration; it was grounded in Meta’s own research, literally conducted in casinos.

One of the most telling findings centers on the use of “intermittent rewards.” Behavioral scientists have long understood that unpredictable rewards are the most powerful reinforcers of behavior. Slot machines work not because they pay out consistently, but because they occasionally pay out unexpectedly. That uncertainty keeps gamblers in their seats and pulling the lever, and is a ‘sticky’ model Meta seeks to replicate on their social platforms.
Meta researchers explicitly referenced this dynamic. Their internal discussions compared platform mechanics to slot machines, noting that intermittent reinforcement makes it especially difficult to stop a behavior, even when the reward is minimal. Think about core design elements: the swipe-to-refresh motion, the notification badge and unpredictable elements like algorithmically timed comments. Each one is designed to act as a digital lever, keeping us in front of the social media slot machine for as long as possible.
The fact that we continue to spend more hours on social platforms each week is not a side effect of the company’s growth; addictive usage is the growth strategy.
The Teen Ecosystem Review, identified as Study 7.1 from 2020, documented that teens are “insatiable when it comes to feel-good dopamine effects.” Researchers noted that, due to brain development, adolescents have much more difficulty disengaging once stimulated. The prefrontal cortex, responsible for impulse control and long-term decision-making, is still developing. The dopamine system, however, is fully operational.
In plain English, Meta researchers know the gas pedal works much better than the brakes, and they design the platform to capitalize on that.
Even more concerning, the same internal research found that teens were unhappy with the amount of time they spent on the platform. In other words, Meta’s own data suggested that heavy teen users were not necessarily satisfied users. They were conflicted users, which is an important distinction.
In another internal conversation referenced in the research archive, a Meta employee used the phrase “Reward Deficit Disorder” to describe what happens when users binge on Instagram to the point that ordinary life feels comparatively dull. The comparison drawn was to drug tolerance. The more you consume, the less you feel, which drives more consumption, and more stress and anxiety about compulsively doing something you don’t want to do.
The employee noted discomfort about discussing dopamine effects and suggested that leadership did not want the addiction framing elevated. Whether that was a strategic decision, a branding concern or something else entirely, the demonstrated internal awareness is difficult to ignore.
None of this suggests that social platforms are inherently evil; some research suggests that moderate use, around 30 minutes a day, can even deliver mental health benefits. They connect families across continents, allow small businesses to reach customers, give voice to creators and communities, and alleviate loneliness for many. But design and business models matter, and that’s where Meta is falling woefully short, at the expense of millions of people each day.
When revenue depends on time spent, the system optimizes for time spent.
That optimization is powered by algorithms trained to study us more carefully than we study ourselves. They monitor pause time, scroll velocity, click patterns, dwell time and social engagement. They do not simply deliver content; they shape behavior and moods, and as consumers of this product, we are generally ignorant of how much design has gone into making us addicted to what is served up to us.
Teen mental health trends over the past decade have raised red flags across medical and educational communities. Anxiety, depression and loneliness have all increased. While correlation does not automatically mean causation, the internal research shows that the companies themselves understood the psychological levers being pulled.
As parents, educators and leaders, we need to recognize that we are expecting developing brains to self-regulate against systems professionally engineered to override self-regulation.
That is, of course, an unrealistic expectation, especially considering that many adults are hooked on the same products and modeling the very screen reliance we dread seeing our kids exhibit.
For those of you interested in pumping the brakes on social media, here are some practical suggestions.
First, awareness matters. Understanding that these systems are engineered to capture attention reframes the conversation: the challenge is not weak willpower but a tech system designed to hijack our goal of living our best lives.
Second, add in some behavioral friction. Removing push notifications, charging phones outside bedrooms and scheduling device-free time in your day are helpful. These are small environmental shifts that reduce the frequency of intermittent rewards.
Third, conversation is critical. As most parents know, teens do not respond well to prohibition or the ‘because I said so’ approach; they respond better to genuine communication. The timing could not be better: with the new Meta internal documents public and Mark Zuckerberg appearing in court, there is a natural opening to discuss the science behind dopamine, reward cycles and digital design. Basically, Meta thinks teenagers are puppets it can easily control, and you can prove them wrong.
If half of Americans wish these platforms had never been invented, perhaps they really wish they had been designed differently. Casinos are highly regulated environments precisely because they are engineered to override impulse control. We would never place slot machines in middle school hallways, yet we have placed algorithmic slot machines in adolescents’ pockets.
Addictive by design is the reality of the social media products being pushed on us every day, and Meta is not coming to help us. The good news is we can do that on our own anytime, starting today.
