JILLIAN MICHAELS: Big Tech built a digital drug — and our kids are hooked
"Those who ignore history are condemned to repeat it."
In the 1990s, America watched tobacco executives raise their right hands before Congress and swear nicotine was not addictive. We now know they were lying through their teeth. Internal documents later proved cigarettes were chemically engineered to maximize dependency and deliberately marketed to children to create "replacement smokers" for a dying customer base.
Today, we are watching the same lie unfold in real time. Only now, the product is not Marlboro. It is the algorithm.
A new class of titans – Meta, TikTok, Snap and Google – has built digital machines designed to addict our kids. The damage is not in their lungs. It is in the wiring of their developing brains.
On Feb. 9, a landmark jury trial began in California Superior Court that could fundamentally reshape how social media is regulated. In his opening statement, attorney Mark Lanier put it plainly:
"These companies built machines designed to addict the brains of children, and they did it on purpose."
The plaintiff, known as K.G.M., is suing Meta (Instagram) and YouTube, alleging severe mental health harm caused by social media addiction. Snap and TikTok were originally defendants but settled last month, avoiding a public trial that would have forced executives to testify and exposed internal documents.
The case is groundbreaking because it bypasses Big Tech’s usual shields, including Section 230 and First Amendment defenses, by arguing the harm comes not from user content but from defective product design: algorithms, notifications and behavioral hooks engineered to maximize "time on device."
Just as Big Tobacco added ammonia to cigarettes to spike nicotine absorption, Big Tech engineered dopamine loops to override impulse control.
This is not speculation. It is documented.
Platforms deliberately deploy intermittent variable rewards, the same psychological mechanism that makes slot machines addictive. When a child refreshes their feed, they do not know what they will get. That uncertainty triggers dopamine, training the brain to keep pulling the lever.
Infinite scroll removes stopping cues. Autoplay erases choice. Push notifications are timed to pull users back the moment attention drifts. These are not communication tools. They are behavior modification systems.
Meta’s own employees warned internally that "Instagram is a drug." They knew the platform worsened body image issues for one in three teen girls. But "time on device," the metric that drives ad revenue, won anyway. Every time.
Some of those employees quit. Then they blew the whistle. And they brought receipts. Internal research showed 32% of teen girls who already felt bad about their bodies felt worse after using Instagram. Forty percent of teen boys experienced harmful social comparison. Even more disturbing, when young users consuming eating disorder content became more depressed, they used the app more.
Depression drove engagement. Engagement drove revenue. That is the business model. And when those findings became public, unsealed communications revealed leadership debating whether they should even continue studying teen harm because the research itself was getting them in trouble.
They were not asking how to fix it. They were asking how to hide it.
Former Meta engineering director Arturo Béjar later confirmed what parents already feared. After his own daughter began receiving sexual solicitations on Instagram, Béjar conducted large-scale internal surveys. The results were staggering.
More than half of users reported harmful experiences. Nearly a quarter of teens received unwanted sexual advances. Only about 2% of reported harmful content was removed.
He warned top executives directly. Nothing improved. In fact, protections were rolled back.
Independent testing later found most of Instagram’s advertised teen safety tools were nonexistent or ineffective. Kids could still access suicide and self-harm content. Autocomplete recommended searches related to drugs and self-injury. Children under 13 accessed the platform easily despite age limits.
Parents were given a false sense of security while Meta knew harm was occurring.
Even the U.S. surgeon general sounded the alarm in 2023, warning of profound risks to youth mental health.
And the damage does not stop at anxiety and self-harm.
Social platforms have become digital drug markets for teenagers. Snapchat and TikTok function as modern open-air exchanges, where dealers advertise pills using emojis and disappearing messages. Kids think they are buying Percocet or Xanax. Instead, they receive counterfeit pills laced with fentanyl.
Emergency rooms know the outcome.
So do morgues.
More than 40 states are now suing tech giants as the fallout piles up in hospitals, schools and homes across America.
This is exactly how Big Tobacco collapsed. Denial. Internal documents. Whistleblowers. Lawsuits. Then a national reckoning.
We are there now.
While Congress dithers and courts grind forward, parents are the last line of defense standing between their children and products designed to outsmart them. Here are four steps that experts suggest:
Big Tech made a bet that it could addict our kids faster than the law could stop it. For a long time, that bet paid off. But the smoke is clearing. Now we can finally see the fire.
It is time to stop pretending these platforms are neutral public squares. They are dangerous products and they must be treated that way, with warning labels, transparency requirements, age appropriate design and real accountability when design choices foreseeably harm children.
Big Tobacco had its reckoning. Big Tech is next.
