On February 11, Adam Mosseri—head of Instagram, the product at the center of a landmark social media addiction trial—testified under oath that social media is not "clinically addictive" and that Meta was "careful to test features used by young people." The same day, Meta launched an AI feature on Threads where users personalize their feed by writing a public post that begins with "Dear Algo." Also the same day: a $10 billion data center in Indiana, and the news that Meta sold 7 million AI glasses in 2025. Four stories. One company. Two timelines.

The Trial

Two trials opened simultaneously on February 10. In Los Angeles, a social media addiction case went to opening statements—plaintiffs arguing that companies designed features to hook young users. YouTube's defense: it's an entertainment service like Netflix, not a social network at all. In New Mexico, prosecutors in a separate Meta lawsuit argued that the company misrepresented the safety of its platforms.

The LA trial has been three and a half years in the making. The first parent lawsuits were filed in 2022. In October 2023, California and dozens of states sued Meta in federal court. In December 2023, New Mexico sued separately, alleging Instagram had "become a marketplace for predators." By January 2026, TikTok settled to avoid the trial. Snap settled too. Of the original defendants, only Meta and YouTube remained.

And on the eve of trial, Bloomberg reported that Meta had paid for over 3,500 television ads on CNN, Fox, and other networks promoting Instagram's Teen Accounts—the safety feature it built in response to the scrutiny. In the weeks before a trial about its safety record, the defendant ran a national advertising campaign about its safety record.

The Testimony

February 2026
Social media addiction trial: Adam Mosseri says social media is not "clinically addictive" and that Meta was "careful to test features used by young people"
New York Times

"Not clinically addictive." The phrase belongs to a specific genre of corporate testimony. Tobacco executives told Congress in 1994 that nicotine was not addictive. Opioid manufacturers argued their products were not habit-forming when used as directed. The formula is consistent: redefine the standard of harm narrowly enough that the product falls outside it. "Clinically" is doing a lot of work in Mosseri's sentence. Something can be compulsive, habit-forming, and psychologically damaging without meeting a clinical definition of addiction. The qualifier is the defense.

Mosseri also testified that Meta was "careful to test features used by young people." This is the more revealing claim. Careful to test—but careful to test for what? Engagement? Retention? Time spent? Safety? The nature of the testing is the question the trial is designed to answer. Internal documents previously filed showed that Meta axed research, based on a 2020 Nielsen survey, that found its platforms were associated with teen mental health harms. Being careful to test and being careful to act on the results are different things.

Dear Algo

While Mosseri testified about algorithms in a Los Angeles courtroom, Meta's engineers shipped one to Threads.

February 2026
Meta launches an AI feature that lets Threads users temporarily personalize their feed by specifying topics in a public post that begins with "Dear Algo"
CNBC

"Dear Algo." Users write a letter to the algorithm. They address it by name, tell it what they want, and the algorithm reshapes their feed accordingly. It's a product feature, but it's also a philosophical reframing. If the user writes the instructions, the algorithm is just following orders. The engagement is requested, not imposed. The personalization is a choice, not a manipulation.

This is the trial defense rebuilt as a product. The central claim against Meta is that its algorithms are designed to maximize engagement at the expense of user wellbeing—that the system decides what you see, and what it decides to show you is what keeps you scrolling. "Dear Algo" inverts the relationship. You tell the algorithm what you want. If you get hooked, you wrote the letter.

The timing may be coincidental. Product launches don't coordinate with trial schedules. But the design philosophy is not coincidental. Meta has spent three years defending itself against the claim that algorithms control users. "Dear Algo" is the architectural response: build the product so the user appears to control the algorithm.

The Next Products

The trial is about Instagram circa 2019–2023. The company being tried is building something else.

7 million
Meta AI glasses sold in 2025
$10 billion
new data center in Indiana
EssilorLuxottica reported that it sold over 7 million Meta AI glasses in 2025—up from 2 million combined in 2023 and 2024. A 3.5x increase in a single year. These are Ray-Ban frames with cameras, microphones, speakers, and AI. They sit on your face. They see what you see. They hear what you hear. They are more intimate than any phone app, more present than any feed. If Instagram's algorithm is the subject of the trial, Meta's AI glasses are the next generation of the same question: what happens when a company optimized for engagement builds a product that lives on your body?

The $10B data center in Lebanon, Indiana will draw one gigawatt of power when it comes online in late 2027 or early 2028. It's not being built for Instagram. It's being built for AI—the infrastructure behind the glasses, behind "Dear Algo," behind whatever Meta builds next. On the same day the head of Instagram testified about the last product, Meta committed $10 billion to the next one.

The Lag

The structural pattern is the insight. Trace the timeline:

Three and a half years from first lawsuit to opening testimony. In that time, Meta launched Threads, built AI into its entire product stack, shipped millions of AI glasses, and committed tens of billions to data centers. The product on trial is already two generations old. The features plaintiffs are challenging have been rebuilt, renamed, and superseded by AI systems that didn't exist when the lawsuits were filed.

This isn't unique to Meta. It's the structural condition of technology accountability. Courts operate on the timeline of precedent, discovery, and due process—measured in years. Technology companies operate on the timeline of product cycles—measured in months. By the time any verdict arrives, the product will have changed. By the time any remedy is imposed, it will apply to features that may no longer exist.

TikTok and Snap understood this. They settled—not because they admitted fault, but because trials that adjudicate yesterday's product are more expensive than settlements that close the book. Meta and YouTube chose to fight. YouTube's strategy: argue it's not even social media. Meta's strategy: argue the product was never harmful. Both strategies concede the same thing: the trial is about the past.

The Record

The trial will matter anyway. Not because of the verdict—though damages could be significant—but because of the record it creates. Trials produce sworn testimony, internal documents, and contemporaneous evidence that no settlement can replicate. We already know, from previously filed documents, that Meta's own research linked its platforms to teen mental health harms and that the company chose not to act on those findings. The trial will produce more.

"Not clinically addictive" is now part of the record. It will be quoted, analyzed, and remembered. Whether it ages like a reasonable scientific claim or like the tobacco executives' testimony depends on what the next decade reveals about what Meta knew and when it knew it.

But even as the record is being built, the company is moving. On the day Mosseri sat in a witness chair defending the algorithm, Meta invited its users to write it a letter. Dear Algo. The algorithm will read your request, reshape your feed, and give you exactly what you asked for. And if you can't stop scrolling—well, you're the one who wrote the letter.