
An Asbestos Lawyer’s New Crusade: Suing Social Media Companies
Photoillustration by Clark Miller
Matthew Bergman won $900 million for plaintiffs over a 25-year career filing asbestos lawsuits. Now he sues Meta, TikTok and other social media companies, claiming that they’re hooking teenagers on their apps.
In January, after members of the Senate Judiciary Committee had spent hours grilling tech CEOs about their platforms’ effects on children, Missouri Sen. Josh Hawley pounced on Meta Platforms CEO Mark Zuckerberg.
Hawley, a Republican firebrand and a vocal tech critic, accused Zuckerberg of ignoring internal company data suggesting that Meta’s Instagram made life worse for some teenage girls and exposed them to unwanted sexual advances and nudity. He then asked Zuckerberg whether he wanted to turn around and apologize to the families in the gallery whose children had died or been harmed in incidents involving social media platforms, including Instagram.
“I’m sorry for everything that you’ve all gone through,” Zuckerberg said to the people, a couple dozen of whom stood holding pictures of their children.
One of the faces in the crowd, just a few yards from Zuckerberg, belonged to Matthew Bergman, a little-known Seattle-based attorney who for years had been working for a moment just like this. Many of the parents who stood to meet Zuckerberg’s gaze were clients of Bergman’s firm, the Social Media Victims Law Center, which he founded in 2021.
Bergman, who cut his teeth suing companies on behalf of asbestos exposure victims, has become an unexpected thorn in the side of big tech over the last three years. His firm, which he says is the first in the country exclusively dedicated to suing social media companies, has brought hundreds of lawsuits against companies like Meta and ByteDance, the parent company of TikTok, and has hundreds more in the works.
The subjects of the suits vary, but they generally focus on claims relating to adolescent social media addiction; the promotion of content encouraging eating disorders and suicide; and product features that allegedly allow kids to easily connect with drug dealers, sex traffickers or pedophiles.
It’s too soon to say whether his clients’ claims will be successful. Most of Bergman’s cases have been consolidated into two slow-moving legal proceedings, which are currently working their way through federal and state courts. But his track record in asbestos litigation suggests he has an eye for cases where plaintiffs can win big. Bergman said he’s recovered around $900 million for clients over the past 25 years.
“I always regretted that I was too young to have been a civil rights lawyer during the height of the civil rights movement,” Bergman said. “And this work to me has some of the same moral clarity and social purpose that the civil rights cases did.”
Bergman has become one of the leading practitioners of a novel legal strategy, now gaining wider adoption, to hold social media companies accountable for harms critics say they’re causing young people. The approach circumvents a key legal protection internet companies have relied on for decades—Section 230 of the Communications Decency Act of 1996—to immunize themselves against claims stemming from user-generated content.
Instead, Bergman and other lawyers aim to hold tech platforms responsible using theories of legal liability typically reserved for the manufacturers of defective physical products, from faulty silicone breast implants to car parts. In essence, the attorneys argue that platforms like Instagram and TikTok are addictive and harmful to kids’ mental health because they have been defectively and negligently designed.
Attorneys around the country have filed more than a thousand lawsuits making such claims against social media companies over the last two years, in addition to the ones involving Bergman’s firm.

It’s a strategy that would have seemed unfathomable just a few years ago, legal experts say. Eric Goldman, a professor at Santa Clara University Law School and co-director of its High Tech Law Institute, said he assumed judges would reject such claims of product liability, citing Section 230. He has been surprised by the number of courts now willing to entertain the theory, which has grown more popular among plaintiffs’ attorneys after a 2021 legal decision validated its use in a case involving Snap.
“The answer to this question could very well change the fate of the internet,” he said. “Because if plaintiffs win, we’ve just seen the tip of the iceberg.”
Still, as the list of lawsuits continues to grow, some critics worry that plaintiffs’ attorneys like Bergman are more interested in tapping into the tech industry’s deep pockets than they are in respecting legal precedent.
“It really has become a cottage industry,” said John Browning, a jurist in residence at Faulkner University and a former justice on Texas’ Fifth Court of Appeals. “This attitude of ‘We’re gonna turn big tech into the next big pharma or big tobacco’ seems to be more of an ambulance-chasing sort of mentality than actual concern for advancing a meaningful legal theory.”
Bergman, for his part, rejects that stereotype and says his motivation in taking on social media companies isn’t financial.
“I think some of the disparagement of plaintiffs’ lawyers might result from the idea that, unlike defense lawyers, we don’t get paid unless we win, and so we are often very resolute about how we prosecute the cases,” said Bergman. “Every field has its bad apples, but I don’t think you get into this work unless you care about people.”
U.S. consumers have long relied on personal injury litigation to hold companies responsible for harms in the absence of regulatory action. Early lawsuits against asbestos manufacturers and tobacco companies in the ’70s and ’80s helped advance legislative efforts and shift public perception.
Bergman hopes his social media suits will play a similar role. The lawsuits come as federal and state lawmakers are also pushing to shield children from online harms through a slew of bills designed to restrict minors’ use of social media and require more stringent privacy and safety features.
A bill recently signed into law in New York bans social media companies from showing children algorithmically curated feeds or sending late-night notifications without parental consent. The state’s lawmakers described those features as addictive and detrimental to the mental health of kids.
For his part, Bergman says he has worked with a bipartisan group of U.S. senators to advance bills like the Kids Online Safety Act, which would impose a “duty of care” on social media platforms catering to children that requires them to limit addictive features and take steps to prevent the promotion to young users of content related to suicide, eating disorders and substance abuse.
Critics say the bill restricts protected online speech and empowers the government to censor content it finds objectionable. But advocates of the bill, like Bergman and his clients, say the tech industry has been given carte blanche for too long to make products it knows can harm children’s mental health.
“One of the most disheartening things about this phenomenon is that the things that make social media so dangerous can also be easily fixed,” said Bergman, who pointed to recent legislation passed by the EU and United Kingdom as examples. “This is not a moonshot. This is simply tweaking the existing technologies in a less dangerous direction.”
Bergman’s interest in the law began when he was 12 and stumbled upon a book that had once belonged to his late grandfather: “Attorney for the Damned.” The book chronicles the arguments of Clarence Darrow, a lawyer best known for defending unpopular yet historically important clients in the 1920s, such as high school teacher John T. Scopes, who was tried for teaching evolution.
“I wanted to be just like him,” he recalled thinking about Darrow.
He went to law school with the goal of taking on cases representing what Darrow called “the weak, the suffering, and the poor.” For his first trial, his clients were farmworkers in eastern Washington who had been wrongfully terminated for consulting with a union. The workers didn’t speak English, and their foreman was extorting them to secure work, Bergman said.
“And we lost,” he recalled.
It was disappointing but also a learning experience. “There were opportunities to settle the case, and those opportunities did not come to fruition [because] I was riding on my high horse thinking I was a social justice warrior,” he said.
After a brief stint in corporate law, Bergman settled into a career representing asbestos victims as a plaintiffs’ attorney in 1995. A few years later, he founded his own law firm dedicated to representing victims of mesothelioma, a cancer of the lining of the lungs caused by asbestos exposure.
“Oftentimes I would get the case right after they had received the terminal diagnosis. … In many cases they died during the pendency of the case or shortly thereafter,” he said.
In 2021, Bergman decided he was ready for something new. “I thought I might litigate civil rights cases in Mississippi; I might do some public service stuff and teach more,” recalled Bergman, who for years has taught a legal strategy course at his alma mater, Lewis and Clark Law School. The course draws on the teachings of Sun Tzu, the ancient Chinese general and author of “The Art of War.”
“I did not anticipate starting a new law firm and setting up a whole new law practice,” he chuckled. “But, you know, the fates intervened.”
Bergman wasn’t the most likely candidate to become a legal crusader against big tech.
He doesn’t use social media himself and prefers to spend his free time boating and fishing. “I clerked for a very wise judge on the Tenth Circuit who has a sign on his desk saying, ‘The worst day of fishing is better than the best day of work,’” said Bergman. “I try to live true to that judicial dictum.”
He has two adult children who grew up largely offline, too. His youngest child, now in her late twenties, used Facebook as a teen, but the app was a more innocent place for kids than it is now, Bergman recalled. “They would friend each other—they would say snarky things—but it was not this all-encompassing, addictive paradigm that has subsequently emerged.”
His first real exposure to the dark side of social media was in the spring of 2021, when he watched the Netflix docudrama “The Social Dilemma.” In it, tech experts and early Meta employees argue that social media platforms are designed to be addictive in order to maximize engagement, and consequently they can result in negative social outcomes for impressionable users.
Shortly after he saw the film, an attorney at Bergman’s asbestos firm offhandedly mentioned a recently decided product liability case that was making waves in their industry: Lemmon v. Snap. At the attorney’s suggestion, Bergman researched the case.
“That was the eureka moment,” he recalled.
Lemmon v. Snap concerned three boys—two 17-year-olds and a 20-year-old—who died in a high-speed car crash one May evening in 2017. Just before the crash, one of the passengers opened the Snapchat app and took a photo using a built-in filter that recorded how fast they were going: 113 miles per hour.
At the time, some Snapchat users believed one way to earn a special achievement on the app was to record a speed of over 100 miles per hour using the feature, known as Speed Filter. It was one of many Snapchat features gamified by in-app rewards and achievements.
In 2019, the parents of two of the boys filed a lawsuit against Snap, Snapchat’s parent company, accusing it of negligently designing a product that incentivized young people to drive dangerously in pursuit of Snapchat achievements. Snap mounted the same defense most internet businesses use in similar circumstances: It argued that it was protected by Section 230 and asked a judge to dismiss the case.
The district court hearing the case granted Snap’s request for a dismissal. But the plaintiffs appealed, and in 2021, the Ninth Circuit Court of Appeals reversed the decision. The court found that Section 230 wasn’t a viable defense in this case, noting that the suit wasn’t about user-generated content, like messages, which would fall under Section 230.
Instead, the suit sought to hold the company liable for the creation of the Speed Filter itself, the court said. The lawsuit “treats Snap as a products manufacturer, accusing it of negligently designing a product (Snapchat) with a defect (the interplay between Snapchat’s reward system and the Speed Filter); thus, the duty that Snap allegedly violated sprung from its distinct capacity as a product designer.”
A shiver went down Bergman’s spine as he read it, he recalled. It reminded him of Borel v. Fibreboard, a landmark 1973 product liability case that was the first successful lawsuit filed by an insulation worker against an asbestos manufacturer.

This was the first time a court had ruled that part of a tech company’s platform could be treated as a product whose maker could be held liable for foreseeable defects. Snap and the plaintiffs eventually settled the suit for terms not disclosed in court filings. Still, the appellate court’s ruling opened the floodgates for similar litigation, said Browning, the former appellate justice.
“They proved that it could be done,” he said.
Bergman began transitioning out of his previous firm in the summer of 2021 and started researching social media addiction and its connection to what many researchers and parents view as a crisis in youth mental health.
Around the same time, The Wall Street Journal began publishing its Facebook Files series of stories, which relied on internal documents and other reporting from whistleblower Frances Haugen, a former Facebook employee. Haugen later told Congress that Meta, the parent of Facebook and Instagram, knowingly harmed children and should be regulated like big tobacco.
“I was, like, ‘Well, here it is. This is asbestos all over again,’” said Bergman. A month after Haugen testified, he founded the Social Media Victims Law Center.
Bergman sees parallels between asbestos and social media. Both were ubiquitous and popular products that people were unabashedly positive about for years.
“I mean, asbestos was the miracle fiber. It won World War II,” he said. At that time, it was commonly used as an insulator in Navy ships and for other military applications. “Similarly, when social media came out, it was going to flatten the world and make things better.”
In both cases, the initial exuberance eventually gave way to public health crises, Bergman said, pointing to the U.S. Surgeon General’s recent call for warning labels on social media due to the risks it poses to kids’ mental health.
A final parallel, he said, is misconduct. “You have companies that know that their products are hurting people and making conscious decisions to put their profits over the safety of the people that are using them,” said Bergman.
On that front, he says “the social media companies make the asbestos companies look like a bunch of Boy Scouts.”
Over the last three years, the Social Media Victims Law Center has grown from a solo operation into a team of six attorneys. “I’m working harder than I’ve ever worked in my life and I’m still behind,” said Bergman.
“We focus on kids that have suffered a particular physical or mental health harm and [are] being seen by a physician or psychotherapist in conjunction with that harm,” said Bergman. “If the standard of a case was: ‘Has a person suffered an adverse emotional reaction from social media?’ I mean, that would be everybody.”
One of Bergman’s clients is Tammy Rodriguez. In 2021, her 11-year-old daughter Selena died by suicide after suffering for years from an “extreme” addiction to Instagram and Snapchat, according to a complaint later filed by Bergman.
Adult Snapchat and Instagram users also solicited Selena Rodriguez for sexual content, which worsened her mental distress, the suit alleges. The suit accuses Instagram and Snapchat of creating products that failed to verify minor users’ age, had inadequate safeguards for young users and were designed to be addictive to children, among other things.
At first Bergman was surprised by the volume of outreach the firm received. “Not anymore,” he said. “I think we’ve just scratched the surface.”
Most of the Social Media Victims Law Center’s suits have been consolidated, along with hundreds of similar ones, into two sprawling umbrella cases—a federal multidistrict litigation in the Northern District of California and a California state consolidated proceeding in Los Angeles. Unlike in a class action, the individual lawsuits remain legally distinct from one another, but the grouping helps streamline pre-trial proceedings.
The two megacases recently completed “bellwether selection,” a process in which a small subset of the lawsuits are chosen to go to trial with the goal of giving plaintiffs and defendants a sense of which side is likely to prevail and the amount of damages juries might award. Bergman’s firm filed about 14% of the suits selected as bellwethers in the cases, including one of the 12 federal suits and four of the 23 state suits.
“We feel very strongly about our cases,” some of which will go to trial as early as next year, said Bergman.
Spokespeople from Meta, Snap and TikTok didn’t respond to requests for comment.
Some lawyers believe the litigation’s progress this far reflects growing skepticism about Section 230, which a number of lawmakers, including Ohio Sen. J.D. Vance, Donald Trump’s running mate, have said they want to reform.
“The nature of precedent is that when a crack emerges in Section 230, other judges will start expanding that crack or reinterpreting that crack,” said Goldman, the Santa Clara University Law School professor.
Browning believes this type of litigation is gaining traction because it exploits, for financial gain, societal ills such as teenagers engaging in self-harm.
“I think part of it is simply the fact that big tech and social media platforms have become this almost universally reviled target,” said Browning.
Paris Martineau (@parismartineau) is a feature writer and investigative reporter for The Information's Weekend section. Have a tip? Using a non-work device, contact her via Signal at +1 (267) 797-8655.