The tobacco litigation story is usually told triumphantly: a malicious industry held accountable, victims vindicated, and a dangerous product finally regulated. What that story leaves out is directly relevant to what happens next with social media.
The tobacco litigation succeeded not because cigarettes were addictive, but because the industry had committed fraud. For decades, tobacco companies knew about nicotine’s addictive properties and the link between smoking and cancer, and they actively concealed that knowledge. The lawsuits that worked were the ones that went after the concealment directly. But once that concealment was exposed and disclosure became mandatory, the personal responsibility narrative reasserted itself: adults who smoke know the risks, and they choose to smoke anyway.
The processed food industry traced an almost identical arc. In the 1970s, consumer advocates petitioned the Federal Trade Commission to restrict advertising of junk foods to children. The industry fought back hard. A Washington Post editorial called the proposal a measure to “shield children from their parents’ weaknesses.” Decades later, a bill formally protecting fast food companies from obesity lawsuits passed the House. It stalled in the Senate, but the industry managed to pass similar laws in states across the country. The message was that obesity was a matter of willpower. Despite well-documented socio-environmental determinants of diet, the personal responsibility narrative stuck.
Last month’s verdict is being hailed as a break in that pattern, but I am not convinced it is.
The pattern across tobacco and processed food suggests a predictable trajectory for social media. Meta’s internal research documenting harms to teenage girls, first suppressed and then exposed, was its big tobacco moment. The litigation that followed reflects that reckoning. But as the stories of tobacco and processed food demonstrate, after exposure come disclosure and warnings and, above all, a reassertion of personal responsibility. The underlying product remains as it was.
The fixes already being floated in the wake of the verdict follow that pattern exactly. Age verification, parental controls, push notification settings, and various disclosures all place the burden of protection on individual users (or their parents), while leaving the design choices a jury just found unreasonably dangerous exactly where they are. It all goes back to the notice-and-consent model: the idea that informed individuals can and should manage their own exposure to harm.
