A Los Angeles courtroom is hosting what may become the most consequential legal challenge Big Tech has ever faced.
This is an inflection point in the global debate over Big Tech liability: For the first time, an American jury is being asked to decide whether platform design itself can give rise to product liability – not because of what users post, but because of how the platform was built.
Here, Judge Kuhl established that the conduct-versus-content distinction – treating algorithmic design choices as the company’s own conduct rather than as the protected publication of third-party speech – was a viable legal theory for a jury to evaluate. This fine-grained approach, evaluating each design feature individually and recognizing the added complexity of technology products’ design, offers a potential road map for courts nationwide.
Internal communications disclosed in the K.G.M. proceedings have included exchanges among Meta employees comparing the platform’s effects to pushing drugs and gambling. Whether this internal awareness constitutes the kind of corporate knowledge that supports liability is a central factual question for the jury to decide.
Foreseeability operates on two levels here. First, a manufacturer has a duty to exercise reasonable care in designing its product, and that duty extends to harms that are reasonably foreseeable. Second, the plaintiff must show that the type of injury suffered was a foreseeable consequence of the design choice. The manufacturer need not have foreseen the exact injury to the exact plaintiff, but the general category of harm must have been within the range of what a reasonable designer would anticipate.
This is why the Facebook Papers and internal Meta research are so legally significant in K.G.M.’s case: They go directly to establishing that the company’s own researchers identified the specific categories of harm – depression, body dysmorphia, compulsive use patterns among adolescent girls – that the plaintiff alleges she suffered. If the company’s own data flagged these risks and leadership nevertheless continued on the same design trajectory, that would considerably strengthen the foreseeability element.
The K.G.M. trial represents something more fundamental: the proposition that algorithmic design decisions are product decisions, carrying real obligations of safety and accountability. If this framework takes hold, every platform will need to reconsider not just what content appears, but why and how it is delivered.



