In a dramatic turn last March, two American juries delivered resounding verdicts against tech giants Meta and Google. In Los Angeles, a jury found the companies liable for harming young users by embedding addictive design features in their platforms. That verdict came just days after a separate jury in New Mexico found Meta liable for endangering children, awarding $375 million in damages after an intense six-week trial.
The New Mexico case was particularly revealing, featuring testimony from 40 witnesses, including insiders turned whistleblowers. Meta has announced plans to appeal both decisions, but together the rulings signal a shifting tide in how courts and the public view social media accountability.
These courtroom decisions mark a pivotal moment, challenging the long-standing defense tech companies have relied upon: that they are merely platforms, not responsible for the user-generated content they host. Historically, this argument deflected responsibility, placing the onus on users for any harassment, misinformation, or abuse occurring online.
However, the New Mexico case goes further, questioning the role of platform design itself. Lawyers for the state argued that Meta's algorithms were not passive tools but actively steered adult users toward content from teenage accounts as part of the company's engagement-driven model. This argument challenges the notion that design features like recommendation engines and endless scrolls are neutral; by shaping user interactions, they can lead directly to harmful outcomes.
“This isn't just about what people post – it's about how the app is built,” a digital sociologist noted, emphasizing the need for accountability in platform design.
The impact of these rulings particularly resonates in countries like the Philippines, where social media use ranks among the highest globally. Platforms like Facebook are integral to Filipino life, affecting everything from communication to access to information and employment. This deep integration means that the risks inherent in platform designs also pose significant threats, manifesting in forms like online harassment and rapid disinformation spread.
While personal responsibility remains crucial, these cases underscore that it cannot bear the full weight of systemic issues embedded in platform architecture. Age restrictions are often touted as a quick fix, but they fall short of addressing the fundamental problem.
As the rulings against Meta and Google make clear, there is growing recognition that responsibility in the digital age extends beyond individual actions; it begins with the architecture of the systems we use. That shift calls for a reevaluation of how platforms are designed, and for regulators and users worldwide to demand better.