
theguardian.com
Social Media's Deadly Algorithms: "Can't Look Away" Exposes the High Cost of Tech's Pursuit of Profit
"Can't Look Away" details the legal battles fought by families against social media giants like Snapchat and Facebook, seeking to hold them accountable for the deaths of teens who engaged with harmful online content, including auto-asphyxiation videos and drug deals, and challenging the companies' Section 230 immunity.
- How does the film's comparison of social media companies to the tobacco industry strengthen its argument regarding corporate culpability in the deaths of young users?
- The documentary connects the documented cases of teen deaths to the broader issue of social media algorithms potentially exacerbating negative tendencies. It argues that these companies, akin to the tobacco industry, knowingly prioritize profit despite awareness of harmful consequences. The legal fight against Section 230, which shields these companies from liability, represents a key challenge in holding them accountable.
- What immediate impact do the lawsuits against social media companies, highlighted in "Can't Look Away", have on the ongoing debate about online safety and Section 230?
- "Can't Look Away" details how social media companies prioritize profit over user safety, illustrated by cases of teens who died after engaging with harmful online content such as auto-asphyxiation videos and drug deals facilitated on Snapchat. By showcasing the families' struggle for justice against tech giants shielded by Section 230 immunity, the lawsuits highlighted in the film sharpen the ongoing debate about online safety and platform liability.
- What long-term systemic changes, beyond legal challenges, might be needed to address the issues raised by "Can't Look Away" concerning the intersection of social media, mental health, and algorithmic design?
- Future implications include potential legal precedents that may limit the liability protections of social media companies under Section 230. This could force significant changes in content moderation policies and algorithm design, potentially impacting the spread of harmful material. The film suggests a long-term trend towards increased scrutiny and potential regulation of the tech industry's role in public health.
Cognitive Concepts
Framing Bias
The framing of the review is strongly negative towards social media companies. The choice of words like "cynicism," "obfuscation," and "ruthlessness" sets a critical tone from the outset. Any headlines and subheadings would likely reflect this negative slant, shaping the reader's perception before they engage with the details.
Language Bias
The language used is emotionally charged, employing terms such as "addicts' narrative," "harrowing indictment," and "ruthlessness." These words carry strong negative connotations and lack neutrality. More neutral alternatives could include: instead of "addicts' narrative", "a pattern of behavior designed to increase engagement"; instead of "harrowing indictment", "critical analysis"; and instead of "ruthlessness", "aggressive business practices".
Bias by Omission
The review focuses heavily on the negative impacts of social media on young people and the legal battles against tech companies. It mentions the use of Snapchat by drug dealers but does not explore potential positive uses or impacts of social media platforms, omitting a balanced perspective. The review also does not delve into the complexities of Section 230 or alternative legal approaches. These omissions limit a fully informed conclusion about the issue and the overall responsibilities of tech companies.
False Dichotomy
The review presents a somewhat simplistic dichotomy in which tech companies either prioritize profit or user well-being. While this is a significant concern, it doesn't explore the nuances of the business models, the complexities of content moderation, or other potential factors influencing the companies' decisions.
Sustainable Development Goals
The documentary highlights the negative impact of social media on the mental health of young people: exposure to harmful content and predatory activities facilitated by social media platforms has led to self-harm, suicide, and overdoses. The lack of effective content moderation and the prioritization of profit over user safety directly contribute to these negative health outcomes.