
dailymail.co.uk
Which? Investigation Exposes Excessive Data Demands by Popular Smartphone Apps
A Which? investigation revealed that 20 popular apps, downloaded more than 28 billion times in total, request excessive "risky" permissions, including microphone and location access, raising concerns about user data privacy. Xiaomi Home requested 91 permissions, Facebook 69, and WhatsApp 66.
- Why do popular apps request extensive permissions, and what are the potential consequences for users who grant these permissions?
- The investigation revealed that apps like Facebook (69 permissions, 6 risky), WhatsApp (66, 6 risky), and Xiaomi Home (91, 5 risky) request extensive permissions. Two apps sent data to China, highlighting cross-border data transfer issues. This pattern suggests a widespread problem of excessive data collection by popular apps.
- What are the most significant privacy risks revealed by the Which? investigation of popular smartphone apps, and how do these risks affect users?
- Which? investigated 20 popular apps and found that every one requested "risky" permissions, such as location and microphone access, even when these were unnecessary for the app's core functionality. This raises concerns about user data privacy, as granting such permissions can expose vast amounts of personal information.
- What future regulatory changes or industry practices could effectively mitigate the privacy risks associated with excessive data collection by smartphone apps?
- The findings indicate a need for greater user awareness regarding app permissions and increased transparency from app developers. Future regulations might focus on limiting unnecessary data requests and improving cross-border data transfer controls. This could lead to more privacy-focused app design.
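The user-awareness point above can be illustrated with a small sketch. The following Python snippet flags permissions commonly regarded as sensitive in a hypothetical app's requested-permission list; the `RISKY` set is an assumption for illustration, loosely based on the categories named in the investigation (location, microphone, camera, contacts), not Which?'s actual methodology.

```python
# Illustrative set of Android permissions often treated as "risky".
# This list is an assumption for the example, not Which?'s criteria.
RISKY = {
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.RECORD_AUDIO",
    "android.permission.CAMERA",
    "android.permission.READ_CONTACTS",
}

def audit(requested):
    """Return the sorted subset of requested permissions considered risky."""
    return sorted(set(requested) & RISKY)

# Hypothetical permission list for a fictional app.
example = [
    "android.permission.INTERNET",
    "android.permission.ACCESS_FINE_LOCATION",
    "android.permission.RECORD_AUDIO",
]

flagged = audit(example)
print(f"{len(flagged)} risky of {len(example)} requested: {flagged}")
```

In practice, a real audit would read the permission list from an app's manifest (for example via Android's package-inspection tools) rather than a hard-coded list; the point here is only that comparing requested permissions against a sensitive-permission set is a straightforward transparency check.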
Cognitive Concepts
Framing Bias
The framing emphasizes the negative aspects of app data collection and the potential privacy risks. While some legitimate uses for permissions are acknowledged, the overall tone heavily leans towards portraying apps as 'data hungry' and their requests as 'shocking' and 'risky.' The headline and introductory paragraphs reinforce this negative framing, which might lead readers to view apps with suspicion rather than weighing the balance between functionality and privacy.
Language Bias
The article uses charged language such as 'data hungry,' 'shocking,' 'risky,' and 'scary' to describe app data collection practices. These terms are emotive and contribute to a negative portrayal of the apps. More neutral alternatives could include 'extensive data requests,' 'significant permissions,' and 'potentially sensitive data.'
Bias by Omission
The analysis focuses primarily on the permissions requested by apps and the potential privacy implications. However, it omits discussion of the specific data collected, how that data is used, and the security measures in place to protect it. This omission limits the scope of the analysis and prevents a full understanding of the privacy risks.
False Dichotomy
The article presents a somewhat false dichotomy by framing the issue as apps needing access to data versus users' privacy. It doesn't fully explore the complexities of data usage, legitimate needs for certain permissions (e.g., location for mapping apps), or the potential benefits of data-driven services. This oversimplification could mislead readers into believing all data collection is inherently harmful.
Sustainable Development Goals
The article highlights how numerous popular apps request excessive permissions to access user data, often without a clear need. This excessive data collection impacts responsible consumption and production by raising concerns about privacy violation and the potential misuse of personal information for targeted advertising and other purposes. The lack of transparency and user awareness around these permissions contributes to unsustainable consumption patterns.