AI Spyware on your Phone - Client Side Scanning
Abstract
Client-Side Scanning (CSS) is a technology that scans your files (photos, messages, etc.) on your device to detect harmful content. While designed to combat serious crimes, it raises privacy concerns about device surveillance, potential misuse by governments or bad actors, and expanded monitoring of sensitive or lawful content.
This article explains CSS, its implications for privacy and security, and why it matters to everyone, not just tech experts. It also provides actionable tips to help protect your data and mitigate the risks of CSS.
What is client-side scanning?
Client-Side Scanning (CSS) is a technology that allows your device, like your phone or computer, to check files (such as photos, messages, or videos) for specific content by comparing those files' "fingerprints" against known fingerprints hosted on a server. Imagine your phone acting like a security guard that scans your pictures for anything illegal or inappropriate before they get uploaded to the internet or cloud storage. The stated goal of CSS is often to stop crimes like the spread of harmful materials, such as child abuse images, while avoiding your raw data being sent to a server somewhere for server-side scanning to take place.
To work, CSS uses special tools to compare your files against a database of known content, but it does this directly on your device instead of on a company’s servers. This means your data doesn’t need to be sent anywhere else for scanning. If your device finds a match with harmful content, it can alert the company or authorities, depending on how the system is set up.
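The matching step described above can be sketched in a few lines. This toy example uses exact SHA-256 digests as fingerprints for simplicity; real CSS deployments use perceptual hashes (so resized or re-encoded copies still match), and the database and reporting pipeline are of course not a local Python set.

```python
import hashlib

# Hypothetical database of "known bad" fingerprints (SHA-256 hex digests).
# In a real system this would be a (possibly encrypted) database of
# perceptual hashes distributed to the device, not exact digests.
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"example flagged content").hexdigest(),
}

def fingerprint(data: bytes) -> str:
    """Compute a fingerprint for a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def scan_on_device(files: dict[str, bytes]) -> list[str]:
    """Return the names of files whose fingerprints match the database."""
    return [name for name, data in files.items()
            if fingerprint(data) in KNOWN_FINGERPRINTS]

matches = scan_on_device({
    "vacation.jpg": b"harmless photo bytes",
    "flagged.jpg": b"example flagged content",
})
print(matches)  # prints ['flagged.jpg']: only the matching file is reported
```

Note that everything here runs on the device itself; only the fact of a match (not the file) would need to leave it, which is exactly why critics describe the device as being conscripted into the surveillance role.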
The idea is founded in good intentions, but many are concerned about how it is used. There is worry it could be used to invade privacy, because your personal device is being used to monitor what you store or share. There is also the fear that governments or others might expand the technology to monitor lawful but sensitive things, like political opinions or private conversations. While CSS is meant to protect people, critics argue it could easily be misused, posing risks to everyone's privacy and security.
Why does it matter to me?
This matters to the average person because it directly impacts privacy, security, and how personal devices like smartphones or laptops are used. While it’s often touted as a way to protect against serious crimes like child abuse or terrorism, it raises concerns about who controls the scanning and how it could be used in the future.
CSS involves your device scanning your personal files, such as documents, photos, or messages, without your explicit consent or any visibility into what it is looking for. Even if the system is designed to look for specific harmful content, it means your device is being used as a surveillance tool. This shifts control over your data from you to companies or governments, eroding true privacy over your thoughts and ideas. Over time, there is a risk that this type of surveillance could expand to target other types of content, like political speech or private conversations, especially in countries with weaker protections for personal freedoms. Governments around the world have already taken action against basic security and privacy controls, or have even outright banned such technologies.
CSS could also create new security risks. If scanning systems are hacked or misused, they could be used to spy on people or falsely accuse someone of having illegal content. With even large governments and corporations getting breached regularly, bad actors could leverage these tools even if those in control have good intentions. For the average person, this means their trust in their devices and the privacy of their data might be at risk. In a world where we rely on our devices for work, communication, and personal storage, these concerns make CSS a topic that affects everyone, not just tech experts.
Actions to take
Technologies and terms of service are ever evolving, so these recommendations may change over time. There are a few key steps you can consider to help mitigate the potential risks of, and exposure to, technologies such as CSS.
- Use Open-Source software or privacy centric tech
- Switch to software and platforms that are transparent and do not implement CSS. Look for open-source apps (like Signal for messaging) that prioritize end-to-end encryption and do not scan your data. Open-source tools allow independent experts to verify their security and ensure there are no hidden surveillance features.
- Switch to Linux and De-Googled Android
- Consider switching away from Microsoft Windows and Apple macOS towards open-source desktop OSes like Linux. Both Microsoft and Apple have been slowly integrating CSS and adjacent technologies into their operating systems over the years.
- Consider switching off of Apple’s iOS and stock Android towards something like GrapheneOS.
- Avoid most Cloud Services
- When you send your data into “the cloud” you almost always relinquish control of that data. You should consider self-hosted options if you are technically inclined, or keep copies locally.
- If you do use cloud storage consider using something like VeraCrypt to upload only fully encrypted containers so CSS is mitigated.
- Review application updates manually instead of relying on automatic updates. While the current implementation of CSS-enabled software can respect some level of privacy, code changes can swiftly change that.
- Most importantly, be vigilant
- Stay updated on emerging technologies and policy changes. Support organizations and groups who oppose intrusive tech such as CSS. Public knowledge of, and resistance to, such technologies will hopefully at minimum delay the surveillance state and further erosion of privacy.
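The encrypt-before-upload advice above can be illustrated with a minimal sketch. Here openssl stands in for a VeraCrypt container, since the principle is the same: only ciphertext ever leaves your machine, so there is nothing meaningful for a scanner to fingerprint. The file names and passphrase are placeholders.

```shell
# Create a file locally, then encrypt it BEFORE it touches cloud storage.
echo "private notes" > notes.txt

# AES-256-CBC with a key derived from a passphrase via PBKDF2.
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -in notes.txt -out notes.txt.enc \
    -pass pass:use-a-strong-passphrase

# Only notes.txt.enc would be uploaded. To recover it locally later:
openssl enc -d -aes-256-cbc -pbkdf2 \
    -in notes.txt.enc -out notes.decrypted.txt \
    -pass pass:use-a-strong-passphrase

cmp notes.txt notes.decrypted.txt && echo "round-trip OK"
```

A VeraCrypt container works the same way at a larger scale: the provider (and any scanner, client- or server-side) sees only an opaque encrypted blob.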
References
| Article | Link | Summary |
|---|---|---|
| EFF - In 2021, We Told Apple: Don’t Scan Our Phones | Article | Summary |
| Ask HN: If client side scanning on devices becomes mandatory, what would you do? | Article | Summary |
| IEEE - Attitudes towards Client-Side Scanning for CSAM, Terrorism, Drug Trafficking, Drug Use and Tax Evasion in Germany | Article | Summary |
| Bugs in our pockets: the risks of client-side scanning | Article | Summary |
| Apple quietly deletes details of derided CSAM scanning tech from its Child Safety page without explanation | Article | Summary |
Article Summaries
Ask HN If client side scanning on devices becomes mandatory
- Overview
- In a recent Hacker News thread titled “Ask HN: If client side scanning on devices becomes mandatory, what would you do?”, users discussed potential responses to the implementation of mandatory client-side scanning (CSS) technologies.
- Key Points from the Discussion:
- Adoption of Open-Source and Alternative Software: Many participants suggested transitioning to open-source applications that maintain end-to-end encryption without incorporating CSS. This approach is seen as a way to preserve privacy and control over personal data.
- Use of Alternative Operating Systems: Some users indicated a preference for operating systems like Linux, which may offer greater flexibility and resistance to mandatory CSS implementations compared to mainstream systems like Windows or macOS.
- Segregation of Device Usage: A strategy mentioned involves dedicating specific devices to particular tasks. For instance, using one machine for sensitive activities and another for general use, thereby minimizing exposure to potential privacy intrusions.
- Concerns Over Government Overreach: Participants expressed apprehension that mandatory CSS could lead to increased surveillance and misuse by authorities, potentially infringing on individual rights and freedoms.
- Potential for Resistance and Sabotage: There was discussion about actively resisting such measures, including disabling scanning features or generating false positives to undermine the effectiveness of CSS.
- Overall, the thread reflects a strong inclination among users to seek out and support technologies and practices that uphold privacy and resist intrusive scanning mandates.
IEEE Attitudes towards Client-Side Scanning for CSAM Terrorism Drug Trafficking Drug Use and Tax Evasion in Germany
- Abstract The document investigates public attitudes in Germany toward Client-Side Scanning (CSS) technologies, focusing on their potential for combating crimes like CSAM, terrorism, drug trafficking, and tax evasion. A survey with 1,062 participants reveals broad support for CSS in combating severe crimes like CSAM, with concerns about privacy risks and abuse by authorities.
- Introduction The section highlights the controversy surrounding Apple’s 2021 CSS proposal for detecting CSAM on users’ devices and subsequent legislative pushes in the EU. It frames the survey’s purpose as understanding the German public’s perspective on CSS compared to cloud-based approaches.
- Related Work This section reviews previous research on CSS and debates between privacy and surveillance. It notes technical critiques of CSS and concerns over potential misuse and feature creep. Public opinion studies related to digital privacy and crime prevention are also examined.
- Methodology Describes the survey’s design and variables, including crime type (e.g., CSAM, terrorism), data type (images, text), scanning location (device or cloud), and responsible organization (government or private). Participants were primed with positive or negative context before answering. Demographics and exploratory variables like trust in institutions and parenthood were analyzed.
- Results The findings indicate high overall support for CSS, particularly for combating CSAM (65%) and terrorism (61%). Trust in government and law enforcement positively influenced acceptance, while private companies garnered less support. Concerns about misuse were significant even among CSS supporters.
- Discussion Analyzes key insights, emphasizing the importance of addressing feature creep and maintaining public trust. Highlights the distinction between public and expert perspectives on privacy trade-offs, advocating for transparent risk-benefit analyses and inclusive debate.
- Conclusion Summarizes the survey’s findings and implications for policymakers and researchers. Calls for balanced approaches that respect privacy while addressing serious crimes, emphasizing the need for continued research into public attitudes and technical safeguards.
Bugs in our pockets the risks of client side scanning
- Abstract The document critiques Client-Side Scanning (CSS) technologies, which aim to balance encryption with crime prevention. CSS systems analyze user data on their devices before or after encryption. While proponents claim it respects privacy and enables crime detection, the authors argue it poses severe security and privacy risks, with limited efficacy for law enforcement.
- Introduction The rise of strong encryption has challenged law enforcement’s access to evidence. CSS, as an alternative, introduces on-device content analysis. It is framed as a potential compromise but fundamentally alters the surveillance paradigm, moving from server-based to user-device-based scanning. The introduction also highlights unresolved questions around transparency, misuse, and design limitations.
- Methodology and Terminology This section outlines the definitions and scope of CSS. It covers key terms like “content” (text, images, videos) and “targeted content” (illegal material like CSAM). The authors discuss existing technologies for content scanning, such as perceptual hashing and machine learning, which form the basis for CSS implementation.
- Current Practices and Challenges Examines existing server-side scanning systems (e.g., spam filters) and contrasts them with CSS. Server-side scanning benefits from centralized control and scalability but is computationally expensive. CSS decentralizes scanning, raising unique challenges such as security vulnerabilities on individual devices and risks of unauthorized access.
- Privacy and Security Risks CSS introduces new vulnerabilities: Expansion of Scanning: Facilitates large-scale surveillance beyond original purposes, allowing unauthorized parties to target non-criminal content. Local Adversaries: Risks misuse by abusive partners, employers, or governments. Technical Vulnerabilities: Creates new attack surfaces, enabling adversaries to manipulate scanning systems.
- Legal and Policy Considerations Highlights the need for legal frameworks to regulate CSS. The authors argue that CSS violates principles of proportionality, privacy, and specificity, which are critical in democratic societies. They critique the potential for misuse, emphasizing the lack of transparency and oversight.
- Adversarial Efficacy CSS struggles in adversarial contexts where users attempt to bypass detection or flood systems with false positives. These challenges are exacerbated when scanning happens on personal devices, as adversaries can exploit vulnerabilities more easily than in centralized systems.
- Recommendations and Design Principles The authors propose principles for secure CSS design: Minimize trust dependencies (e.g., reduce reliance on single actors). Ensure open and auditable systems. Limit scope and purpose to prevent misuse. They emphasize the importance of balancing law enforcement needs with privacy rights.
- Conclusion The paper concludes that CSS, in its current form, fails to achieve a balance between privacy and crime prevention. It introduces significant risks without providing reliable benefits. The authors call for transparent debates and robust safeguards before deploying such technologies.
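The perceptual hashing mentioned in the paper's terminology section (the basis of systems like PhotoDNA and Apple's NeuralHash) can be illustrated with a toy "average hash" over an 8x8 grayscale grid. The grid values and the 8x8 size are illustrative assumptions; real algorithms are far more robust to crops, rotations, and re-encoding.

```python
# Toy "average hash": each pixel contributes one bit, set when the pixel
# is at least as bright as the image's mean. Similar images yield similar
# bit-strings, so near-duplicates still match after small edits.

def average_hash(pixels: list[list[int]]) -> int:
    """Hash an 8x8 grid of 0-255 grayscale values into a 64-bit integer."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes (0 = identical)."""
    return bin(a ^ b).count("1")

# A simple gradient image, and a slightly brightened copy of it.
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
edited = [[min(255, p + 3) for p in row] for row in original]

h1, h2 = average_hash(original), average_hash(edited)
print(hamming(h1, h2))  # prints 0: the edit does not change the hash
```

This resilience to small edits is exactly what makes perceptual hashes useful for matching known content, and also what the paper flags as risky: near-matches mean false positives are possible on entirely innocent images.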
Apple quietly deletes details of derided CSAM scanning tech from its Child Safety page without explanation
- Introduction Apple initially announced plans to deploy a Child Sexual Abuse Material (CSAM) Detection system as part of its child safety initiatives. The feature would use on-device scanning to detect known CSAM before iCloud synchronization. While framed as a way to protect children, the approach sparked significant controversy due to privacy concerns and potential misuse.
- The CSAM Detection Plan The proposed system relied on a hybrid model combining on-device and server-side components. Images would be hashed locally and matched against a database of known CSAM hashes using private set intersection, a cryptographic method designed to balance detection and privacy. The plan aimed to enable end-to-end encrypted backups while addressing law enforcement concerns.
- Public Backlash The announcement faced widespread criticism from advocacy groups, security experts, and privacy advocates. Critics argued that: The system violated user privacy by co-opting personal devices for scanning without consent. It normalized intrusive client-side scanning, opening the door for abuse by governments and corporations. Apple’s assurances of resisting external demands for broader scanning were dubious, given its compliance with restrictive policies in countries like China and Russia. Prominent figures like Edward Snowden and Matthew Green highlighted the risks of the system, including the potential for global surveillance and privacy erosion.
- Apple’s Response In September, Apple announced a delay in deploying the feature, citing feedback from customers, researchers, and advocacy groups. The company promised to collect additional input and refine the system before release. However, by December, references to the CSAM detection plan were quietly removed from Apple’s Child Safety webpage without explanation.
- Expert Critique and Developments Security and cryptography experts published a detailed paper in October opposing client-side scanning, citing significant security and privacy risks. Matthew Green speculated that Apple might shift to server-side scanning, which could undermine the company’s ability to offer full end-to-end encryption for iCloud.
- Conclusion and Update Apple’s decision to remove references to the CSAM detection plan suggests a reevaluation of its approach. However, the company has clarified that the plan is not entirely scrapped but remains delayed. The quiet changes to its Child Safety webpage leave questions about the future of the technology and its implications for privacy.
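The private set intersection mentioned in the CSAM detection plan above can be sketched with a toy commutative-masking protocol (sometimes called DH-style PSI). This is not Apple's actual construction, which adds threshold secret sharing and safety vouchers; it only shows the core idea of how two parties can discover overlapping items without revealing the rest of their sets.

```python
import hashlib
import secrets

# Masking by exponentiation commutes: (H(x)^a)^b == (H(x)^b)^a mod p,
# so an item held by BOTH parties ends up with the same double-masked
# value, while each side's secret exponent hides everything else.
P = 2**521 - 1  # a Mersenne prime; fine for a toy demo, not production

def h(x: bytes) -> int:
    """Map an item into the group by hashing."""
    return int.from_bytes(hashlib.sha256(x).digest(), "big") % P

a = secrets.randbelow(P - 2) + 1  # client's secret exponent
b = secrets.randbelow(P - 2) + 1  # server's secret exponent

db = [b"known item 1", b"known item 2"]      # server's hash database
photos = [b"holiday pic", b"known item 2"]   # client's photos

# Each side masks its own items with its own secret and shares the result.
server_masked = {pow(h(x), b, P) for x in db}
client_masked = [pow(h(y), a, P) for y in photos]

# Each side then applies its secret to the OTHER side's masked values;
# only items present in both sets collide.
double_server = {pow(v, a, P) for v in server_masked}
double_client = [pow(v, b, P) for v in client_masked]

matches = sum(1 for v in double_client if v in double_server)
print(matches)  # prints 1: one photo is in the database
```

Even in this toy form, note what the protocol does and does not hide: non-matching items stay private, but the fact and count of matches leaks, which is precisely the channel critics worried could be repurposed.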