Barrett, Lindsey and Liccardi, Ilaria, Accidental Wiretaps: The Implications of False Positives By Always-Listening Devices For Privacy Law & Policy (February 8, 2021). Available at SSRN: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3781867
Abstract:
Always-listening devices like smart speakers, smartphones, and other voice-activated technologies create enough privacy problems even when working correctly. But these devices can also misinterpret what they hear and accidentally record their surroundings without the consent of those they record, a phenomenon known as a ‘false positive.’ The privacy practices of device users add another complication: a recent study of individual privacy expectations regarding false positives by voice assistants shows that people tend to carefully consider the privacy preferences of those closest to them when deciding whether to subject them to the risk of accidental recording, but often disregard the preferences of others. Device owners’ failure to obtain consent from those around them is compounded by the accidental recordings: the companies collecting those recordings aren’t obtaining the consent to record their subjects that the Federal Wiretap Act, state wiretapping laws, and consumer protection laws require, and they are contravening the stringent privacy assurances these companies generally provide. The laws governing surreptitious recordings also frequently rely on individual and societal expectations of privacy, which are warped by the justifiable resignation to privacy invasions that most people eventually acquire.
The result is a legal regime ill-adapted to always-listening devices, in which companies frequently violate wiretapping and consumer protection laws, regulators fail to enforce them, and privacy violations are widespread. Ubiquitous, accidental wiretaps in our homes, workplaces, and schools are just one more example of why consent-centric approaches cannot sufficiently protect our privacy, and policymakers must learn from those failures rather than doubling down on a failed model of privacy governance.
h/t, beSpacific