If you have an Amazon Echo, try this: Say, “Alexa, tell me a joke,” but do it very quickly, so that you finish the request before Alexa “wakes up” (indicated on the Echo by the blue light). Did you notice that Alexa dutifully complied, seemingly catching the request before she (it?) was awake? There is a simple explanation: Alexa, like other artificially intelligent digital assistants, is always listening. Indeed, Alexa starts recording “a fraction of a second” before the wake word is spoken. Google Home likewise listens to snippets of conversation to detect its “hotword.”
After becoming more familiar with Alexa at home, I considered adding an Echo or similar device to my law office. I imagined the added convenience of having my own artificially intelligent digital assistant in the office. She could make notes and calendar entries, add items to my checklist, tell me who I’m meeting for lunch and where, and perhaps add time entries and quickly retrieve obscure facts, all with a simple verbal command. But since smart devices like Alexa are always listening, the added convenience comes with a tradeoff – one with substantial privacy implications. How comfortable would you be knowing that transcripts of your verbal interactions are kept by many digital assistants’ service providers?
What are your rights to restrict the use and dissemination of collected voice data? Can private parties or the federal government obtain this data through a subpoena, search warrant, or court order (or without one)? To challenge a search under the Fourth Amendment, you must have a reasonable expectation of privacy. Is such an expectation reasonable in the presence of a digital assistant? While these devices are generally designed to record information only once a designated wake word is spoken, few consider the practical reality that to detect the wake word, the device must always be listening for it.
What if a device is accidentally activated? In a recent client meeting, someone responded to a question in agreement, beginning with, “Sure, he can do that …” On a nearby iPhone, Siri heard her name and began actively listening. Even scarier: A friend recently explained that he loves his new Samsung Galaxy phone but is annoyed that Bixby (Samsung’s AI assistant) is often triggered unintentionally and seems to have a mind of his own. Accidental activations, often caused by similar-sounding words or simple software glitches, create risks of unintended recordings.
Additional risks attach even to the data you intend to share. Your privacy expectations may be undercut by a service provider’s terms of service or privacy policy for a given device. For instance, as disclosed in Alexa’s terms of use, if you access third-party services and apps through Alexa, Amazon (naturally) shares the content of your requests with those third parties. Amazon further discloses that data you provide may be stored on foreign servers. As such, U.S. Fourth Amendment protections may not apply.
Amazon handles the information received from Alexa in accordance with its privacy policy. Your interactions with Alexa, including voice recordings, are stored in the cloud. You can review and delete them, but Amazon cautions that deleting them may degrade your Alexa experience. Google similarly explains that deleting your interaction history will limit the personalized features of your Google assistant. Artificially intelligent devices need data from users – the more the better – to learn and adapt. The privacy paradox is that users must therefore agree to sacrifice some degree of privacy to enrich the user experience.