This week, I browsed through a record of everything I have said to Alexa, and it felt a bit like reading an old diary. Then I remembered that the things I have told Alexa privately are saved on an Amazon server and may well have been read by an Amazon worker. It is all to make Alexa better, the company keeps saying. But to many people, it is not immediately evident how humans reviewing your apparently private voice commands is anything other than surveillance. Alexa, these people say, is a spy hiding in a wiretapping device.

The debate over whether Alexa or any other voice assistant is spying on us is years old at this point, and it is not going away. Privacy advocates have filed a complaint with the Federal Trade Commission (FTC) alleging these devices violate the Federal Wiretap Act. Journalists have investigated the dangers of always-on microphones and artificially intelligent voice assistants. Skeptical tech bloggers like me have argued that these devices are more powerful than people realize and riddled with privacy problems. Recent news reports about how Amazon employees review certain Alexa commands suggest the situation is worse than we thought.
It is starting to feel like Alexa and other voice assistants are bound to spy on us because that is the way the system was designed to work. These programs rely on machine learning and artificial intelligence to improve themselves over time. The technology underpinning them is still prone to error, and even if it were perfect, the data-hungry companies that built these assistants are constantly thinking up new ways to exploit users for profit. And where imperfect technology and powerful companies collide, the government will struggle so much just to know what is going on that regulation sounds like an impossible answer.
The situation is not completely dire. This technology could be very cool, if we pay closer attention to what is happening. That, again, is pretty damn complicated.
Never-ending Errors
One basic problem with Alexa and other voice assistants is that the technology is prone to failure. Devices like the Echo come equipped with always-on microphones that are only supposed to record when you want them to listen. While some devices require the push of a physical button to summon Alexa, many are designed to start recording you once you have said the wake word. Anyone who has spent time using Alexa knows it does not always work like this. Sometimes the software hears a random sound, thinks it is the wake word, and starts recording.
The degree to which false positives are a problem became glaringly clear the moment I started reading through my record of Alexa commands on Amazon's website. Most of the entries are boring: "Hey Alexa;" "Show me an omelet recipe;" "What's up?" But sprinkled among the mundane drivel was a daunting series of messages that read, "Text not available - audio was not intended for Alexa." Each time I saw one, I did a double take and read it again in my head: "Audio was not intended for Alexa." These are the things Alexa heard that it should not have heard, commands that were sent to Amazon's servers and then flagged because the machine decided the wake word had not actually been said, or that Alexa had recorded audio when the user was not giving a command. In other words, they are mistakes.

At face value, voice assistants picking up stray audio is an inevitable flaw of the technology. The very complex software that can understand what you say hides behind a much simpler one that has been trained to listen for a wake word and then send whatever commands come after it to the smarter computer. The dilemma is that the simple computer frequently does not do its job correctly, and people do not always know there is a recording device in the room. That is how we get Echo-based nightmares like the Oregon couple who inadvertently sent a recording of an entire conversation to an acquaintance. Amazon itself has been working on improvements to reduce the error rate with wake words, but it is hard to imagine the system will ever be flawless.
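To make the two-stage design concrete, here is a minimal sketch in Python. Everything in it (the function names, the score threshold, the string matching) is a hypothetical illustration, not Amazon's actual implementation; real wake-word detection uses trained acoustic models, not text comparison.

```python
# A toy model of the two-stage pipeline: a cheap, always-on local
# detector in front of a smarter server-side recognizer. All names
# and thresholds here are illustrative assumptions.

WAKE_WORD = "alexa"
WAKE_SCORE_THRESHOLD = 0.7  # below this, the device stays silent


def local_detector(wake_score):
    # The "simple computer": fires whenever the on-device model's
    # wake-word score crosses a threshold, even if the trigger was
    # just background noise that happened to sound similar.
    return wake_score >= WAKE_SCORE_THRESHOLD


def cloud_recognizer(audio_transcript):
    # The "smarter computer": re-checks server-side whether the wake
    # word was really spoken before parsing the command.
    if WAKE_WORD in audio_transcript.lower():
        return "Recognized command: " + audio_transcript
    return "Text not available - audio was not intended for Alexa"


def handle_audio(audio_transcript, wake_score):
    if not local_detector(wake_score):
        return None  # nothing leaves the room
    # By the time the cloud rejects a false positive, the audio has
    # already been uploaded -- which is the privacy problem in a nutshell.
    return cloud_recognizer(audio_transcript)


print(handle_audio("Alexa, show me an omelet recipe", 0.95))
print(handle_audio("random background chatter", 0.85))
```

The key point the sketch makes is architectural: the "audio was not intended for Alexa" entries exist because the false positive is only caught after the recording has already left the house.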
“That’s the scary thing: there is a microphone in your house, and you do not have ultimate control over when it gets triggered,” Dr. Jeremy Gillula, tech projects director at the Electronic Frontier Foundation (EFF), told me. “From my perspective, that’s problematic from a privacy perspective.”
This sort of thing happening is bad luck, though it is more common than most people would like. What is perhaps worse than the glitches is the very intentional behind-the-scenes workflow that exposes users' interactions with voice assistants to strangers. Bloomberg recently reported that a team of Amazon employees has access to Alexa users' geographic coordinates, and that this information is collected to improve the voice assistant's abilities. This revelation came just a couple of weeks after Bloomberg reported that thousands of people employed by Amazon around the world review users' Alexa commands to train the software. They can overhear compromising situations, and in some cases, the Amazon workers make fun of what people say.
Amazon pushed back hard against those reports. A company spokesperson told me that Amazon annotates only "an extremely small number of interactions from a random set of customers in order to improve the customer experience." These recordings are kept in a secured system that uses multi-factor authentication so that only "a limited number" of carefully monitored workers can access them. Bloomberg, again, suggests that the team numbers in the thousands.
But for Alexa and other artificially intelligent voice assistants to work, some human review is essential. This practice can prevent future errors and lead to better features. Amazon is not the only company using humans to review voice commands, either. Google and Apple also employ teams of people to review what users say to their voice assistants, both to train the software to understand people better and to build new capabilities. Sure, the human element of these seemingly computer-based services is creepy, but it is also a crucial part of how these technologies are developed.
“In the end, for really hard cases, you need a human to tell you what was going on,” Dr. Alex Rudnicky, a computer scientist at Carnegie Mellon University, said in an interview. Rudnicky has been developing speech recognition software since the 1980s and has led teams competing in the Alexa Prize, an Amazon-sponsored competition for conversational artificial intelligence. While he maintains that humans are necessary for improving natural speech processing, Rudnicky also believes it is incredibly improbable for a voice command to be traced back to one individual.
“Once you are one out of 10 million,” Rudnicky said, “it is kind of hard to argue that somebody is going to locate it and follow it back to you and find out things about you that you do not want them to know.”
This does not make the notion of a stranger reading your daily thoughts or knowing your location history feel any less creepy, however. It may be rare for a voice assistant to record me accidentally, but the systems do not yet seem smart enough to wake up with 100 percent accuracy. The fact that Amazon catalogs and makes available all the Alexa recordings it captures, accidental or otherwise, makes me feel terrible.
The Privacy Problem Nobody Wants to Fix
In recent conversations, half a dozen privacy and technology experts told me that we need stronger privacy legislation to tackle some of these problems with Alexa. The amount of private data an Echo gathers is bound only by the terms Amazon sets, and the United States lacks strong national privacy laws like Europe's General Data Protection Regulation (GDPR). To put it differently, the companies building voice assistants are more or less making the rules.
I find myself circling back to a few questions. Who is looking out for the consumers? Why can't I opt in to letting Amazon record my commands rather than wading through privacy settings looking for ways to stop sending my data to Amazon? And why are my options for opting out so limited?
In the Alexa privacy settings, you can opt out of letting Amazon use your recordings to develop new features and improve transcriptions. You cannot opt out of letting Amazon retain your recordings for other purposes.

Despite the toggles being off, Amazon is still keeping my Alexa recordings.
Settings like these put the onus on the consumer to protect their own privacy. If that must be the case, why can't these companies make my interactions with voice assistants completely anonymous?
Apple appears to be attempting to do just that. When you talk to Siri, your commands are encrypted before they are sent to the company, with a random Siri identifier attached. That identifier is not associated with your Apple ID, so there is no way to open your privacy settings on an iPhone and see what you have been saying to Siri. Not all Siri functionality requires your device to send data to Apple's servers, either, which cuts down on exposure. Apple does use recordings of Siri commands to train the software, since you have to train artificially intelligent software to make it better. The fact that Apple is not linking particular commands to a particular user may explain why so many people think Siri is awful. Then again, Siri might be your best bet for some semblance of privacy in a voice assistant.
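Apple has not published its implementation, but the random-identifier scheme described above can be sketched roughly like this. The class and field names are hypothetical, and the real protocol also encrypts the payload in transit; the point is simply that what leaves the device carries a random identifier instead of the account.

```python
# Rough sketch of decoupling voice commands from an account identity.
# All names here are hypothetical illustrations of the idea, not
# Apple's actual protocol.
import uuid


class VoiceAssistantClient:
    def __init__(self, account_id):
        self.account_id = account_id        # stays on the device
        # Random identifier, not derived from the account in any way.
        self.speaker_id = uuid.uuid4().hex

    def build_payload(self, command):
        # What leaves the device: the command plus the random identifier.
        # The account ID is deliberately absent, so server-side logs
        # cannot be walked back to a person.
        return {"speaker": self.speaker_id, "command": command}


client = VoiceAssistantClient(account_id="user@example.com")
payload = client.build_payload("what's the weather")
print(payload)
```

The trade-off the article describes falls out of this design: the server can still use the recordings for training, but it cannot show you (or anyone else) a per-account history, because no account is attached to the data.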
This is the point in the debate where Tim Cook would like to remind you that Apple is not a data company. Companies like Google and Amazon turn your private data into products they can sell to advertisers or use to sell you more stuff, he would say. This is the same argument we saw from the Apple CEO when he wrote a Time magazine column earlier this year and announced plans to push for national privacy legislation.
The idea is starting to get some traction. Back in January, the Government Accountability Office published a report calling for Congress to pass comprehensive internet privacy laws. The report joined a chorus of privacy advocates who have long argued that the United States needs its own version of the GDPR. In March, the Senate Judiciary Committee heard testimony from several people who pushed for national privacy legislation. It is far from clear whether Congress will act on the idea, however.
“Speech technology is getting so good, it is important to worry about privacy,” said Dr. Mari Ostendorf, an electrical engineering professor and speech technology expert at the University of Washington. “And I think companies are probably more worried about it than the U.S. government is.”
One would hope that Amazon is rethinking its approach to voice assistants and privacy. Because right now, it feels like the public is only just unraveling the myriad ways that devices like the Echo are recording our lives without our consent and sharing our personal data with strangers. The latest controversy over Alexa just scratches the surface of how a world full of always-on microphones is a complete privacy nightmare.
The problem is that companies like these, with data-driven business models, have every incentive to collect as much information about their customers as possible. Every time you use Alexa, for example, Amazon gets a sharper view of your interests and behavior. When I asked for specifics on how Amazon uses this data, the company gave me a bizarre example.
“If a customer uses Alexa to make a purchase or interact with other Amazon services, such as Amazon Music,” an Amazon spokesperson said, “we may use the fact that the customer took that action in the same way we would if the customer took that action through our website or one of our apps, for example, to give product recommendations.”
There is evidence that these kinds of recommendations could become more sophisticated in the future. Amazon has patented technology that can interpret your emotions based on the tone and volume of your voice. According to the patent, this hypothetical version of an Alexa-like technology could tell if you are happy or sad and serve "highly targeted audio content, such as audio advertisements or promotions." One could argue that the only thing holding Amazon back from releasing an ad-supported Alexa is the possibility of blowback from the Echo-owning public. The government probably is not going to prevent it.
The Frightening Future
A future without more oversight could get very Philip K. Dickian, very quickly. I recently spoke with Dr. Norman Sadeh, a computer science professor at Carnegie Mellon, who painted a grim picture of what a future without better privacy regulation could look like.
“At the end of the day, all of these speakers link back to a single entity,” Sadeh explained. “So Amazon could use voice recognition to identify you personally, and as a result, it could potentially build extremely extensive profiles about who you are, what you do, what your habits are, all kinds of different attributes that you would not necessarily wish to disclose to them.”
He suggests Amazon could make a business out of this: knowing who you are and what you like by the mere sound of your voice. And unlike the many dystopian notions of what facial recognition could enable, voice recognition could work without ever seeing you. It could work over phone lines. In a future where internet-connected microphones are present in an ever-increasing number of rooms, a system like this could always be listening. Several of the researchers I spoke to brought up this dystopian idea and lamented its seemingly imminent arrival.
Such a system is so far hypothetical, but if you think about it, all the pieces are in place. There are tens of millions of devices equipped with always-on microphones all around the country, in homes as well as public places. They are permitted to listen to and record what we say at certain times. These artificially intelligent machines are also prone to errors and will only get better by listening to people and occasionally letting humans correct their behavior. Without any government oversight, who knows how the system will grow from here.
We wanted a much brighter future than this, didn't we? Talking to your computer seemed like a very cool thing in the '90s, and it was certainly a significant part of the Jetsons' lifestyle. But so far, it seems to be an unavoidable truth that Alexa and other voice assistants are bound to spy on us, whether we like it or not. In a way, the technology is designed such that it cannot be avoided, and in the future, without oversight, it will probably get worse.
Maybe it is foolish to believe that Amazon and the other companies building voice assistants really care about privacy. Then again, maybe they are working on fixing the problems caused by error-prone tech, and maybe they are working on easing the anxiety people feel when they learn that devices like the Echo are recording them, sometimes without the users realizing it. Heck, perhaps Congress is working on laws that would hold these companies accountable.
Certainly, the future of voice-powered computers does not have to be so dystopian. Talking to our gadgets could change the way we interact with technology in profound ways, if everybody were on board with how it was being done. Right now, that does not seem to be the case. And, ironically, the fewer people we have helping to develop technology like Alexa, the worse Alexa is likely to be.