Transcripts and voice logs are stored on Amazon servers indefinitely, and even when a user explicitly deletes a particular recording, they can only be certain the files are removed from the company’s “primary storage systems,” Amazon admitted last week in a letter to Senator Chris Coons (D-Delaware), who had written to the company with questions about its data handling and privacy practices.
“We have an ongoing effort to ensure those transcripts do not remain in any of Alexa’s other storage systems” after a user deletes a recording and Amazon removes it from their primary storage systems, the company’s VP of Public Policy Brian Huseman wrote, acknowledging that servers “may still retain other records of customers’ Alexa interactions, including records of actions Alexa took in response to the customer’s request.”
And customers wouldn’t want it any other way, Huseman claimed.
“Customers would not want or expect deletion of the voice recording to delete the underlying data or prevent Alexa from performing the requested task” in cases such as setting recurring alarms, creating date reminders, or sending messages to friends, Huseman asserted.
Developers of third-party “skills” – tasks performed through Alexa like ordering a car, having food delivered, and even intra-Amazon requests like streaming music – “obviously need to keep a record of the transaction,” as well.
“We use the customer data we collect to provide the Alexa service and improve the customer experience, and our customers know that their personal information is safe with us,” Huseman reassured the Senator – despite recent privacy incidents such as the November breach that exposed Amazon customers’ email addresses, and several cases of Alexa sending users’ private conversations to a random contact.
“Training Alexa with voice recordings and transcripts from a diverse range of customers helps ensure Alexa works well for everyone,” Huseman nonetheless maintained.
Alexa owners learned earlier this year that Amazon employees are listening to them even before they speak the “wake word” that triggers recording. The company has since tried to sell this daunting revelation as a feature, unveiling its “Alexa Guard” service, which listens for breaking glass, smoke alarms, and other sounds of distress to “protect” the user – no wake word (or consent) required.
The company also received a patent in May to constantly record and store audio – supposedly to allow users to address the AI more “naturally,” for example saying “Play this song, Alexa” instead of “Alexa, play this song.” Another patent, uncovered last month, continued the “surveillance as a service” trend by offering customers the opportunity to pay for the delivery drones it insists are right around the corner to watch out for any suspicious activity in their yards, such as broken windows, prowlers, or fires.
Amazon admitted in a white paper published last year that it doesn’t delete data obtained through Alexa until the “machine learning” processes the data is used for are complete. Because Alexa is constantly evolving – “designed to get smarter every day,” as Huseman put it – it’s unclear how long that process, and the retention of customers’ data, could stretch.