“Stop it!” Moms and dads may have to repeat that instruction to their kids, but when parents said it to Amazon in an effort to get the company to delete children’s voice data obtained through its Alexa voice assistant, Amazon should have honored those requests immediately. But according to a complaint filed by the Department of Justice on the FTC’s behalf, Amazon responded by deleting files in some databases while maintaining them elsewhere – meaning the information was available for Amazon to use for its own purposes. The lawsuit alleges Amazon violated the Children’s Online Privacy Protection Act Rule by flouting parents’ deletion requests, retaining kids’ voice recordings indefinitely, and not giving parents the straight story about its data deletion practices. Amazon also allegedly violated the FTC Act by falsely representing that Alexa app users could delete their geolocation information and voice recordings and by engaging in unfair privacy practices related to deletion, retention, and employee access to data. The $25 million settlement with Amazon sends a clear message about the consequences of putting profits over privacy.
In addition to the massive amount of other information Amazon collects about customers, the company has found a direct route into millions of consumers’ homes through its Alexa-powered devices, which respond to users’ spoken requests. First, some background about how consumers of all ages interact with the technology. When an Alexa device detects someone saying the “wake” word, Alexa begins recording what it hears in two formats: an audio file and a text transcript. Alexa then responds by performing the requested task.
Given Alexa’s access to so much highly personal data, privacy is an important consideration for many Alexa users. Not surprisingly, Amazon made privacy a centerpiece of its marketing. For example, on Amazon.com’s Alexa Privacy Hub, the company claimed “Alexa and Echo devices are designed to protect your privacy” and “Alexa and Echo devices provide transparency and control.”
Users can also interact with Alexa through the Alexa app, which can collect their geolocation information. Amazon repeatedly assured Alexa app users that they could delete their geolocation information. But even when consumers clicked on the appropriate delete buttons, the FTC says Amazon deleted the data from certain locations but retained it elsewhere, in direct contravention of its privacy promise. Amazon first discovered the problem in early 2018, but the FTC says it wasn’t until September 2019 that Amazon finally took some corrective action. “Some” being the operative word here because, as the complaint alleges, due to faulty fixes and process fiascos, it wasn’t until early 2022 that Amazon finally got the problem fully under control.
Amazon made similar promises that Alexa users could “view, hear, and delete [their] voice recordings . . . at any time.” But as the complaint explains, once again Amazon deleted the files in certain locations, but retained transcripts of the recordings elsewhere in a form the company could use for product development. Adding insult to privacy injury, for at least a year, Amazon gave 30,000 employees access to Alexa users’ voice recordings – even though many of those staffers had no business need for the files.
Now to how the FTC says Amazon’s misrepresentations and compliance failures resulted in violations of COPPA. Given Alexa’s presence in consumers’ homes, and Amazon’s sale of child-centric products like Echo Dot Kids Edition, FreeTime on Alexa, and FreeTime Unlimited on Alexa, many users of the technology are under 13. In fact, more than 800,000 children interact directly with Alexa using their own Amazon profile, which links to their parent’s profile and contains the child’s name, birth date, and gender. As with adults, Amazon saved children’s voice recordings as audio and text files and used persistent identifiers to connect those files to the child’s Amazon profile.
The complaint alleges three ways in which Amazon’s practices usurped parents’ rights under COPPA to maintain control over their kids’ personal data. First, Amazon programmed Alexa to keep kids’ recordings forever, a practice that violated Section 312.10 of the COPPA Rule. Under that provision, companies may keep kids’ data “for only as long as is reasonably necessary to fulfill the purpose for which the information was collected.” After that, according to the plain language of the Rule, they “must” securely delete it.
Second, Section 312.6 of the COPPA Rule gives parents “the opportunity at any time . . . to direct the operator to delete the child’s personal information.” But according to the complaint, children’s voice recordings were subject to the same ineffective deletion procedures as adults’ recordings. So even when parents told Amazon to delete those files, Amazon retained transcripts stored with persistent identifiers linked to the child’s account.
Another fundamental COPPA privacy protection is Section 312.4’s requirement that companies “provide notice and obtain verifiable parental consent prior to collecting, using, or disclosing personal information from children.” By failing to follow through on its stated practices, Amazon didn’t give parents complete and truthful notice about their ability to have their children’s personal information deleted – falling short of what COPPA requires.
In addition to imposing a $25 million civil penalty, the proposed settlement prohibits Amazon from making misrepresentations about geolocation and voice information. The company also can’t use geolocation information, voice information, or children’s information for the creation or improvement of any data product after customers have directed Amazon to delete that data. What’s more, Amazon must delete kids’ inactive Alexa accounts, notify users about the FTC-DOJ lawsuit, and implement a privacy program specifically focused on the company’s use of geolocation information.
Here are some pointers other businesses can take from the action against Amazon.
COPPA compliance demands continuous vigilance. For companies covered by COPPA, compliance isn’t a one-and-done, check-the-box set of technicalities. Rather, the Rule creates substantive rights for parents and mandates ongoing legal responsibilities for businesses. Just one example is the requirement that companies clearly explain to parents up front their right to have their kids’ information deleted and the accompanying obligation to honor those requests. But even if parents don’t exercise their deletion rights, companies can’t hold on to the data “just because.” Once the purpose for which it was collected has passed, companies must securely delete it. And whether you’re honoring a specific parental request or meeting COPPA’s general data deletion requirement, make sure you also delete relevant files stored in back-ups, separate databases, or other locations.
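For the engineering teams who have to make that last pointer real, one practical safeguard is to route every deletion request through a single routine that knows about every place the data lives. Here’s a minimal sketch of that idea in Python, using hypothetical in-memory stores (primary_db, transcript_store, backup_store) as stand-ins for real databases, transcript archives, and back-ups; it isn’t drawn from Amazon’s systems or required by the proposed order.

```python
# Minimal sketch: honor a deletion request in every store that holds the data.
# Store names and contents are hypothetical stand-ins for real systems.
from typing import Dict, List

primary_db: Dict[str, dict] = {"child-123": {"audio_files": ["rec1.wav"], "geo": "..."}}
transcript_store: Dict[str, List[str]] = {"child-123": ["turn on the light"]}
backup_store: Dict[str, dict] = {"child-123": {"audio_files": ["rec1.wav"]}}

# Register every location that can hold personal data, so a deletion request
# can't succeed in one place while the data quietly survives somewhere else.
ALL_STORES = [primary_db, transcript_store, backup_store]

def delete_user_data(user_id: str) -> bool:
    """Remove user_id's records from every registered store and verify the result."""
    for store in ALL_STORES:
        store.pop(user_id, None)
    # Report success only if no registered store still holds the data.
    return all(user_id not in store for store in ALL_STORES)

if __name__ == "__main__":
    if delete_user_data("child-123"):
        print("Deletion confirmed in all registered stores.")
```

The detail that matters is the single registry of stores and the verification step: a deletion path that only knows about the primary database is exactly how transcripts and back-up copies end up retained after someone clicks delete.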
Voice recordings are biometric data deserving of scrupulous care. As the FTC’s May 2023 Policy Statement on Biometric Information establishes, voice recordings – and transcripts of recordings – fall within the category of data derived from biometric sources that “raise significant consumer privacy and data security concerns.” Now factor in biometric data about kids and those concerns are elevated to the nth degree.
Don’t secretly use customers’ personal information – especially data about kids – to feed your algorithms. Through this law enforcement action and the proposed settlement with Ring, the FTC is sending an unmistakable message to companies developing artificial intelligence: You can’t placate consumers with promises of privacy when you intend to use their data for other purposes. This principle takes on added significance in the context of kids. Children’s speech patterns are markedly different from adults’, so Alexa’s voice recordings gave Amazon a valuable data set for training the Alexa algorithm and furthering Amazon’s commercial interest in developing new products. That’s just one reason why Amazon’s hollow promise to honor parents’ deletion requests was particularly troubling.