Ring’s “Familiar Faces” feature promises smarter alerts, but critics warn it expands home surveillance

Amazon’s Ring video doorbells are getting a major artificial intelligence upgrade, and it is already proving controversial.
The company has begun rolling out a new feature called Familiar Faces to Ring users across the United States. Once enabled, the tool uses AI-powered facial recognition to identify people who regularly appear on a user’s doorstep. Instead of a generic alert such as “Person detected,” users may receive notifications like “Mom at Front Door.”
Amazon says the feature is designed to reduce alert fatigue and make home monitoring more useful. Privacy advocates argue it crosses a line.
How Familiar Faces Works
Familiar Faces allows Ring owners to create a catalog of up to 50 identifiable faces, which may include family members, friends, neighbors, delivery drivers, household staff, or other frequent visitors. Once a user labels a face in the Ring app, the system attempts to recognize that individual during future visits.
Critically, anyone who regularly passes in front of a Ring camera can be labeled by the device owner, even if that person is unaware their face is being identified.
Users can manage alerts on a per-face basis, rename or merge entries, or delete faces entirely. The feature is not enabled by default and must be manually activated in the app.
Amazon says unlabeled faces are automatically deleted after 30 days, but once a face is named, it remains stored until the user removes it.
Why Privacy Advocates Are Alarmed
Consumer protection groups and lawmakers argue the feature introduces serious risks, especially given Ring’s history.
Ring has long maintained close ties with law enforcement. Until recently, police departments could request footage directly from users through the Ring Neighbors app. Amazon has also partnered with Flock, a company whose AI-powered surveillance cameras are widely used by police and federal agencies.
Ring’s internal security record has also drawn scrutiny. In 2023, the US Federal Trade Commission fined Ring $5.8 million, finding that employees and contractors had unrestricted access to customer videos for years. Past issues have also included exposed home locations in the Neighbors app and compromised Ring account credentials appearing online.
Critics argue that adding facial recognition amplifies these risks rather than reducing them.
“When you step in front of one of these cameras, your faceprint is taken and stored on Amazon’s servers, whether you consent or not,” said Mario Trujillo, a staff attorney at the Electronic Frontier Foundation (EFF). “Today’s feature to recognize your friend at your front door can easily be repurposed tomorrow for mass surveillance.”
The EFF has urged state regulators to investigate.
Where the Feature Is Blocked and Why It Matters
Legal pressure is already shaping where Familiar Faces can operate. According to the EFF, the feature is not available in Illinois, Texas, or Portland, Oregon, all jurisdictions with stricter biometric privacy laws.
US Senator Ed Markey has called on Amazon to abandon the feature altogether, citing concerns over biometric data misuse and surveillance creep.
Amazon says facial data is processed in the cloud and not used to train AI models, and that it cannot search for or track where a specific face appears, even at law enforcement's request. Critics note, however, that Ring's Search Party feature already scans footage across neighborhoods to help locate lost pets, suggesting that the underlying cross-camera search capability exists.
Amazon did not respond to requests for comment before publication.
A Different AI Approach: Video Descriptions
Not all of Ring’s AI updates raise the same level of concern.
The company recently introduced Video Descriptions, a generative AI feature that summarizes motion events in plain language, for example: “A person is walking up the steps with a black dog.”
Unlike Familiar Faces, this system focuses on actions rather than identities, helping users quickly assess whether an alert is routine or urgent without naming or tracking individuals. The feature is currently rolling out in beta to Ring Home Premium subscribers in the US and Canada.
That distinction matters.
Should You Enable Familiar Faces?
Privacy experts advise caution.
While Familiar Faces may reduce notification noise, it also creates a detailed log of who comes to a home and when. Given Ring’s history and its relationships with law enforcement, many advocates recommend leaving the feature disabled.
For users who choose to enable it, experts suggest avoiding full names, regularly deleting old face data, and remembering that checking live video is often safer than relying on AI labels.
Not every smart home feature needs to know who someone is.
The Bigger Picture
Amazon’s latest Ring update highlights a growing tension in consumer AI: convenience versus consent.
Features like Video Descriptions show how AI can improve usability without identifying people. Familiar Faces, by contrast, pushes facial recognition deeper into private spaces, where regulation is light and trust is fragile.
As smart home technology advances, the real question is no longer what AI can do, but what it should be allowed to do.

