Dystopian or useful? Amazon’s Ring doorbell will now be able to identify visitors through a new AI-powered facial recognition feature, the company announced Tuesday. The controversial feature, called “Familiar Faces,” was first unveiled in early September and is now rolling out to Ring device owners in the United States.
Amazon says the feature lets you identify regular visitors by creating a catalog of up to 50 faces. These may include family members, friends, neighbors, delivery drivers, domestic staff, etc. Label someone in the Ring app, and your device will recognize them when they approach your Ring camera.
Then, instead of alerting you to “Someone’s at your door,” you’ll receive a personalized notification like “Mom’s at your door,” the company said in an announcement.
The company says Ring owners can also use the feature to mute alerts they don’t want to see, such as notifications about a particular person’s arrivals and departures, and these settings can be applied per face.
This feature is not enabled by default. Instead, users must turn it on in the app’s settings.
Faces can be named directly from the event history section of the app or from the new Familiar Faces library. Once labeled, a face will be named in all notifications, app timelines, and event history. Labels can be edited at any time, and there are tools to merge duplicates and remove faces.
Amazon claims facial data is encrypted and never shared with anyone. Additionally, unnamed faces will be automatically deleted after 30 days.

Privacy concerns about AI facial recognition
Despite Amazon’s privacy guarantees, the addition of this feature has raised concerns.
The company has a history of partnering with law enforcement: it previously let police and fire departments request footage from people’s doorbells directly through the Ring Neighbors app. Most recently, Amazon partnered with Flock, a maker of AI-powered surveillance cameras used by police, federal law enforcement, and ICE.
Ring’s own security efforts have historically been inadequate.
Ring paid a $5.8 million fine in 2023 after the U.S. Federal Trade Commission found that Ring employees and contractors had enjoyed extensive, unrestricted access to customer videos for years. The company’s Neighbors app has also exposed users’ home addresses and precise locations, and users’ Ring passwords have circulated on the dark web for years.
Given Amazon’s willingness to work with law enforcement and digital surveillance providers, and its poor security track record, Ring owners would be wise to at least avoid labeling people with their legal names. Better yet, leave the feature off and check who’s at the door yourself. Not everything needs an AI upgrade.
Sen. Ed Markey (D-Massachusetts) has already asked Amazon’s Ring to abandon the feature over privacy concerns, and it has faced pushback from consumer protection groups such as the EFF, which noted that privacy laws prevent Amazon from launching the feature in Illinois, Texas, and Portland, Oregon.
In response to questions from organizations, Amazon said users’ biometric data is processed in the cloud and insisted it does not use the data to train AI models. It also argued that even if law enforcement requested this data, from a technical standpoint it would not be possible to determine every location where a person was detected.
But it’s unclear why that wouldn’t be technically possible, given the feature’s similarities to Search Party, which scans a network of nearby Ring cameras to find lost dogs and cats.
Asked for comment, F. Mario Trujillo, EFF’s chief attorney, said, “No one should have to give up their privacy just by knocking on a door or walking in the door. With this feature activated, it’s more important than ever for state privacy regulators to step in and investigate to protect people’s privacy and test the strength of biometric privacy laws.”
Updated with EFF comments after publication.
