- Apple has added more child protection features to FaceTime in iOS 26
- The latest one blurs video when it detects that nudity is present
- It currently affects adult accounts too, but that could be a bug
Apple has been adding parental control features designed to protect minors for years now, and it seems that a new one has just been found in the iOS 26 beta. However, this one is proving quite controversial, as some fear it represents overreach on Apple's part.
More specifically, the new feature was added to FaceTime video calls. When FaceTime detects that someone is undressing on a call, it pauses the call and displays a warning message in its place, along with buttons labeled “Resume Audio and Video” and “End Call”.
At WWDC 2025 in June, Apple published a press release covering new ways its systems will protect children and teens online. The release included a feature that matches the new FaceTime behavior: “Communication Safety is expanding to intervene when nudity is detected in FaceTime video calls, and to blur out nudity in Shared Albums in Photos.”
The actual implementation was spotted by iDeviceHelp on X. Under that post, @User_101524 added that the feature can be found in the Settings app in iOS 26 by going to Apps > FaceTime > Sensitive Content Warning.
The feature is turned off by default, so it must be enabled by the user, but that hasn't stopped it from sparking debate online…
Generating controversy
Although this new feature may seem sensible, it has in fact generated a certain degree of controversy. At the moment, it seems to affect all iOS 26 users, not just those using a child account. This has ruffled a few feathers among people who feel it could amount to censoring the behavior of consenting adults.
In addition, some users have questioned how Apple knows what is displayed on screen, and whether the company has access to customers' video calls. On this point, Apple has said the following:
“Communication Safety uses on-device machine learning to analyze photo and video attachments and determine whether a photo or video appears to contain nudity. Because photos and videos are analyzed on your child's device, Apple does not receive any indication that nudity was detected and does not get access to the photos or videos.”
As with many Apple features, on-device processing means the content is never sent to Apple's servers and is not accessible to the company. Instead, the device uses artificial intelligence (AI) to flag video content that likely contains nudity, then censors it.
The fact that Apple's Communication Safety features are aimed at protecting minors suggests that this latest FaceTime feature may not be meant to cover adults as well as children. Its appearance on all accounts could therefore be an oversight or a bug. Although we don't know for certain, we should find out by September, when iOS 26 leaves beta and is released to the public.