YouTube declares war on deepfakes with new tool that lets creators report AI-generated video clones


  • YouTube launched a deepfake detection tool to help creators identify AI-generated videos that use their likeness without consent.
  • The tool works like Content ID, allowing verified creators to review flagged videos and request their removal.
  • Initially limited to members of the YouTube Partner Program, the functionality may expand more widely in the future.

YouTube is starting to take illicit deepfakes more seriously, rolling out a new deepfake detection tool designed to help creators identify and remove videos that use AI-generated versions of their likeness without their permission.

YouTube has started emailing some creators, offering them the ability to scan uploaded videos for potential matches with their face or voice. Once a match is flagged, the creator can review it through a new content detection tab in YouTube Studio and decide whether to take action. They can simply report the video, submit a takedown request under YouTube's privacy rules, or file a full copyright complaint.
