Google will update its advertising policy to ban the promotion of services that help create synthetic content altered or generated to be sexually explicit or to contain nudity.
Google is revising its advertising policy to bar websites and applications that generate deepfake pornography from promoting their services on its platform. The prohibition also covers related activities, such as providing instructions for producing deepfake pornography and endorsing or comparing deepfake pornography services.
Google already prohibits sexually explicit advertisements, but it has not explicitly barred advertisers from promoting services that enable the creation of deepfake porn and other synthetic nude imagery. The updated policy closes that gap. It takes effect on May 30, violating ads will be removed, and enforcement will rely on a combination of human review and automated systems.
The policy shift comes as some applications that facilitate the production of deepfake pornography have skirted the existing rules by marketing themselves as non-sexual tools in Google Ads or on the Google Play Store, while simultaneously promoting themselves on pornographic websites as tools for generating sexually explicit content.
To support the 2024 Indian general election, Google also introduced Shakti, the India Election Fact-Checking Collective, a consortium of Indian news publishers and fact-checkers. The collaboration aimed to proactively identify online misinformation, including deepfakes, and to build a shared repository that news publishers could use to address the widespread challenges misinformation poses during elections.