Apple Removes AI Apps Creating Non-Consensual Nude Images
Apple has removed several AI image-generation apps from the App Store after they were found to advertise the ability to create non-consensual nude images.
As artificial intelligence applications proliferate on mobile platforms, many are marketed on the strength of their image-generation abilities. Some, however, have drawn attention for promoting the creation of explicit content, prompting Apple to enforce its policies against them.
Recently, a wave of AI applications advertised across various online platforms, including Instagram, claiming the ability to create non-consensual nude images. These apps purported to produce a “nude” version of any individual and directed users to their respective App Store pages. In reality, the images they generate are AI-manipulated fabrications rather than genuine photographs.
Apple decided to remove these apps from the App Store following a report by 404 Media, which detailed the apps’ advertising activities on Instagram.
Apple removed three such applications. Notably, Apple independently identified apps that breached its policies, while 404 Media supplied additional details about the apps and their advertisements.
Related advertisements have also been removed from Meta’s platforms. These apps typically do not advertise their ability to generate explicit content on their App Store pages, which makes them difficult to identify; instead, they reach potential users through ads. Apple’s proactive steps could encourage other companies to adopt similar moderation efforts.