Apple has cracked down on AI-powered apps on its App Store that generate nude images. According to a report by 404 Media, the company took action against several apps that promoted the creation of non-consensual nude images using artificial intelligence (AI); advertisements for these apps were discovered on Instagram.
According to 404 Media, Apple only addressed the issue after the publication shared links to the apps and their corresponding ads, suggesting the tech giant was unable to identify these policy-violating apps without external assistance.
The report found five such ads in Meta’s Ad Library, an archive of all ads on the platform. Two promoted web-based services with similar functionality, while the remaining three led to apps available on the Apple App Store. Some of these apps offered face swaps onto explicit images, while others were marketed as ‘undressing’ apps that used AI to digitally remove clothing from ordinary photos.
The report further notes that while Meta promptly removed the advertisements, Apple initially declined to comment and instead requested additional information about them. This came after the story was first published last week.
This is not the first time Apple has been alerted to AI-powered deepfake applications on the App Store. In 2022, numerous similar apps were discovered on both the Google Play Store and the Apple App Store, yet neither technology giant removed them. Instead, they merely urged the developers of these apps to stop advertising such functionality on well-known adult websites.