Apple finally pulls generative AI nude apps from the App Store


by AppleInsider

AppleInsider—Apple has removed apps from the App Store that claimed to make nonconsensual nude imagery, a move that demonstrates Apple is now more willing to tackle the hazardous app category. The capability of generative AI to create images from prompts has become a very useful tool in photography and design. However, the technology has also been misused to create deepfakes -- and nonconsensual pornography. Despite the danger, Apple has been remarkably hands-off...

9to5Mac—Apple pulls AI image apps from the App Store after learning they could generate nude images. Apple is cracking down on a category of AI image generation apps that "advertised the ability to create nonconsensual nude images." According to a new report from 404 Media, Apple has removed multiple such apps from the App Store.

AppleInsider—Apple's new Photos app will use generative AI for image editing. A new teaser on Apple's website hints at some of the company's upcoming software plans, namely a new version of its ubiquitous Photos app that will tap generative AI to deliver Photoshop-grade editing capabilities for the average consumer, AppleInsider has learned. The new Clean Up feature will make removing objects significantly easier. The logo promoting Tuesday's event on Apple's website suddenly turned interactive earlier on Monday, allowing users to erase some or all of...

Tech Times—Apple Cracks Down on AI-Image Generation Apps Promoting Instant Nudity Magic. With nudity now a hot subject in AI image generation, Apple is removing apps that promote the creation of "nonconsensual nude photos" of people.