Nonconsensual AI nude apps kicked out of the App Store


Creepy AI nude apps are not allowed in the iPhone App Store. Photo: Danielle Reese/Cult of Mac

Applications that advertised the ability to use AI to turn regular images into nudes have reportedly been removed from the iPhone App Store.

Apple had to be notified of their existence first, though.

Apple doesn’t allow nonconsensual AI nude apps in App Store

2024 is turning into a banner year for artificial intelligence. It’s in Google search results and most Microsoft products, Samsung built it into its latest high-end phones, and Apple CEO Tim Cook promised to make some big AI-related announcements soon, likely at WWDC24 in June.

But the company recently got a reminder that the technology has as many negatives as positives. 404 Media reported Friday that:

“Apple has removed a number of AI image generation apps from the App Store after 404 Media found these apps advertised the ability to create nonconsensual nude images, a sign that app store operators are starting to take more action against these types of apps.”

The removed applications apparently allowed users to upload pictures of clothed people, then use AI to turn the images into nudes.

404 Media says it reported three such applications to Apple, and all of them were subsequently kicked out of the App Store.

It’s likely these apps violated the App Store guidelines, which state, “Apps should not include content that is offensive, insensitive, upsetting, intended to disgust, in exceptionally poor taste.” The guidelines specifically call out “overtly sexual or pornographic material.”

It’s unlikely Apple needed a reminder that artificial intelligence has downsides — the problems get at least as much attention as the benefits. Still, it’s something for developers to keep in mind when working on new AI features in iOS 18, macOS 15, etc.
