Apple Pulls App Containing Child Porn

Apple has pulled an application called Beauty Meter from the App Store. The app's premise is simple: you rate pictures of girls based on their faces, bodies, and clothes. The pictures are user-generated; in other words, anyone with the app can snap a photo and upload it for other people to rate. Most of the images show women in their underwear or sexy clothes and, according to KRAPPS, every picture is approved by the developer. However, the developer also approved images of women not wearing clothes at all, despite Apple's stance on "inappropriate content."

Everything hit the fan when KRAPPS came across a number of pictures of underage girls. Most of them appeared to be sixteen and partially clothed, with none of the, uh, important parts on show. Not good. But those pictures are easily overlooked when you consider that KRAPPS also found an image of a topless fifteen-year-old girl posing for the camera and pulling off her knickers.

This is the part where everyone speculates as to how these girls (underage or not) were able to have their photos approved in the first place. Apple probably approved the app before it contained any nudity. Indeed, a follow-up article from KRAPPS shows that the application contained zero nudity for a time, which raises the question of whether the developer pulled all the naked pictures while resubmitting a newer version for approval. But however you spin it with regard to women over eighteen submitting nude photos, it still doesn't explain how young girls were able to have their images posted, complete with captions detailing their age and location, if the developers were approving every single picture.

The application has been pulled, but reports say the images (including those of underage girls) remain available to anyone who already has the app installed. Apple has yet to comment.