By Aaron Tilley
Stacy Ann Sipes thought she had her daughter's phone locked down.
When her daughter got her first iPhone at age 14, Sipes used Apple's parental-control settings to block Instagram and Snapchat.
After a terrifying ordeal in which her daughter developed an eating disorder, Sipes discovered a gaming app on her daughter's phone in which the main character was focused on losing weight. Players could send the character to the gym or purchase a waist-training corset using digital coins. It was rated in the App Store for ages 12 and above.
"I turned to Apple's safety features, but these weren't keeping her safe, " Sipes said. "The way Apple rates these apps doesn't protect kids."
New research provided to The Wall Street Journal shows that her experience isn't unusual. Over a period of 24 hours, researchers found about 200 apps with inappropriate content that were rated for children. Those included apps for dieting, circumventing banned sites, beauty filters, violent or sexual games, and anonymous chat, according to a report by the child safety advocacy nonprofits Heat Initiative and ParentsTogether Action.
Those apps amounted to about a quarter of the total reviewed by the groups over the 24-hour period. The findings suggest that many apps containing content inappropriate for children are rated as safe for them.
While Apple leaves it to app developers to determine the appropriate age rating for their apps, the company said it reviews and regularly rejects apps for including content and features that don't align with the rating. Apple guidelines for different age categories -- including 4 and above, 9 and above, and 12 and above -- restrict apps from featuring violence, profanity and other mature or suggestive content.
The company said it has rejected more than 100,000 app submissions in the past five years for violating guidelines related to age ratings. "At Apple, we work hard to protect user privacy and security and provide a safe experience for children," a spokesman said, adding that it gives parents a range of capabilities to protect children, including restricting access to apps and flagging problematic content.
Since its launch in 2023, the Heat Initiative has focused its child-safety advocacy on Apple. The group has taken out TV ads accusing Apple of not taking down child-abuse material on its cloud backup service. The group recently backed a lawsuit against Apple for allegedly hosting child sexual abuse material on iCloud.
Heat Initiative's biggest backers are the Children's Investment Fund Foundation and the Oak Foundation, two philanthropic organizations focusing on childhood-health and social-justice issues. Sipes said she joined ParentsTogether as a member after her experience.
The new research from the groups found that an app called FaceMax: Face Rating, Looksmax, rated 4 and above, takes a user's photo and claims to rate the attractiveness level using artificial intelligence. Body Tune -- Photo Editor, rated for ages 9 and up, lets users alter their bodies' appearance with a focus on thinness: "make your body slim and skinny." Apps that facilitate chats with strangers are widely available for ages 12 and above, including Spin the Bottle: maybe you?, which advertises to users: "Sit at any table to meet new people, flirt and chat."
FaceMax and Body Tune didn't respond to requests for comment sent to email addresses listed for the app developers. FaceMax could no longer be found in the App Store after the Journal reached out for comment. The developer of Spin the Bottle: maybe you?, Cyboma, said it would change the app's age rating to 17+; the app is no longer available in the App Store.
The App Store is big business for Apple. The company takes a cut of transactions on many third-party apps -- up to 30%, reduced to 15% for smaller developers and for subscriptions after their first year. In the company's 2023 fiscal year, revenue from the App Store was $26 billion, with an operating margin as high as 80%, estimated Martin Yang, an analyst at the investment firm Oppenheimer.
Apple has said it needs to take such large cuts because of the investments it makes in keeping the App Store safe, and it calls the store "a safe place for kids." The company employs an App Store review team of several hundred reviewers who work long hours and struggle to keep up with the large number of apps coming in every day, according to former employees of the team. Apple expects the team to move quickly through a set number of apps a day, making it difficult to spot potential problems, the former employees said.
An increasing number of digital-safety advocates and parents are arguing that Apple's app-review process, including how it handles age ratings, is inadequate.
"Apple has touted the App Store as a Fort Knox, and yet we have these gross inconsistencies," said Chris McKenna, founder of the advocacy group Protect Young Eyes, who wasn't involved in the report. McKenna has directly advised Apple on digital-safety issues for children.
While Google has a similar digital store for Android apps, Apple has an outsize role in the U.S. Some 87% of American teens own an iPhone, according to 2023 research from the investment firm Piper Sandler.
Parents have complained that the parental controls on Apple devices are inadequate. The Screen Time feature, which lets parents set time limits and restrict apps on their children's iPhones, has at times been unusable, and restrictions have been removed at random, the Journal has reported. Loopholes in the parental-control features have also let children sidestep restrictions on the Safari browser and access illicit content, the Journal has reported. Apple responded that it was fixing both problems.
The report from the Heat Initiative and ParentsTogether recommends that Apple apply an independent expert review and verification process to App Store age ratings, similar to how movies and videogames are rated for age appropriateness. That approach would give parents who rely on the ratings more confidence in their accuracy, advocates said.
The research isn't the first time problems with child safety in the App Store have been alleged. The National Center on Sexual Exploitation, a nonprofit that has pushed for stronger child-safety protections online and sought to restrict access to pornography, found several so-called nudifying apps that use artificial intelligence to create pornographic images of people. Some were rated appropriate for ages 4 and above in Apple's App Store, the organization said earlier this year. In 2022, the Canadian Center for Child Protection published a report detailing how both the Apple and Google app stores fail to enforce app age ratings for younger users.
While most of the conversation about digital safety for children has centered on large apps such as Instagram and TikTok, Apple, with its enormously profitable App Store, is an emerging target for digital-safety advocates and lawmakers.
Last month, Sen. Mike Lee of Utah, a Republican, introduced the App Store Accountability Act, which would require the app stores owned by Apple and Google to verify users' ages and prevent them from downloading apps rated above their age. Rep. John James of Michigan, also a Republican, later introduced the bill in the House.
"This is just holding the App Store to the same standards that we hold liquor stores," said James, comparing the proposed law to how the government requires stores selling alcohol to check the ages of buyers. He added: "We're protecting our children from adult and addictive materials in the physical world, and now we must protect them in the online world."
Apple has said the comparison to a liquor store is misplaced. A more apt analogy, the company has said, is that Apple is the mall and the social-media platforms are the liquor store.
After discovering the eating disorder, Sipes further limited what her daughter could access, restricting downloads to apps rated for ages 4 and above.
She let her daughter download puzzle games rated appropriate for 4-year-olds, but ads and content within the apps continued to surface weight-loss material, including promotion of intermittent fasting and eating only ice for a day, Sipes said.
"There is this blind trust in Apple that they are doing everything they can to ensure children's safety, that they're not only looking at their bottom line," said Sipes, who works in education in Michigan. "But Apple is making a lot of money off these apps."
Write to Aaron Tilley at aaron.tilley@wsj.com
(END) Dow Jones Newswires
December 22, 2024 07:00 ET (12:00 GMT)
Copyright (c) 2024 Dow Jones & Company, Inc.