The findings of the report
This 57-page document, written by two security specialists, is not intended to be exhaustive: it highlights a simple sample of apps deemed appropriate for children aged 4, 9 or 12. Of the 800 applications analyzed over 24 hours, more than 200 were found to be problematic. Together, these apps account for more than 550 million downloads.
The report classifies the disputed apps into several categories: chat apps (which allow children to interact with strangers, including predators), beauty or photo-rating apps (which encourage unhealthy behaviors, such as uploading images to be rated for attractiveness), apps for bypassing internet restrictions (giving access to content that is not monitored or controlled by parents) and, finally, games containing inappropriate challenges (in particular sexualized ones).
Apple's promises called into question
The two associations aim to denounce Apple's behavior. According to the firm, the App Store remains a safe and reliable environment, with a review system intended to verify age classifications. Moreover, Apple often cites this security to justify its commission on in-app purchases, or in inquiries regarding the opening up of its application store. This was apparent with the gaming or casino applications that manage to slip past its filters, or, more recently, with apps allowing free access to exclusive content from streaming services.
However, the report argues that Apple's reviews do not effectively detect inappropriate content and that, all too often, the firm places the responsibility for age ratings on developers. Such lax controls could benefit Apple financially by increasing downloads and the associated commissions.
The authors of the report therefore call on Apple to strengthen its verification processes to guarantee accurate age classifications, or even to assume more direct responsibility for the content distributed via the App Store, and, of course, to actively protect young users from risky content and features.