Less than two months after it was first introduced, Apple’s Screen Time continues to catch flak for what it does and doesn’t do. First, kids figured out a way around the tool’s time limitations. Then, it was accused of draining iPhone batteries too quickly. Now, the tool’s parental controls are being criticized for blocking (or allowing) the wrong content.
According to O.school, an online resource for sex education, Screen Time could be doing more harm than good when it comes to content blocking. At issue is how Content & Privacy Restrictions are being handled under Screen Time as they relate to keywords or phrases. Specifically, the resource site takes issue with how Apple deals with searches about and related to sex.
Yes, Screen Time’s parental controls do a good job of blocking nudity and images of individuals engaging in sex. However, there’s a lot of “good” content it’s also blocking.
O.school discovered, for example, that searches for “sex ed,” “teen pregnancy,” and “gay teen suicide hotline” were also blocked when content restrictions were activated. Searches for “how to report sex abuse” and “what is consensual sex?” were also blocked.
With these types of restrictions, you’d think Apple was at least blocking everything that most would find inappropriate for children. Unfortunately, O.school found that wasn’t the case.
It explains, “You might not be surprised to learn that, with this filter, O.school’s own content is blocked entirely. Conversations and information on consent, on basic anatomy, on healing after trauma — on the long-term effects of bad sex ed — all blocked.”
TUTORIAL: How to keep your iPhone usage under control with Screen Time
Finally, and this might be the most disturbing part of O.school’s discoveries, it found that Apple isn’t blocking some of the cruelest content found online. This includes material located on an American white supremacist site that has promoted something so inappropriate it won’t be mentioned here.
Content blocking solutions are always dicey since, on a certain level, they are highly subjective. On the one hand, it looks like Apple made a blanket decision to block everything that’s related to specific keywords such as “sex.” You might disagree with this practice as it relates to sex education, for example, but at least on this point, you can see what Apple did to accomplish this.
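To illustrate why that kind of blanket decision over-blocks, here’s a minimal sketch of a naive keyword blocklist. This is purely hypothetical — Apple hasn’t disclosed how its filter works — but a simple substring match on a term like “sex” would catch educational queries just as readily as explicit ones:

```python
# Hypothetical illustration of a naive substring blocklist.
# NOT Apple's actual filter -- just a sketch of why blanket
# keyword matching blocks legitimate educational content.

BLOCKED_KEYWORDS = {"sex", "nude"}  # assumed example terms

def is_blocked(query: str) -> bool:
    """Return True if any blocked keyword appears anywhere in the query."""
    q = query.lower()
    return any(kw in q for kw in BLOCKED_KEYWORDS)

# An educational query is caught by the same match as an explicit one:
print(is_blocked("sex ed"))               # True -- over-blocked
print(is_blocked("what is consensual sex?"))  # True -- over-blocked
print(is_blocked("weather today"))        # False
```

A filter like this has no notion of context or intent, which is exactly the gap O.school is pointing at: “sex ed” and explicit material look identical to a keyword match.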
What Apple needs to work harder on are the cases where keywords or phrases fail to catch inappropriate content. In the example above, that website has no business making it past Apple’s filters.
O.school does ask some critical questions that Apple should answer, such as:
- What are the filter settings?
- Were parents consulted? Conservative and religious groups? Doctors and sex educators?
- How can people report problems?
Perhaps moving forward, Apple should take the lead and work with other companies to create a much more dynamic content blocking system. At a minimum, it should explain the process more clearly.
What do you think Apple should do to improve parental blocking in iOS 12?