This was the “plan” I wrote for my talk at AlterConf Dublin. Incidentally, I strongly recommend not trying to plan a talk by basically writing an essay like this ⏤ it’s not easy to keep up with it on stage. I knew this wasn’t a great idea going in, but I was struggling to get my thoughts out in any other way. It did have the nice side-effect of being bloggable, though! I’ll write more about the event itself soon, but there’s a little thing that needs to be resolved before I do that, so I thought I should publish something!

You might remember Kiddle, a website which caused controversy online a few months ago.

If you don’t remember, Kiddle is a search engine for children. In their FAQ, they say:

Since Kiddle results are either handpicked and checked by our editors or filtered by Google safe search, you know you get kid-oriented results without any explicit content. In case some bad words are present in a search query, our guard robot will block the search.

The reason this was controversial is that Kiddle’s idea of “bad words” meant that terms like “transgender”, “bisexual” and “child abuse” were blocked with a message like this:

Oops, looks like your query contained some bad words.

Imagine a young person exploring their sexuality or gender identity trying to find information about it using Kiddle, only to be reprimanded for trying to look up a “bad word”.

After some pushback from the LGBTQIA+ community, some LGBTQIA+-related search terms were changed to display this rather defensive message:

You have entered an LGBT related search query. Please realise that while Kiddle has nothing against the LGBT community, it’s hard to guarantee the safety of all the results for such queries. We recommend that you talk to your parent or guardian about such topics.

As I’m sure we all know, sometimes talking about these topics with parents can go poorly, and be downright dangerous for the children involved. And many of the young people trying to search for “what is child abuse” are going to be those who could come to major harm from trying to talk to their parents.

(Kiddle have since improved and searches for LGBT-related terms now return informative results, though most of them seem to be aimed at LGBT parents.)

Internet filtering is one of the most prevalent features that online child safety software offers. It prevents a computer user from accessing certain websites. Kiddle, while not a traditional Internet filter (it isn’t software that creates a hard block on certain websites), aims to accomplish the same result by restricting which parts of the Internet children are able to access.

Internet filters are installed because people who are responsible for children want to protect them from Internet content they don’t think is appropriate for them to see.

But even when it is used with good intentions and the best interests of children in mind, Internet filtering software tends, in my opinion, to do more harm than good.


My secondary school had two levels of Internet filtering: one for students, and another for teachers. In some situations, even teachers weren’t allowed to access the content they wanted to use to deliver their lessons, because of assumptions made by the company that developed the filtering software.

And, when there was content that teachers were allowed to access but students weren’t, it was even worse.

Access to YouTube, for example, was blocked for students but not for teachers. YouTube has a lot of educational content that is used in schools, so when a teacher wanted to let students access content on YouTube, they had to give the students access to their own computer account to get around the filtering. With access to such an account, a malicious student could have impersonated the teacher, or accessed personal information about other students that they shouldn’t have been able to see: a serious data protection problem that could have resulted in major legal consequences for those involved.


My school, by the way, also recognised that it couldn’t stop committed students from getting around the filter. The library had a big sign on the wall saying “please don’t bypass the Internet filter”.


I remember, in primary school, a friend of mine had just broken a regional sporting record. His achievement was listed on a website that had information on everybody registered with his league, and he wanted to show it to our teacher. But upon filling in the search form that would narrow down the profiles, allowing him to find his own, he was unable to proceed because the URL now contained a parameter reading “sex=m”. This was blocked by the filter, which left my friend embarrassed when he was warned against accessing “pornographic or obscene content” in front of his friends and the teacher.

This is an example of the “Scunthorpe Problem”. Because “Scunthorpe” contains an explicit term, there have been multiple situations in which filtering software, including Google’s, has hidden Scunthorpe-related content or prevented Scunthorpe residents from signing up for services. The “Scunthorpe Problem” Wikipedia article contains a huge list of similar incidents of names of people and places being caught by overzealous filters.
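
To make the failure mode concrete, here’s a minimal sketch of the kind of naive substring matching that produces these false positives. This isn’t Kiddle’s code or any real filter’s implementation; the blocklist entry, the URL, and the search phrases are made up for illustration.

```python
# Minimal sketch of naive substring filtering: the logic behind the
# "Scunthorpe Problem". Illustration only, not any real filter's code.

BLOCKLIST = {"sex"}  # hypothetical single-entry blocklist


def is_blocked(text: str) -> bool:
    """Flag the text if any blocklisted term appears as a substring."""
    lowered = text.lower()
    return any(term in lowered for term in BLOCKLIST)


# A perfectly innocent URL is flagged because of its query parameter:
print(is_blocked("https://example.org/athletes?region=north&sex=m"))  # True

# Place names that happen to contain the substring are flagged too:
print(is_blocked("Sussex under-11 athletics records"))  # True

# "Scunthorpe" trips the same check, just with a different blocklisted
# word hidden inside the place name.
```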


But these filters aren’t just in schools ⏤ home computer Internet filtering software, targeted at parents, has been around for a long time, though it hasn’t been particularly widespread or effective, especially as the number of devices in homes has risen ⏤ with newer mobile devices often being seen as more personal or private, as well as in some cases being more difficult to install a filter on.

Recently, however, there have been moves to make the Internet filtered by default. In the United Kingdom, all of the large home and mobile ISPs began to filter the Internet by default for all customers a couple of years ago. This was encouraged by Prime Minister David Cameron, who spun the filters as a way to prevent young people from having easy access to online pornography.

Shortly after being introduced, the filters were found to block sex and relationships education resources, as well as support for victims of suicide, domestic abuse, discrimination, rape, and addiction. All of this information could be vitally important to some of the people this system was supposedly put in place to protect.

Only adults can remove the filters. Age validation is often performed by making a trivial charge against a credit card. This excludes adults who are unable or unwilling to have a credit card. But it also potentially completely removes access to important resources from young people who have anything but the most approachable, understanding parents or guardians. It’s pretty difficult to ask your parents if they’d mind disabling the “porn filter” on your phone.

This type of “child safety” product is an application of the “children should always be supervised when using the Internet” mantra, which I believe is actively harmful to children. It’s important for children to be able to grow and explore away from their parents. Everybody has a right to privacy. Trying to limit children’s online activities using the “Big Brother is watching you” approach isn’t going to keep them safe.

If the Internet is filtered at school, they’ll wait until they’re at home. If the Internet is filtered at home, they’ll wait until they’re at somebody else’s house, or until they have a mobile device they can use away from everyone else without anybody knowing about it.

The approach currently taken by many guardians is to put Internet filters in the places where children are likely to be around supportive people who can help them. All this will do is push them to cross those boundaries in less supportive, perhaps even harmful, environments.

It will make young people hide the activities they don’t want their guardians to see until Big Brother probably isn’t watching them, which is exactly when their guardians are least likely to be in a position to help if something does go wrong.

There are many aspects of the Internet that can be dangerous, but trying to pretend that we can completely shield vulnerable people from their existence is doomed to failure.

Instead, we need to be educating young people on behaving responsibly online, so that when they do inevitably come upon something we’d rather they hadn’t seen, they have the ability to handle it maturely.

The relationship between guardian and child has to be built on trust, and forcing a child to use an annoying and invasive Internet filter is a clear indication that you don’t trust them. And if you don’t trust them, why should they trust you when something goes wrong?