Dean Russell MP: Izzy, we have heard some really difficult testimony today, especially on harms for children, and I want to build on a question from earlier. Why do you think the big tech platforms do not seem to care about safeguarding children in the way that they should?
Izzy Wick: There is an element that ignorance is bliss. The minimum age of use for most social media platforms is 13, but we know that a huge number of under-13s use these platforms. If the companies recognised this, their user base would drop quite significantly. As Clare said, it is about following the money. It will not have escaped the committee’s notice that the age-appropriate design code came into force earlier this month and with that regulation has come a huge raft of changes, such as removing autoplay on YouTube Kids and minimising the amount of data that is collected on children. So regulation works and, unfortunately, it is the only thing that works. These companies have not been forthcoming in making changes of their own volition, but they absolutely do when it is required.
Dean Russell MP: Building on that, in terms of the wilfulness of not caring or seemingly not caring, do you think that we will put this legislation in place and they will spend billions of dollars and pounds just trying to find loopholes, so they can continue as they are? What do we need to do in this Bill so that we are not just forcing them and bringing the horse to water to drink, as it were, but so that they actually want to make sure that safeguarding is there for our kids?
Izzy Wick: I sincerely hope they do not do that, but the key to this is compliance and enforcement. Without wanting to sound like a broken record, this comes back to minimum standards. This is something that not just users but actually the companies themselves are really crying out for. I will give the example of age assurance. At the moment, everyone has got into a big muddle about age assurance, and people are rightly concerned about privacy implications, as are we when it comes to children’s privacy, but age assurance is not the same thing as identification, and you can establish a user’s age without knowing anything else about them. The technology exists. What is missing is the governance around it. Industry, as well as parents and children, is saying, “We want rules of the road to understand what the requirements are, when it’s needed and the standards that it needs to meet”. Until we have that governance and that framework, there will not be trust in the system.
Clare Pelham: To answer your question directly, the single most impactful thing you could do would be to make the sanctions on social media senior managers include, in the ultimate, following repeated breaches, a custodial sentence. To paraphrase what someone once said, once you have them with a custodial sentence, their hearts and minds will follow.
Dean Russell MP: Nina, talking about politics and the impact especially on women in politics, a phrase was heard earlier. Matt called it “dogpiling”. I call it hate mobbing, where an individual will say something on social media, and all of a sudden a hate mob will appear out of seemingly nowhere and attack that person en masse, and then move on to the next target. I have seen it happen to my female colleagues from across the Floor in particular, but to all politicians. From your experience, especially based on testimony we had the other day, which indicated that the same groups of people seemed to be doing a lot of the targeting on these things, do you think there is a concerted effort in the space of politics to try to take down certain individuals, or is it just that society happens to get together and criticise these people en masse?
Nina Jankowicz: It is absolutely a concerted effort. In our research, we have seen that one individual will post a piece of content, whether on Twitter or on Facebook, with a large following, and in that original piece of content there is nothing that violates terms of service, but that is the dog whistle that initiates the dogpile and then all the vitriolic content comes from there. We have had network analysis done by my colleague Alexa Pavliuc, who has visualised how this content moves around on these networks. The networks can see this. If they wanted to, they could visualise this themselves, see where it is coming from and enact penalties because of that.
That initial piece of content did not trip the tripwire, but these folks who are initiating the dogpile we believe should and can be punished, because they are unleashing what is much worse. It is the quantity, again, of that content coming to you and it is concerted. It follows different patterns and networks. They are constantly being reported over and over, and nothing is happening. That is what is most disheartening about trying to report. I know a lot of folks who do not even report any more because they do not see the point. They do not want to waste their time. They do not want to retraumatise themselves. It is within the capacity of the social media platforms to see these networks, see how it is moving and see who the repeated offenders are, and they need to be compelled to do so.
Dean Russell MP: Thank you. Within the context of this Bill, often when we talk about these large hate mobs, as I call them, we are thinking about people, but do you think, from your analysis, that it is not just people but that it is also triggering a lot of bots that are not real people, helping to put lots of content online that is written by AI, which spreads that misinformation or that hatred?
Nina Jankowicz: In our analysis, which again was a moment in time of just two months and quite a lot of content, we did not see that much computational propaganda—the artificial creation of bot accounts or things like that. We did see some instances of repetitive posting—certain individuals posting over and over on a Reddit subreddit, tweeting at a certain individual over and over or potentially using sock puppet accounts to do so. I have seen that in my own experience, where someone will come at me, log in to their other account and say something else, and then log in to another account. It is a remarkable amount of effort and, again, that is what is so disheartening, because these are real people behind it. If it was bots, you would be able to say very easily, “Okay, here are 25 eggs that have just tweeted the same curse word at me”, but these are often real people.
To come back to the anonymity question, which we mentioned before, I have actually had people very happily abusing me under their legal names, in fact even on LinkedIn, where their employers might see it. I do not think that introducing a real identity requirement would stem at least the misogynistic abuse that we see. It is so endemic that people are quite happy to attach their name to it and, as you have heard in previous sessions, it would endanger activists in many countries, so, for me, addressing anonymity does not solve this issue at all.

Dean Russell MP: If you had all the heads of all the big platforms here and you were to say that there was one bit of the Bill that you absolutely want to see get through, or a bit of the Bill that is missing that you would want to see, to make sure you are holding them to account, what would it be?
Matt Harrison: I spoke before about explicitly listing the protected characteristics in the Bill and, as I talked about before, putting that towards the top in the priority content list. Then that flows throughout the Bill and it gives a sense of direction to everyone—to the Government, to Ofcom and to the platforms—that when you look at the codes of practice, sitting in the background are already those protected characteristics explicitly. When you start looking at phrases like “content that is harmful to adults”, you already have a sense of direction about the characteristics that need to be explicitly protected. It takes out all the ambiguity and then you can actually start to work with Ofcom and the social media companies from a slightly stronger point of view. That is the one big ask that we would have.
Nina Jankowicz: I talked about this a lot, so I will keep it brief. Transparency in reports, take-downs and other decisions is incredibly important in order to understand how large the problem is and what, if anything, the social media platforms are doing to stop it.

Izzy Wick: I have 14 things that I want to change about the Bill, but I will stick to one.
Dean Russell MP: Just give one for now please.
Izzy Wick: Children have a right to protection wherever they are online, but in the current draft of the Bill we have a definition of regulated services that includes only user-to-user or search services. This will leave a huge number of services that children access on a daily basis out of scope of the Bill. I am talking about app stores that routinely misadvertise the minimum age of use of apps, and commercial pornography sites that might not host user-generated content. The status of edtech is also unclear. Without bringing those things into scope, there are just huge corners of the digital world that will not need to comply with the Bill’s safety objectives. There is a very simple solution to this, which is to include in the definition of a regulated service any service that is likely to be accessed by children.
Ian Russell: I am going to pick the sharing of data for bona fide research. We are all slightly working in the dark, because it is very difficult to know what research has been done on this and for what reasons, and it is about time that we did have a clearer picture of the effects of the online world on the people who use it. It is very hard to gain that at the moment, because some of the research is paid for by the tech platforms and, whether or not there is a conflict of interest, there is a possibility of that existing and it is very hard to know how you can judge that. I would like to see the tech companies, which are effectively data-mining companies—let us face it—being compelled to give anonymised data, so there are no privacy issues, to bona fide researchers, so that up-to-date and constant research can be done and we all know where we are on this issue.
Clare Pelham: I just have one ask for the committee, really. Please give us Zach’s law, as recommended by the Law Commission. Please make it a criminal offence to deliberately send flashing images with the intention of causing a seizure. If you do, 600,000 people with epilepsy will be grateful to you.