Culture, Media and Sport Committee — Oral Evidence (HC 1338)

14 Apr 2026
Chair

Welcome to this meeting of the Culture, Media and Sport Select Committee. Today we have two panels looking at different aspects of TV and video content for children and young people. In our first panel we are hearing from broadcasters. I welcome Louise Bucknole, the Senior Vice President for Kids and Family at Paramount UK and Ireland, and Ian France, Head of Kids Content at Sky. Before we begin, I remind members of the Committee to declare any interests at the point that they ask their question. Louise, you are here representing Paramount, which means that you are wearing more than one hat. For the record, can you very briefly explain what Milkshake! is and how it relates to Channel 5 and Nickelodeon and to any other parts of the Paramount empire that you have responsibilities for so that we understand where we can go with our questions, please?

Louise Bucknole

Yes, sure. At Paramount we are committed to kids’ content. Milkshake! on Channel 5 is the only PSB preschool brand that is on the main channel every day. We are across linear TV, streaming, pay TV and digital platforms. We are also on Paramount+ and YouTube. We are a preschool brand. We are 25 years old. We have UK originations on the platform. We are absolutely the jewel in the crown of Channel 5, really part of the DNA of our PSB remit. We are really proud of our commissioning strategy on Channel 5 for Milkshake! with high-end content. We have a voluntary commitment of 40 hours of UK origination each year. We spend 85% of our budget on that content, 60% of our commissions are regional, and it is nourishing content for families to celebrate kids and families every day. For Nickelodeon, we have Nick Jr., Nickelodeon and Nicktoons for kids aged two to 12. We combine global shows such as “PAW Patrol” and “SpongeBob” with local shows such as “Paddington” and we have a content sharing agreement where we can share content from Milkshake! to Nick Jr. Our shows on Milkshake! are things like “Peppa Pig”, “Fireman Sam”, “Thomas & Friends”—really good British content. We are super proud of our content and the studio that we have at Milkshake! as well, with our presenters and our puppet, Milkshake Monkey, to delight, entertain and inspire kids every day.

Chair

Thank you very much. Ian, unless you say otherwise, I am just going to work on the principle that you are here representing Sky and Sky alone.

Ian France

That is right. Would you like me to explain Sky Kids?

Chair

You will probably have the opportunity to do that as we go through the questions. I just thought that, because Louise has numerous imaginary hats on, it was important for her to be able to set them out. But I am going to start the questions with you, Ian. Last year Sky Kids said it would stop commissioning new children’s content. What factors led you to make that decision?

Ian France

There are a number of factors. First, it is important to set out the pillars of our funding that enriches children’s lives from Sky. We are basically funding news, acquisitions and films. There is a large amount of money being invested in those areas for children at Sky. We did not take the decision lightly last year—that is for sure. It is entirely because of everything that you have heard so far in the inquiry, really. Children’s habits have changed over the past few years. Even though we commercially have no obligation to make kids’ content, we are incredibly passionate about what we do with our children’s output. An original for us was a fully funded original—a fully funded programme. To be specific, we have 150 originals now on Sky Kids. I know that some people have made the point in their evidence—I think Baroness Benjamin did so in her written evidence—that it is finding and funding. That is correct, really, and of course that has a part to play in it. We have 150 originals that we now want to make sure are found and are out there for children to view on the Sky platforms. We have originals coming through on Sky Kids for the next two years as well, because we have honoured all the agreements that were in place, including all the developments. Funding has not stopped. We are still commissioning news for children. We are still acquiring content for children. We are also entering into pre-buy deals, where we have a bit more editorial, and we perhaps put more money into that production at the start of its life so we can make sure that we help financing. In other sessions it has perhaps been said that Sky is closed for business, but that is not true. Fortunately, we are still investing in children at Sky in all the areas that I have listed.

Chair

You talked about finding and funding, but is there anything specific that would have changed your decision to stop commissioning new children’s content?

Ian France

It is those two main points. Discoverability is a very big issue. I want to make sure that children are seeing what we make, that is for sure, and I know that viewing habits have changed significantly, but against these headwinds, Sky is still passionate about making sure that the originals that we have made become famous. That is what we are doing going forward. My job now is to make sure, as best as possible, that children find our content.

Chair

But given that, as you say, viewing habits are changing, do you still think there is a demand for a linear channel dedicated to children’s TV?

Ian France

I think that it serves a purpose. Looking at all the platforms, and Sky specifically, I know that the linear channel is very useful for parents who have children who are at the younger end, in the under-seven age category. I have a seven-year-old, and it can be incredibly useful to put the channel on and be doing things while it is on. What is good about our channel is that everything on it is curated and has gone through a compliance team. We have a very big group of passionate humans curating the content for the Sky video on demand and the Sky linear channel. I think it does have a place, yes—I really do. We do have viewers watching our great shows such as “Pip and Posy” and “BooSnoo”, which is a great show. We also create programming to cater for different fan bases. “BooSnoo” really helps neurodiverse children, and the same is true for “Ready Eddie Go!”. The content that we make all has a purpose, whether it is on linear or on video on demand.

Chair

I am glad you mentioned “BooSnoo”. In January we visited Mackinnon & Saunders in Manchester.

Ian France

Yes, I had heard that you have been to see that. That is fantastic.

Chair

Mackinnon & Saunders told us that “BooSnoo” has already been licensed for 94 countries, so it is hugely successful. That seems to be a real opportunity. Do you have a commitment to them going forward?

Ian France

Yes, that is part of the two-year plan for our originals. We are making more “BooSnoo”. “BooSnoo” is still in production, and we are talking about doing more “BooSnoo” over the next two years as part of that original commitment.

Chair

What happens beyond the two-year period?

Ian France

When you are making kids’ TV, there is a certain limit to the volume that you make, because children are watching the same content over and over again, and “BooSnoo” has not waned in its popularity since the first series. You find that you make a series and then another series, and there is an expectation to make more series, but you are looking back at the viewing on the first two series and they are still incredibly popular. When you are running a kids’ channel, you have to decide whether you are going to create more content or stop. Sometimes, when you have 100 to 400 episodes of something and it is still being watched and still popular, there is a decision to be made. That is something that happens way before any issues in the industry. There are decisions to be made at that level. “BooSnoo” is incredibly popular. It is selling. There is more “BooSnoo” coming down the line over the next two years and Sky is committed to making more “BooSnoo”.

Chair

So two more years of “BooSnoo” and then no more “BooSnoo”? No more red ball?

Ian France

We have not discussed two years’ time. I cannot say what the future holds for “BooSnoo”, but we are very proud of “BooSnoo”, we really love making it and the team that makes it is great.

Chair

Louise, can I move on to you? Before we get into the granular detail, do you think that the current model of providing children’s TV in the UK is viable for the longer term?

Louise Bucknole

Funding or finding?

Chair

Funding.

Louise Bucknole

I am here as a commissioner of preschool content, and I see the challenges of how to fund content every day. You have heard in other sessions about how challenged it is. It is very hard to sustain making content at the moment. There are some things that have been impacting that. First, we have seen the audience shift from traditional linear channels. Advertising revenue has fallen 12% in the last four years, and restrictions on food and drink advertising have added additional pressure for commercial broadcasters. There are also rising production costs. Making the high-end content that we do at Milkshake! is really expensive. In trying to get funding we work with a lot of co-productions, and the tax credit really matters for the content that we are making. Funds such as the young audiences content fund were absolutely transformational for us.

Chair

We will come on to that in a minute.

Louise Bucknole

The pressure on trying to find the funding is real every day.

Chair

Could you survive without the co-producers?

Louise Bucknole

No. Co-productions are crucial to everything that we do. We work with international partners. We are in a unique position at Paramount because we have PSB for Channel 5, but also the wider Nickelodeon global network, so we share content. Something like “Peppa Pig” is premiered on Milkshake! and then has a secondary window on Nick, which allows us to fund that content.

Chair

You have economies of scale in the sense that you have the different platforms.

Louise Bucknole

Absolutely. The other thing is that if it is a non-franchise, the content does not get funded. Something like “Peppa Pig” is a global show, sold in every country around the world, but you may have a very UK-specific show. We are just about to launch a show called “Move It, Milkshakers!” with our presenters. It is a movement and dance show, learning the alphabet, and it is very UK-specific. We do not fund that with a co-production partner. You cannot find that funding globally for those kinds of UK-specific shows.

Chair

That is where we need to move on to some of the financial support, which Jeff will do.

Jeff Smith

Morning. Louise, you mentioned the young audiences content fund. I think both of you have called for some version of that fund to be reintroduced, and also for some kind of tax reliefs. Of the two, which intervention is likely to have the most impact?

Louise Bucknole

I think both—sorry.

Ian France

I was going to say both too.

Louise Bucknole

First, thank you to the Government for having the tax credits. We absolutely welcome them and want them to continue. They really matter in our finance plans. When you are making content, you look at the percentages of how much you have for tax, how much you have as your licence fee, and so on. The young audiences content fund was transformational for us. We made 13 shows—high-end content, regional content. We were working with first-time producers. We made a show with Fourth Wall in Liverpool called “Milo”. We made “Pop Paper City” with LoveLove in Bournemouth—they were both first-time producers. The pilot, as it was, for three years was hugely successful. It funded 61 shows and 160 development projects. It just accelerated production. We were able to get into production really quickly, because the funding from the young audiences content fund was up to 50% of the budget. We had a show called “Mixmups”. That was one of the biggest awards. It was such an important show for us. It was a beautiful stop-frame animation made by Mackinnon & Saunders; they make amazing content. It was a show about disability representation, and we worked with a production team that had lived experience of disability representation as well. It was a regional show. We also made Ultra Access episodes as part of that show, which was the first time that audiences could watch content on a platform in a way that was suitable for the child. It went beyond audio description, signing and subtitling. You could watch it with Makaton, with low clutter—you could choose how you wanted to watch it—and we had amazing feedback from the audience about it. One quote was: “Finally someone gets it. They’ve produced a show that is for my child.” That was only possible with the young audiences content fund. It made content that was not previously feasible for us to make. We made a drama, “Mimi’s World”. 
It was set around kids’ emotions, how they dealt with friendships, really understanding their place in the world. It was this really amazing content built around the early years foundation of all the content that we made. We could not have made those shows without the young audiences content fund.

Ian France

The young audiences content fund is a force for good. It is incredibly important. We also accessed the fund because we could show our news output on Sky News through the fund. We were making the show anyway, but we were able to do spin-off shows using the young audiences content fund, and I think it was solely responsible for growing our Sky News output at the time when it first came in. I think we are all very supportive of the young audiences content fund. It is a way of increasing partnerships, it is a way of decreasing budgets for those partners, and it makes it a very viable opportunity for everybody who is involved with it.

Jeff Smith

You have both talked glowingly about the fund. Of the two interventions, would you say the YAC fund is more likely to stimulate production?

Ian France

The tax credit works well, but a lot of productions face problems because you get the tax credit afterwards. You have to stump up and find the money to make the show. That process causes a lot of problems, especially for small companies. The tax incentive is great, but lots of companies are forced to find the money and then claim the tax incentive afterwards. That finance gap can cause a bit of a problem in the process, especially for a small company. But both work hand in hand. We definitely need both.

Louise Bucknole

I agree. Making high-end, quality content—content that is thought through, working with educational consultants and animation—is very expensive and a long process, too. You need funding there in order to benefit from the tax credit, so they work hand in hand. We have seen a great example of this in France, where there is a tax credit that is paid up front and comes with content quotas. That drives this sort of commissioning. You have the supported cost, but the commissioning strategy as well. Tax credits are really important for ensuring that you can make content, because they are part of the overall budget, but you need the fund as well. You need a pool of funding. We tap into other funds, such as Northern Ireland Screen or North East Screen. We made a show in Sunderland called “Aneeshwar’s Outdoor Adventures” and we had North East Screen funding for that. Those screen funds are crucial for us to be able to fund the content in the first place, as well as the licence fee that we pay.

Jeff Smith

How would you fund it?

Louise Bucknole

I don’t think it is for us to tell the Government how to fund things, but what I can say from Paramount is that it really made a difference. Having the tax credit and the fund really encouraged investment in the content. Working with co-production partners, it would really invigorate production.

Jeff Smith

You can give us ideas on how to fund it if you like.

Ian France

I know that in other evidence sessions it has been suggested that other Government Departments could come together, or the National Lottery. There is a decision to be made, but we would very much welcome it—that is for sure.

Jeff Smith

What is your view of the idea that the BBC has put forward for an enhanced tax relief for UK culturally relevant content?

Ian France

That is more in the public service broadcasting world, which I cannot really comment on.

Louise Bucknole

For tax credits, there is a cultural test that you have to have for it to apply, but any tax credit is really beneficial to productions. To have that continue is really important for kids’ television, to ensure that we can continue to make this really high-end, nourishing content for the audience. So yes, I think I would agree.

Chair

Can I very gently encourage both of you to get off the fence? Jeff asked you a very simple question: which intervention would be most likely to have the most impact? Rather than going around the houses and talking at us for a really long time, can we just have a one-sentence answer? In your minds, which intervention is likely to have the most impact, most quickly?

Louise Bucknole

The fund.

Ian France

The fund.

Chair

Thank you—that is really good.

Ian France

They do go hand in hand, though.

Chair

Maybe it was me, but I drifted off there for a second and it felt like we were waffling a little bit.

Anneliese Midgley (Labour Party, Knowsley)

Thanks for coming in. I am going to ask some questions about commissioning. Louise, on Milkshake!, what is the balance between programmes that you commission and acquisitions?

Louise Bucknole

We spend 85% of our budget on UK originations and about 10% on acquisitions. The rest is for continuity. We have an amazing studio with our presenters, so we have continuity, and we also make in-house short-form. That is the balance in terms of the spend. We absolutely prioritise UK originations for our content output on Milkshake!—things like “Peppa Pig”, “Ben & Holly” and “Stan & Gran”, which is an animation that we launched last year that showcases Stan and his Gran having adventures, learning about nature, based in a coastal town that looks like the UK. We did “Pip and Posy” in a co-production with Sky. I talked about “Mixmups”.

Anneliese Midgley

We saw “Mixmups” at the studio when we visited. It was brilliant.

Louise Bucknole

It was such an amazing show. We learned so much about how to make really good, inclusive content that represents the disabled community, and we had so much feedback from the audience that it really mattered to them as well.

Anneliese Midgley

When you commission children’s programming, what proportion of the financing are you putting into it?

Louise Bucknole

We look at the overall budget. It can vary. It depends on the overall budget and also the type of show, the number of episodes and the duration. We make short programmes—five, seven and 11 minutes long—of animation or live action. We make a lot of factual shows as well, such as “Reu and Harper’s Wonder World” from Doc Hearts, which was a show about kids going on playdates but learning about cultures from around the world. Financing for live action and factual is slightly different from financing for an animation because we look at co-production partners. It can vary. It might be 70% of the budget for some of the live action shows; it could be 10% for animations, depending on the type of animation and whether there are global partners or multiple partners. We work with lots of other broadcasters. I mentioned “Stan & Gran”. We had a lot of partners on that—S4C in Wales, RTÉ in Ireland, ABC in Australia, NRK in Norway. Also, that budget was put together with investment from production partners such as Zodiak, Tiger Aspect and Xentrix. When you are looking at that sort of financing plan of how you put the co-production together, or any budget together, you see what other partners you have as well.

Anneliese Midgley

Do you all take an equal piece of the pie or does it vary?

Louise Bucknole

It varies. It depends on the rights. Milkshake! will take the rights in the UK and Ireland, but we are open to sharing and having flexible rights to make budgets work. We are doing a show at the moment working with RTÉ, and we are looking at equal rights in both territories. It really depends on the rights that you have.

Ian France

We have worked really well together on “Pip and Posy”. It has been a great relationship, and we have been able to work out the rights, the transmission times and who is going to have which episode when. We have been working really well with “Pip and Posy”, which is a top show for Sky and for—well, I won’t say it is your top show because I do not know.

Louise Bucknole

It is one of our top shows.

Ian France

It is definitely one of our top shows, and it has worked out really well. I think that partnerships will be really important going forward. When we are commissioning, the more partners that are involved—I just know that you have two broadcasters here that have worked together on a show, and that is fantastic. That is quite rare, I think, especially in the UK. We have really enjoyed making “Pip and Posy”, and that has worked very well for Sky. I just wanted to make the point that working together can happen. We can get together and work across channels and broadcasting as well.

Anneliese Midgley

You have both talked about the economic pressures. How does that affect what kind of content you commission? You have touched on the partnerships. I presume that the main driver of that is these economic factors and pressures.

Ian France

Budgets are getting more expensive—there is no doubt about it. Because of that, you have to make certain decisions. That is why we talked about the YAC fund alongside the tax credit. We talked about both of them all the time when the YAC fund existed, because they worked hand in hand to help your budget. Things are more expensive to make now as well. Because the budgets are expensive, you find that you are making editorial choices and decisions to make things less premium, and that affects things totally. You want to make sure that you make a great show, but the budgets do restrict you sometimes, and that can affect the commissioning decision that you make.

Anneliese Midgley

Can you give an example of a show that cost very little and has been a huge success?

Ian France

I don’t think it would be fair to mention budgets now, but we are more than happy to give you supplementary evidence afterwards if that helps the inquiry. There are plenty of those shows and there are shows that do perform incredibly well, but when you link those to the budgets, I would probably be—

Anneliese Midgley

Fair enough. You touched on this, Louise, but what blend of content that you make would you describe as UK culturally relevant?

Louise Bucknole

We do animation and live action. Animation is key for the preschool audience that we serve on Milkshake!. I mentioned “Peppa Pig”, which is over 20 years old. For us to be able to continue to invest in that, our ecosystem across Paramount is really important. We can share the budgets with Nick Jr. and also Paramount+. But we also have acquisitions like “Thomas the Tank Engine” and “Fireman Sam”. We did a co-production with Nick Jr. last year—the first one we have ever done globally—called “Tim Rex in Space”, about dinosaurs in space. What is not to love there? It has been hugely successful, but it is really expensive, so having those partners is really important. We then have live action content, and we are really committed to that on Milkshake!. We have a wonderful show—relatively low-cost compared with animation—called “Animal Care Club” that we make with Daisybeck. It is about kids’ love of animals—we know they love animals—and how they can look after them and understand different animals. We base everything around the early years foundation. Whether it is about learning about the world or about themselves, or whether it is about literacy or counting, we try to balance the content that we have, because it is really important for kids to be represented on screen. They want to see versions of themselves, and especially regional content—60% of our content is regional, made with producers from across the UK. At the moment we have about 500 jobs as a result of our productions in the UK. That is really important. We look at the balance. As a commissioner, I look at what fits within our DNA and our portfolio. We try to have that mix of shows between animated stories, content that is really important to us—sometimes slower-paced content such as “Tweedy & Fluff”, which was supported by the young audiences content fund. It was the first stop-frame animation that was made in Birmingham by a first-time producer. It is such a beautiful show. It is much slower. 
It is five minutes long. It is about love and nurture, and understanding how you build friendships. We look at the type of content that we want and also think about the age and stage of that content. That is very important, because a two or three-year-old is very different from a five or six-year-old. We have to think about the content and how we make it for the different audiences. We try to balance the live action and the animation.

Anneliese Midgley

With Milkshake!, you have a “no diversity, no commission” policy. Can you go into that a bit?

Louise Bucknole

It is fundamental to everything we do. We think about how we can be representative of kids, socioeconomically and across the UK, not only for the stories that you see on screen, with a storyline about a particular culture in “Reu and Harper’s Wonder World”, for example, but for the production team. It can also be about the writers that we have on the shows. They are super-critical for how we make productions, and we try to encourage new writers within animations. The “no diversity, no commission” policy is fundamental to everything that we do, and we really drill down on that with every commission that we do.

Anneliese Midgley

And it runs right through from top to bottom.

Louise Bucknole

It runs all the way through, yes.

Chair

The picture you paint of having to do some really quite complicated co-production deals in order to bring some of these important shows to fruition reflects quite a lot of what we have heard in other evidence. For example, when we went to Mackinnon & Saunders, they spoke about how, although the BBC and Channel 5 are significant commissioners, the amount that you can afford to put in means that you have to build quite a complicated patchwork of other funders, which significantly dilutes the IP ownership in exchange for the investment. Does that concern you?

Louise Bucknole

For the rights that we take, the producer retains the overall IP ownership. We license the content for a period of years. It is usually five years for us on Milkshake!. When we make PSB content, we want it to be seen, so obviously we have it for the different platforms that we have—our linear service, the 5 streaming service, on Paramount+ and with affiliate partners such as Sky and Virgin. It is difficult to get that funding. Content is expensive and so we have to have those multiple partners on shows in order for them to get made. It is the only way that we can do it. No one is fully financing kids’ content, so you have to work with partners. There is an element that, yes, there are different rights that the producer will negotiate with whoever they are working with globally, whether that is us in the UK or partners elsewhere.

Natasha Irons (Labour Party, Croydon East)

I will declare some interests: I worked at Channel 4 before becoming an MP, and I have two small children who have watched a lot of Milkshake!—thank you so much for your service. It is a really lovely thing. I am going to ask some questions about YouTube and video sharing platforms, and about prominence and what that means for you. What are your motivations for putting your content on YouTube?

Louise Bucknole154 words

We use YouTube as a promotional platform—a marketing platform. We put short-form content on there, and some full episodes. Most of our viewing is on live and scheduled television but also on VOD. It is just not commercially viable for commercial broadcasters or, I think, independent producers to put first-run content on YouTube at the moment. As I said, it is very expensive. The economics just do not work for us. YouTube takes a percentage of the ad revenue, and PSB content is not surfaced—it is not prominent—so it is more reliable for us to have an audience that is on linear and streaming. YouTube just is not obligated to provide that support to PSB content financially. The monetisation is weak for us. If it weren’t, we would be on that platform. It just does not work for us, and the way the algorithm surfaces the content just does not prioritise PSB quality content.

Ian France

I agree. From a Sky Kids point of view, we do the same thing. We have a Sky Kids YouTube channel with a very small number of subscribers. We put some full episodes up there outside of our subscriber zone. That is largely because they are shows that serve certain fan bases and we really want them to see it, such as “Ready Eddie Go!”. That is a wonderful show about Eddie, who is autistic, and we felt that having those shows outside the paywall would be crucial for that audience, and it has gone down well. It is the same with “BooSnoo”; we put some “BooSnoo” up there as well. But I agree with Louise; it is the same for us with YouTube.

Natasha Irons

So it is not economically viable to be using YouTube. It is more of a marketing tool.

Ian France

We do not really have a relationship, if I am honest, for Sky Kids content with YouTube. We have a video on demand service and a linear channel that we do at Sky, and that works well for us at the moment. We know that audiences are migrating to YouTube, but there is naturally a concern if you put something on YouTube about whether people will find it. It goes back to finding and funding. That is the issue. There is no guarantee—even as a broadcaster—that if you put your content on YouTube the audience will find it. Of course, after they have watched the great content on there, what do they see next in their onward journeys? As a parent, I am concerned about that. If you have great content on YouTube, what happens afterwards? That is something that we all discuss.

Natasha Irons

Ian, in your written evidence, Sky asked that any new prominence regime covers public service content more broadly, rather than only content from PSBs. Sky is behind a paywall, so why would we pass legislation to protect your income, basically?

Ian France

I think we said that purely because we think that content is the most important thing, and that is what the decision about prominence should be based on; it should not be about the platform or the provider. When it comes to children, content is massively important, and that should have the prominence. It should be based on the content rather than the provider. That is all we were saying.

IF
Natasha IronsLabour PartyCroydon East57 words

On a broader point around video sharing platforms, you have both spoken passionately about the importance of high-quality content and what goes into making the programmes that you make. We have heard Milkshake! described as a safe space for parents and children on the linear schedule. Do you think that video sharing platforms are safe for children?

Louise Bucknole117 words

All our content goes through a broadcast compliance process; that applies to all our content on any of our platforms. Safety for children, and the trust of parents and carers, are fundamental to everything that we do. We curate the content that we put on YouTube, and we think about how it is placed. We have content that is created by expert producers. They think about the story structure. They think about the educational content that is being made. We also follow all the codes, including Made for Kids on YouTube, so there are no personalised ads and you cannot add comments. We follow all those regulations to ensure that our content on that platform is safe.

LB
Natasha IronsLabour PartyCroydon East43 words

Obviously, they have just been involved in a quite high-profile legal case in which they were found to have created a platform that is intentionally addictive to young people. Are you comfortable with your content being on a platform that is intentionally addictive?

Ian France3 words

On our streaming—

IF
Natasha IronsLabour PartyCroydon East2 words

On YouTube.

Ian France64 words

Oh, we just have promotional clips. I couldn’t possibly comment about that, really. When it comes to putting content on YouTube, we do it sparingly. We are not out to put lots of content on YouTube and drive it as a business. I do not feel that our very small Sky Kids YouTube channel would be addictive in that way.

IF
Natasha IronsLabour PartyCroydon East79 words

Your content isn’t, but the platform that you put it on has been found to be so. Your walled garden of Sky Kids on Sky—your Sky platforms—I am sure has been created to the highest possible standards, but whether you are using it as a marketing tool or not, the content is on a platform that has been found to be intentionally addictive. Is it time for broadcasters and PSBs to think again about how we use these platforms?

Ian France15 words

Well, we wait to see the outcome of the decision and further decisions, of course.

IF
Louise Bucknole136 words

At Paramount, as I said earlier, the safety controls that we have are really important. YouTube is not our main platform. It is a marketing and promotional tool for us. We focus on our linear and streaming platform, where we do have those safety measures. We have age ratings, we have descriptors, we have a kids’ mode so that when you are going into that platform it is all kids’ content and it is curated. You can click through to it from the homepage to get into that platform. We use YouTube in the same way, I think, as Sky: as a promotional platform. I cannot speak about the wider YouTube content because we do not make that content. We make good-quality content. We place clips on our YouTube platform to be where the kids are.

LB
Ian France140 words

Similarly, for our streaming and our VOD service, we have various safety rules in place. I work in VOD and linear channels, and I am quite proud of working in that space. I think it is curated really well. We are a very passionate team. Even the compliance team—they watch everything on Sky, but they really love watching the kids’ shows. One of the things that makes it safe for us is that we made a decision not to run adverts on the linear channel. We spoke to loads of parents and they did not want adverts, so, even as a commercial broadcaster, we made that decision, because we felt it was right for our viewers. That is something that we stand by. That is another reason why we feel that our linear and VOD services are safe for children.

IF
Natasha IronsLabour PartyCroydon East77 words

The point I am making is that although you are producing content to the highest possible safety standards where you can control things, you are being encouraged to put more content on to a place that you do not control and that has been found to have been designed to be intentionally addictive. Do you think that the Government should be looking at further legislation to regulate this space in a similar way to how you are regulated?

Ian France74 words

We want it safe, don’t we? We want everything to be safe for children. No matter where we work, if we are parents, we want children to be safe. Of course I want my children to be able to go on somewhere and know that there are rules in place to make it safe. It is as simple as that. The more that happens, the better a place it will be for children.

IF
Louise Bucknole2 words

I agree.

LB
Mr Alaba26 words

You mentioned that you use YouTube sparingly—you do not use it a lot, but you use it. Why is that? Why do you use it sparingly?

MA
Ian France118 words

My job and our team are focused on the VOD platform and the linear channel. We have thought about YouTube. We know that audiences are on YouTube, but we have concerns. There are concerns. Maybe I am hesitant about that—I don’t know—but we are in discussions about how we can grow YouTube. We do not talk to YouTube regularly about our channel, although they sometimes advise us when events in their calendar mean we might want to put up content that promotes certain things. My job, though, is mainly to look after video on demand and linear and make that the best experience for children in a safe space.

IF
Mr Alaba5 words

What would those concerns be?

MA
Ian France35 words

Just what I said earlier about onward journeys—making sure that children on third-party platforms have a safe journey and see amazing content and perhaps make sure they do not see stuff they should not see.

IF
Louise Bucknole77 words

We put content on YouTube because 40% of kids’ viewing is going to YouTube. We want our content to be seen. This is nourishing, fantastic content that is made for British children and we want that prominence. We want content to be seen, so we have a certain selected amount of content on there, but really we are trying to ensure that the viewing is on our own curated, safe platforms—our linear service and our streaming platform.

LB
Cameron ThomasLiberal DemocratsTewkesbury62 words

I think it was you in particular, Ian, who said, “We want our children to be safe.” We all do. Both your organisations have safeguards to make sure that the content you create is suitable for children. Do you think that YouTube at large—the other content creators that post their material to YouTube—all feel the same way? Do they have those safeguards?

Ian France53 words

I cannot comment on YouTube, really. I am here to talk from a Sky Kids perspective and about what I do in my job day to day for the linear channel and VOD service. We want children to be safe, that is for sure, but I cannot comment about that, I am afraid.

IF
Cameron ThomasLiberal DemocratsTewkesbury14 words

The safeguards that you have in place, though, are there for a reason, right?

Ian France10 words

They are absolutely there for a reason, for our service.

IF
Cameron ThomasLiberal DemocratsTewkesbury11 words

Do you think they should be in place across the board?

Ian France28 words

Safety for children is important, so we shouldn’t be discussing otherwise, should we? We should always be discussing that wherever a child goes online, it should be safe.

IF
Louise Bucknole134 words

The content that creators make is slightly different from the commissioned content that we make. Some of it is shorter and quicker. We look at slower-paced content—content that has repetition, a story structure and a purpose. I am not saying content creators do not have that—there is some amazing content on YouTube from certain creators—but I cannot really comment on the whole ecosystem of YouTube because I am focused on what we make for YouTube. I agree with Ian that we want our platforms to be safe. That is why at Paramount we have all our safety measures in place. We really want to make sure that when children are on our platforms, parents and carers feel safe—they know it is curated and that children can watch in an area that is good.

LB
Chair146 words

There is a problem here, isn’t there? You guys are investing a lot of money, effort, care and passion in producing content that is well thought out, educationally sound and appropriate for young audiences, and yet, at the same time, you are sharing it, in some cases, on YouTube, where we know that it has no particular prominence over the AI-generated slop that might be there, which does nothing for education or the mind. We need to rationalise this. You have said that it is because that is where the kids’ eyeballs are—that is where the audience is—and that is where you have to go. How do we tackle this? What is the key to ensuring that that high-quality content that you put so much time, effort and care into making, is what our children see predominantly when they go on to YouTube Kids, for example?

C
Louise Bucknole30 words

It is prominence. It is making sure that the algorithm surfaces that content. We want that content to be found and at the moment it is not. That is why—

LB
Chair32 words

But how? You said prominence. It is very easy to say that word, but how should the Government ensure that YouTube gives prominence to high-quality content like the stuff that you commission?

C
Louise Bucknole88 words

I suppose it could be in the same way as we make our content. We have a broadcast compliance code and we follow it, as Ian said. I give a shout-out to the compliance teams, because they are amazing in what they do every day. They watch everything and make sure that it follows all the safety measures for kids’ content: how it is produced, how it is edited together, the tone, the pace—everything—as well as editorial that we have. That could be adopted for content on YouTube.

LB
Chair8 words

So YouTube should be regarded as a broadcaster.

C
Louise Bucknole31 words

It is a bit different. It is a video-sharing platform. Obviously, it has content on it. YouTube does not fund content; it is not like a broadcaster. It is slightly different.

LB
Chair65 words

There is no legislation out there to govern distribution platforms. There is legislation out there for broadcasters and legislation for social media companies. YouTube denies that is either of those things. It talks about itself being a distribution platform, as you say. On the basis that there is no specific legislation geared towards it and what it does, should it be legislated for as both?

C
Louise Bucknole93 words

That is something for the Government to look at, but it could be something that is reviewed. Going back to what we have both said, content needs to be nourishing, good content for kids, but it also needs to be safe on those platforms. We are very happy to discuss with YouTube the measures that we use. Made for Kids and the COPPA rules are in place at the moment, and we follow all those guidelines. We ensure that our content is compliant and follows all the editorial processes on the platform.

LB
Chair71 words

We know all this, but then you plop it on to YouTube, where it is in the wild west of content and you do not know what is going to be popping up next to it. What I want to know is how YouTube should be regulated to ensure that the eyeballs that are drawn to it, once they have seen your high-quality content, see something else that is equally high-quality?

C
Ian France94 words

When we put content on YouTube, we are aware of personalisation, and YouTube Kids has certain safeguards in place. That is obviously something that YouTube has done. I am not sitting on the fence, but I want to bring a balanced view here. They do have things in place that do protect children, and we have heard that in other evidence sessions. We do not put content up against slop. We do think about what we put up there and we are assuming that it goes on YouTube Kids, which is for children only.

IF
Natasha IronsLabour PartyCroydon East92 words

Just so you are aware, your content is on the non-logged-in bit as well. If you go on to just the website, which is next to the slop and next to whatever else it may be, it is also there. The point that we are trying to make, I feel, is that broadcasters are jumping through hoops and doing the right thing when it comes to kids’ content, but in being asked to put it on to a platform that is not doing the same thing, you are fighting an unfair battle.

Ian France7 words

Of course. In answer to your question—

IF
Natasha IronsLabour PartyCroydon East5 words

How do we help, basically?

Ian France112 words

All I know is that there can be learnings from our processes, because we take it to heart. We know exactly what we do with our output and we know that we care for it, we comply it, we check it, and it is safe. We have PINs on our VOD services to make sure that children cannot go away from Sky Kids and see other things on television, let alone on a VOD service. We require all these things because we want to make sure that children are safe. We agree that there should be something in place to make sure that we are all on a level playing field.

IF
Cameron ThomasLiberal DemocratsTewkesbury46 words

Ian, you mentioned your concerns. Is one of those concerns perhaps that you put your content up in good faith and it is safe content, but children who go to YouTube for your content might then be exposed to this slop and to more damaging content?

Ian France54 words

I can only comment today on my job at Sky Kids and what we do at Sky Kids to make it safe for children. There is a concern, obviously, that that might happen, but at the moment my evidence is based on my role at Sky Kids and our VOD service and linear television.

IF
Cameron ThomasLiberal DemocratsTewkesbury23 words

What is the best way to ensure that that doesn’t happen and children are not exposed to more damaging material on whatever platform?

Ian France21 words

I don’t understand the inner workings of the YouTube platform, I am afraid, so it is probably a question for them.

IF
Cameron ThomasLiberal DemocratsTewkesbury5 words

Any thoughts on that, Louise?

Louise Bucknole138 words

Adopting some of the safety measures that we have—age ratings, for example—could be reviewed. In a previous session you were talking about kitemarks showing whether content is of a certain quality. How you do that would have to be looked into. I do not have the answer to that, but certainly it is about how we, as broadcasters, adopt those safety measures, not only for the content from an editorial point of view, but in how it is presented on the platform and whether it is within the safe, curated, trusted spaces for children, so that families can understand. I am afraid that I too cannot comment too much on the algorithm and how YouTube works, because we focus on our broadcast platforms, but we could get our policy team to review that and send some details to you.

LB
Cameron ThomasLiberal DemocratsTewkesbury27 words

That would be helpful. Thank you for the content that you do create, by the way. My daughter also watches Milkshake! and I grew up with Nickelodeon.

Louise Bucknole2 words

Thank you.

LB
Mr Alaba10 words

Ian, what motivated Sky to produce its “FYI” news service?

MA
Ian France405 words

It is absolutely amazing, and I feel very proud to have worked on our Sky news service for seven years now. What motivates us is that we give children a voice in our news output. We work closely with Fresh Start Media, an amazing production company. They are fantastic journalists and they work with children so well. We feel that it is incredibly important for children to have a voice. We have funded a daily bulletin that goes on the Children United website, a new global website for safe news for kids set up by Nicky Cox. On our daily bulletin, you get two minutes of the day’s news presented by children. We have also carried on commissioning the “Investigate” series, which was the BAFTA winner last year for non-scripted content. Last week we discovered that we have also been nominated for a BAFTA for it this year. We cannot quite believe it. We are so happy that it has been recognised in these different award ceremonies—we can’t believe it. We have some great topics, too. We focus on big topics that kids want to talk about. The episode that has been nominated is called “World. War. Me.” We focused on lots of children in different war zones talking about the wars they are growing up in, because we know that children should not be in war. We are working on a new film looking at girls in sport and why girls drop out of sport. We are in the edit with that at the moment, and it will be released soon. We are so proud of the reach of those things. At the last count I think we have had 5 million views of our Children United bulletin and the films, which is fantastic. Sky News collaborates with them on their platform, and shows the linear version of the films that we make as well, which is fantastic. It is really good to have that support from Sky News; even when it is children’s news, we still need the big journalists to help us and make sure that it is on the right platform. I must point out education as well. That is a really big thing for the news. 
We like getting our news service into schools. We have had 14,000 sign-ups by teachers. Whenever they view, you know that is a whole classroom getting the news in a safe and trusted way.

IF
Mr Alaba20 words

Thank you for explaining that. How does Sky use platforms such as TikTok and Instagram to access its younger audience?

MA
Ian France70 words

Children United uses Instagram for the news from its platform. Sky Kids does not use Instagram or TikTok at all. There is a wider, grown-up Sky social media platform, on all the different platforms, and we discuss with them if we have a really big launch. For example, when “Miffy” launched recently, we told parents via the grown-up socials that it had started, but we are not on social media.

IF
Mr Alaba18 words

What is your view of Ofcom’s suggestion that the Government might explore prominence for news on social media?

MA
Ian France2 words

For children?

IF
Mr Alaba1 words

Yes.

MA
Ian France109 words

Again, as long as it is safe; this is the main issue we face. I think it needs to be, because that is where they are currently. We talk about age restrictions on social media, but we know that younger children are on it. They should not be, possibly, but our priority at Sky is making sure that our news films go out on linear and, via Sky News, trusted sources. We know that teachers use it, too, when they show the content. More than anything else, we just want to make sure that it is safe. We are very proud of the news output. It is very special.

IF
Mr Alaba25 words

Do you have any targets for “FYI” that you could share with us? And in terms of achieving those targets, what would success look like?

MA
Ian France8 words

Can I ask what you mean by targets?

IF
Mr Alaba27 words

In terms of misinformation, disinformation, single-source, diverse perspectives, say. What does that look like? How do you see young people getting on that journey of discovering news?

MA
Ian France160 words

You have just described everything we discuss on a daily basis. Louise mentioned diversity targets. That is something we discuss with our news output as well—and making sure that we have a rich mix of news. The funny news—news of the funny things that have happened in the day—does really well. It is not all serious news and serious targets; sometimes you get funny news, and children love that too. I do, too. From a target perspective, Sky looks at our news output as one thing—the content—but also funds children to learn skills in news making. We have the Sky Up Academy, a great set of studios on various Sky sites, where children visit and make their own films. Through that, we have found children to appear and be reporters on our news service. We have various targets. We talk about things every day, but it is quality news and a rich variety of news that makes those targets work.

IF
Mr Alaba18 words

You have proposed a new tax incentive for news and current affairs. What is your thinking behind that?

MA
Ian France29 words

Do you mean the children’s tax incentive?

Mr Alaba indicated assent.

Well, of course the tax incentive is used on our news and current affairs, and it is good.

IF
Mr Alaba5 words

What would be the criteria?

MA
Ian France19 words

Well, I think the fact that it is reaching children and making sure that they are seeing safe news.

IF
Chair53 words

Can I help you? In your evidence, you suggested developing “a targeted news tax relief to ensure that children continue to benefit from access to high quality news and current affairs programming”. That was your evidence to us, so Bayo is only asking you to expand on it. I hope that helps you.

C
Ian France113 words

It has helped me—thank you. Basically, what we are saying is that you have the animation tax relief, which works well in the children’s industry, and a tax incentive in the news genre might help to bolster news output and make sure that news is safe for children. I think it would make sense, and it would make more people think about the safety of news if it were more economical to make it. Even though we are funding and commissioning news ourselves, a tax incentive can help create more safe news and allow more people to create that content.

IF
Mr Alaba26 words

Given Sky’s retreat from original children’s commissions, what ringfenced protections will there be for “FYI”, to ensure that it is not next on the chopping block?

MA
Ian France29 words

We do not have a chopping block. News is quite important to us, especially with our partnership with Sky News. We are making more films. It is in commission.

IF
Chair5 words

Sky has no chopping block.

C
Mr Alaba6 words

That was crude, wasn’t it? Sorry.

MA
Ian France3 words

No, that’s fine.

IF
Damian HindsConservative and Unionist PartyEast Hampshire171 words

Ian, you have referred a number of times to the changing viewing habits of children, without being explicit about what those changing habits are. In the last session we had YouTube, and today we are going to have TikTok and Instagram. “FYI” was an award-winning 15-minute television slot that used to play multiple times at the weekend; in a sense, it complemented the BBC’s “Newsround”. “Newsround” is a school-day thing that gets shown in schools; it seems that most primary schools in the country are now showing it. Yours was on at the weekend, and then you had “FYI Investigates”, another award-winning segment. If you had the two of them back to back, you had a 30-minute segment on the television. As I understand it, the strategy now is that you default to 90-second episodes. You still have the 15 minutes, but the emphasis is all on the 90-second episodes, “FYI 90”. Can you talk us through what the boardroom conversation was around that and how you personally feel about it?

Ian France125 words

Absolutely. It all started with our presenters telling us that we were doing a long show and that children would not watch it. It was our presenters, who are children, who started saying, “You need to make it shorter, because kids won’t watch a longer magazine news show.” We were at a turning point anyway with how we could make it more snackable—snackable news for children—and Nicky Cox was setting up Children United, which we had been discussing with her. We thought that if we made a bulletin, it would be more snackable, takeaway content that you can talk about with your mates, because it is quicker. We also know that they are used to viewing shorter content now. We do make the longer films—

IF
Damian HindsConservative and Unionist PartyEast Hampshire64 words

What do you think about that? Come on, tell us what you think. Is this a good thing, that news now comes in a 90-second package? By the way, you said a moment ago that you don’t think you have content on TikTok and Instagram. Are you sure? Because that 90-second news snack sounds a lot like it is made for TikTok and Instagram.

Ian France3 words

Sky News does.

IF
Damian HindsConservative and Unionist PartyEast Hampshire5 words

Yes, but including “FYI 90”?

Ian France9 words

It is not on TikTok. It is on YouTube.

IF
Damian HindsConservative and Unionist PartyEast Hampshire9 words

You say it is not on Instagram or TikTok?

Ian France18 words

It is on Instagram, as I said earlier, and it is on YouTube. It is not on TikTok.

IF
Damian HindsConservative and Unionist PartyEast Hampshire3 words

You are sure?

Ian France2 words

Absolutely sure.

IF
Damian HindsConservative and Unionist PartyEast Hampshire15 words

Okay. How do you feel about this change in the way that the news is—

Ian France80 words

I think that the 90 seconds is good. If children are saying that having it in shorter bursts encourages conversation about news, and I know that everything in that bulletin is safe and I know that they want to talk about snackable news, that is a good thing. I will point out that we do make longer documentaries. They have not gone anywhere. They are still on Sky News, and they are being watched and loved by teachers and audiences.

IF
Damian HindsConservative and Unionist PartyEast Hampshire74 words

I realise that you are a TV guy and not a doctor, but in your professional judgment, you spend a lot of time thinking about children and what children watch and consume. You have both used the word “enriched” and talked about how you enrich children’s lives. Do you think that it is a good development for our society to move from a 15-minute “FYI” or an 8-minute “Newsround” to a 90-second bitesize news?

Ian France27 words

“FYI” was on once a week, and it was 15 minutes. They were not watching it; it is as simple as that. Now it is every day, and—

IF
Damian HindsConservative and Unionist PartyEast Hampshire17 words

I appreciate that. I know why. I am trying to gauge, and not just about your programme—

Ian France12 words

I understand that. I am saying I agree with the 90-second bulletin.

IF
Damian HindsConservative and Unionist PartyEast Hampshire95 words

This is about the change overall. Maybe it is a misty-eyed look back at a world that is long gone and we cannot do anything to bring it back, but I think it is not a bad thing to ask whether we think the change, even if it is irreversible, is a good thing or a bad thing—and not just in news but in general. There is a move to shorter content on all sorts of things; bingeing of that content, quite often, rather than a varied diet; and a move from channels to self-selection.

Ian France81 words

I think what we are showing here is that there are certain genres that work well with shorter content. News is one of them. When it comes to animations, they usually run at seven or 11 minutes—that works well too—and, when you have a live-action drama, 22 minutes. Those are the misty-eyed durations, and they are still with us, so we don’t have to worry. When it comes to news, though, 90 seconds daily is working well.

IF
Damian HindsConservative and Unionist PartyEast Hampshire11 words

Louise, you run channels. Give us a good defence of channels.

Louise Bucknole50 words

It depends how you make that content. For shorter content, we have songs—nursery rhymes—on every day in the studio with our presenters. They are about 30 seconds, although they can be a minute. It is what Ian says: it depends on the type of content. We do crafts as well.

LB
Damian HindsConservative and Unionist PartyEast Hampshire32 words

But you have a product that presents a curated set of content, with some variety and balance. Do you think there is a benefit to that compared with a totally self-selected world?

Louise Bucknole8 words

Say the question again, sorry—I didn’t quite understand.

LB
Damian HindsConservative and Unionist PartyEast Hampshire52 words

Are channels worth having? Are they worth defending? In a world where you can do video on demand, even if it is a parent with a child under seven and the parent can set the controls in advance, is it worth defending the concept of a channel—and how are you doing that?

Ian France1 words

Yes.

IF
Louise Bucknole167 words

Yes, I think it is. For 25 years, we have had Milkshake!, and it has been the morning breakfast solution for families. It is that routine. Routine is so important to parents, and particularly for preschool children under five or six years old. Routine is absolutely crucial. When talking about durations, we have fives, 11s and sevens, but we also do short form, which can be two and a half minutes. It depends what it is about. We have done shorts on transport and science. We work with educational consultants. Your session with the academics was fascinating. I sent that transcript around to my whole team and said, “Read this; it affirms everything that we do.” We think about the duration, how it is created and where the repetition is. It really depends on the duration, but are channels worth defending? Yes, because it is that regular viewing that you can have within your day for children, and for us with Milkshake! it is in the morning.

LB
Ian France25 words

I will just add that teachers were saying to us that they wanted shorter news for their classroom sessions. They were saying that to us.

IF
Louise Bucknole108 words

That is a good point. Our content is used in schools and preschools. We do cultural shorts. It is Vaisakhi today, and we have a short about that. I sent it to my daughter’s school and they said, “Great, we can watch this.” We had some amazing shorts that we did for Black History Month where we celebrated carnival. I sent that to some of my NCT groups and some of my mums groups and said, “This is great content that you can watch.” Sometimes that short content can be as good and educational as the longer-form content that is created around a story where you have characters.

LB
Cameron ThomasLiberal DemocratsTewkesbury41 words

Ian, you were at pains to state that your material was on Instagram but was not on TikTok. TikTok is one of the largest social media platforms in the world. What decision was made not to publish to TikTok, and why?

Ian France18 words

I think we had seen that there was more engagement when we were using Instagram and YouTube.

IF
Chair4 words

Did you try TikTok?

C
Ian France24 words

No, we didn’t try TikTok. We were just really happy with the engagement that we had got with the ones that we had chosen.

IF
Chair14 words

So there was not more engagement, because you did not try the other one.

C
Ian France34 words

Correct, yes—that is right. We were trying one thing, then we tried another. We were really happy with the engagement and we felt that it was the right thing to do at that time.

IF
Chair29 words

In one of your earlier answers you said that you had to go where the eyeballs are, and the eyeballs are on TikTok, so why are you not there?

C
Ian France11 words

Not for the age group that we are targeting, I hope.

IF
Chair7 words

Is it that? Is it that they—

C
Cameron ThomasLiberal DemocratsTewkesbury13 words

What if we were to tell you that that is not the case?

Ian France24 words

I do know that. That is why I said “I hope” with tongue in cheek, but we will not put our news service there.

IF
Chair40 words

That is helpful. You will be delighted to know that you are now on our chopping block, because we have come to the end of our session. Before we let you leave, is there anything that you want to add?

Louise Bucknole

First, thank you to the Committee for looking into this. Children’s content is so important for 20% of the population. If we want that content to continue to be fantastic, nourishing and educational, as well as entertaining, as we have talked about, looking at tax relief is so important for funding. Re-establishing a fund is something that we would urge the Government to look at. Kids want to see themselves reflected on screen; they want those stories. We all remember the shows that we watched as children. “Paddington” was one of mine, and I was delighted when I was able to work on “Paddington” at Nick Jr. You really remember that, and kids get inspired by that content. Thank you for all the work you are doing to look at this.

Ian France

We are in these roles now, but Louise and I are celebrating 27 years this year of making children’s TV. We started off at Children’s ITV back in 1997, and I was a “Blue Peter” runner at one point. We have been making children’s content our whole lives. This is an incredibly important session, and to be here with you all today blows my mind. To have been sat here where peers like Greg, Maddie and Jackie have been sat is a real honour, so thank you very much for inviting us.

Chair

Thank you very much for joining us today.

Witnesses: Giles Derrington and Rebecca Stimson.

Welcome to our second panel this morning. We are now going to hear from those responsible for TikTok and Instagram. We are joined by Giles Derrington, Senior Government Relations and Public Policy Manager for TikTok, and Rebecca Stimson, UK Director of Public Policy for Meta. Thank you both so much for joining us today. I do not know if you had the opportunity to watch our session with YouTube, but we had a discussion about how we define them. They were adamant that they were not a broadcaster or a social media company, but a distribution platform. How would you define yourselves?

Giles Derrington

Thank you to the Committee for having us today and to the Clerks for being flexible and patient with us as we organised the session. TikTok ultimately is a global entertainment platform with over 1 billion users. What that means is that our content is primarily driven by user-to-user content. That is users generating great ideas, creating things and putting them out into the world, and then other users looking at them. What that means in regulatory terms is that we are a user-to-user service, which is described under the Online Safety Act and regulated by Ofcom in that way. It is worth saying that TikTok is about an eight-year-old company. Prior to the Online Safety Act, we were regulated directly as a video sharing platform under the former EU regulations. We have consistently seen ourselves in that space. We would not describe ourselves as social media, for the simple reason that social media is generally driven by what is called a social graph. A social graph is an algorithm that looks at all your friends, what they are talking about and watching, and then shows you that. TikTok is defined by a content graph. What that means is we are looking at what you are watching and then finding other things that you might be interested in. It is a technical difference in the way in which the algorithm works, but ultimately they are both user-to-user services.

Rebecca Stimson

Instagram and Facebook are social media companies defined, as Giles has said, as user-to-user services under the OSA for the purposes of that.

Chair

You both define yourselves as user-to-user services, regulated by Ofcom under the Online Safety Act. Do you feel that you neatly fit into those categories when it comes to the regulation?

Giles Derrington

Yes, broadly speaking. We do not describe ourselves as social media, in the common parlance, because of that content algorithm, but absolutely, for practical purposes. It is worth saying that the debate on how the Online Safety Act works predates TikTok existing. TikTok launched after the first Green Paper on the OSA, so we have been part of that conversation right the way through our existence and it feels like that is an appropriate way to regulate millions of pieces of content.

Rebecca Stimson

We are a social media company, just to draw that distinction. Yes, I think it works well under the OSA. Obviously, it is still being implemented and not all the duties are in force yet. You can already see that things are evolving, and things like AI and chatbots are starting to put a bit of strain on some of those definitions, but broadly speaking it is a definition that makes sense for us.

Chair

That is very helpful. We just wanted to make sure that we know who you are and what you do before we start asking our questions.

Cameron Thomas (Liberal Democrat, Tewkesbury)

Would you mind if I opened with a couple of very well-known psychological experiments? In 1967, the blue eyes/brown eyes experiment was run by third-grade teacher Jane Elliott. She pitted friends against each other by explaining that those with blue eyes were superior to those with brown eyes. She gave the blue-eyed students special treatment, and highlighted the mistakes made by those with brown eyes. She then stepped back. Within a very short time, those pupils were visiting physical harm upon their own friends. The Stanford prison experiment split a friendship group of older students into prisoners and guards, and with these older students the experiment yielded similar results. The Milgram experiment demonstrated that even compassionate, conscientious people can be persuaded, remarkably easily, to inflict serious harm, potentially even death, upon others when an environment enables them to do so. I remember the first time that I saw my daughter, and my mind changed immediately. My primary focus in life was to protect that person. I wonder if either of you have children.

Rebecca Stimson

I don’t have children, no.

Giles Derrington

I do. They are five and two.

Cameron Thomas (Liberal Democrat, Tewkesbury)

My daughter is five too. If you believed that another person was influencing your child to harm themselves, would you want to know about it?

Rebecca Stimson

Yes.

Giles Derrington

Absolutely.

Cameron Thomas (Liberal Democrat, Tewkesbury)

Do you know who Jools Sweeney was?

Giles Derrington indicated assent.

Jools lived just a couple of miles away from me, until he took his own life. If you found your child the way that Jools’s mother found him, would you want to know why?

Giles Derrington

Absolutely, yes.

Cameron Thomas (Liberal Democrat, Tewkesbury)

Are you aware that TikTok has declined to release Jools’s social media history to his parents and that his mother is currently taking TikTok to court in the United States?

Giles Derrington

I will start by saying that I completely empathise with the point you are making about the horror of the situation that occurred. We have spoken to Jools’s mother on a number of occasions about this issue. There is an ongoing court case, so there is a limit to what I can say, as you would expect, because it is important that those things play out. It is worth saying, though, that when it comes to the question of data, we have talked to her about that, and about the sequence of events. We have a legal duty under GDPR to delete certain data after a certain amount of time. As you would expect, we have to follow that, and that does impact on some of those issues. There is legislation going through at the moment directly related to this, on which we actively engage with the Department for Science, Innovation and Technology, the Home Office, and others to see if we can improve that situation in the long run. But there is not anyone who would be opposed to finding a good solution to this that means that data is preserved in such circumstances.

Cameron Thomas (Liberal Democrat, Tewkesbury)

Are you saying that that data has been deleted?

Giles Derrington

I do not want to go too much into the court case and exactly what is going to happen, as I am sure you appreciate. We have had that conversation directly and will continue to go through the process of the court case. I am saying that there are mechanisms by which data has to be deleted. In terms of when police investigations start and they come to us to ask for data, if that happens after a period of time, then data could be deleted.

Cameron Thomas (Liberal Democrat, Tewkesbury)

You mentioned GDPR. Where regulations prevent you from sharing that data, do you think that it is incumbent on this House to pass legislation to regulate such bodies as TikTok so that they do share that information with parents?

Giles Derrington

I think it is about finding the correct way of approaching these two competing legal issues—the importance, on which I think we would all agree, of data protection and GDPR duties to delete data that is being used by companies, and making sure that it is preserved in cases of harm where police, families and others may need access. One of the debates that is happening within Parliament and with Government is about exactly what the triggers for that are, so that companies like ours know in what circumstances things may need to be accessed in future.

Cameron Thomas (Liberal Democrat, Tewkesbury)

Are you familiar with the red versus blue school wars?

Giles Derrington

Yes.

Cameron Thomas (Liberal Democrat, Tewkesbury)

What does that mean to you?

Giles Derrington

The BBC has described this as a phantom trend, so something that was not actually happening but was talked about online. This was something that we were alerted to by a number of police forces—we work directly with police forces and have direct lines of contact with a number of police forces—who raised concerns about a very small number of posts on our platform that suggested an attempt to create violence between two schools. It is worth saying that immediately upon being told about this by the police, we acted to remove that small number of accounts. It is also worth saying—and I think this is where the phantom trend issue comes to the fore—that after there was reporting of a small number of accounts, we saw significant increases of people searching for that content, creating similar content and, quite rightly, posting things warning others about it. That actually made the issue much bigger.

Cameron Thomas (Liberal Democrat, Tewkesbury)

Did it start on TikTok?

Giles Derrington

We don’t believe so. We found no evidence at all of any co-ordination of genuine attempts at violence on the platform.

Cameron Thomas (Liberal Democrat, Tewkesbury)

What action did TikTok take?

Giles Derrington

We removed all the accounts that we had identified—again, we are talking a very small number—when the issue first came to us. As the issue grew because of the reporting and because people were talking about it, we scaled up direct teams. We have direct teams that can work on those kinds of escalations. We also took action to, for example, ban a range of search terms that might lead to that kind of content, to reduce the ability for people to see it, and where people were searching for that content, to put at the top of those what we call an interstitial giving information on violence misinformation, our community guidelines and so on, to signpost people away from that kind of behaviour and towards more constructive things.

Cameron Thomas (Liberal Democrat, Tewkesbury)

Have you done any digging into the economic cost of the school closures that occurred because of this?

Giles Derrington

I have not seen any. To be clear, as far as we are aware, this was, as the BBC described it, a phantom trend. There were never actually any attempts at violence. That is our understanding of it from direct conversations with police. We took very fast action to remove any and all content around that. One of the things that there is always a challenge with when it comes to these kinds of things is that we saw genuine parents and others posting warnings to people saying, “Don’t be so stupid as to get involved in this,” but sometimes you also see stuff that is disingenuously purporting to warn people to try to perpetuate it. We had to take policy choices about where to draw the line and, in that instance, we drew a very high line in removing content, but we saw it grow over the period of time.

Natasha Irons (Labour, Croydon East)

I am an MP in Croydon, and the red versus blue trend had a significant impact on some local schools. One school had a lot of children in one year who did not turn up because they were so afraid of what was going to happen. I appreciate that this is about content on your platform. You are very clear that you do not want anyone under the age of 13; you are very clear about what kind of audiences you want. In the evidence that you submitted, you cite a study by Oxford Economics that found that “60% of British TikTok users had visited somewhere, purchased a product or…engaged in their community” based on the stuff they had seen on TikTok. That is very positive, but what analysis have you done of the negative impact of these sorts of things? Yes, on the police having to deal with this sort of thing, but we also saw young people running around Clapham because they had managed to convene each other via TikTok and other social media platforms. What responsibility do you take for the editorial content on your platforms? We have everything from very high-end news outlets to people trying to create trouble on our streets. What is your responsibility, as TikTok, to take ownership and editorialise what people see on your platform?

Giles Derrington

There are a couple of things there, so forgive me if I break them down briefly. First, it is absolutely true, and we have seen numerous examples—we can talk about some of them—that people take positive actions in their community or in other ways because of stuff they have seen on the platform. You are fundamentally right that where bad content may appear there is also that risk. That is why we have very clear community guidelines to actively remove that content and signpost people to good behaviour, support where necessary—a whole range of things. Our primary responsibility is to remove that bad content. The second element where we have a responsibility is to make sure that that bad content gap—the vacuum of that stuff being removed—is filled by good content. We work with a whole range of creators, publishers and others to make sure there is high-quality news, information and support. For example, we did a campaign last year with Women’s Aid with popular male creators calling out misogyny or violence against women, to fill that void with that kind of stuff. I want to talk about one specific issue when it comes to the Clapham issue. To be really clear, we spoke directly to the police forces about that, and there was absolutely no evidence that that was on our platform at all in any circumstances. Indeed, the police corrected their statement saying it was on us because they did not find any content. So far, we have not seen any content other than bystander footage from people who were there, including news organisations filming it. We have not seen anything that suggested it, co-ordinated it or anything else. The one final thing I will say is that we have entire teams within our trust and safety team who are looking for risks off platform—looking at encrypted messenger groups, the dark web and so on—to try to find things before they happen. 
One of the things we do consistently see, and we speak quite a lot to Government and others about, is a lot of the co-ordination—we certainly saw this during the school wars issue—is usually happening on those encrypted services away from our platform, with people encouraging others to post stuff on platforms like TikTok. Yes, we can knock things down, but we do think there is a better way of working within Government, within police and with other platforms to make sure that there is greater visibility and understanding of where those risks may arise so that we can move quickly to stand up things before they appear rather than having to deal with them slightly afterwards, although very quickly.

Natasha Irons (Labour, Croydon East)

I guess the question is—and it is for Meta as well; thank you for being here, and I am not trying to ignore you—who is responsible for content on your platforms? I appreciate that they are user-to-user generated, but even if it is participation videos showing that and helping to stoke and encourage others to join in, or to seed fear or to seed people feeling unsafe, what responsibility do you have, as platform providers, for that content? As we have said, from your own analysis, what you do has a real-world impact, both good and bad. You can take the good, but you also have to acknowledge the bad.

Giles Derrington

As I say, I think our primary responsibility is to remove things and be looking for things to remove. That is why we set very clear community guidelines and why we spent $2 billion last year on trust and safety teams to continuously identify new trends, bad actors, and the evolving nature of different harms to take them down. That is our primary responsibility, and under the Online Safety Act we work very closely with Ofcom on how we test those responsibilities, how we are measuring up and what our practices and processes are. On the Clapham issue, I want to be really clear again that anything that is stoking, perpetuating or endorsing violence is against our community guidelines and comes down. There is a challenging balance, and it is an interesting part of the debate: if we have news footage on the platform of something happening, we think it is probably right that that should be available and put on the platform. If that is bystander footage filmed by someone else, is it fundamentally different if that bystander footage is under a BBC or Sky News label versus the label of an individual creator who has posted it? That genuinely is a challenge for us. We take a high bar. As I said, on things like the school wars, we actively set a very high bar for what we would and would not accept. But when it is literally people filming something that is happening in the street, it becomes challenging.

Chair

Do you think you are desperately unlucky at TikTok? It seems that whenever there is a large-scale issue with something happening, it is put at your door—Clapham is a separate issue, but when it comes to schools, we have had the red versus blue and, in America, the skull-breaker craze. When my youngest son was at school a couple of years ago, there was a point when children were strangling each other to the point of passing out. This was all put at the door of TikTok. Why is everyone putting it at the door of TikTok? Are you desperately unlucky? What is it that you are doing that makes people think you are responsible for these viral crazes?

Giles Derrington

I think there are three things worth saying there. First, we are a very large platform with a lot of content, and while we take a very proactive approach to taking things down and set very strict community guidelines—in the UK, 99.9% of everything we removed in the last quarter was done proactively, i.e. we found it, not police, media and so on—there are always going to be bad actors who evolve and attack all platforms. That is point one: there is a genuine thing about the nature of evolving harms, and that goes to my point about how we engage with some of those services where co-ordination of that happens. Secondly, very candidly—Praesidio Safeguarding, which is run by former academics at Harvard, looked in detail a few years ago at the dangerous challenge trend—an awful lot of this is very small amounts of content that is then perpetuated by people getting concerned about them, and that grows the salience of the issue. That is not the same as there being lots of content on the platform. To be clear, some of the challenges you mentioned have never trended on our platform and have never been at a level of significant reach and access.

Chair

Why am I having this conversation with you and not with Rebecca?

Giles Derrington

Candidly, we are the newest platform. We know for a fact that people like writing about TikTok—people are interested in TikTok, including the media—and that does drive some of the way that coverage sits. To go back to the Clapham example, the communication from the police originally said Snapchat and TikTok. They sort of just assumed that it was TikTok, but when we asked them about it, they said, “Well, we haven’t found anything. We’ve just found it on Snapchat.” There is a challenge there for us to educate and work with people to make sure that they understand what is and is not on the platform, and what we are doing to resolve it and to work together.

Chair

How often do you find yourself talking to the police about incidents that are attributed to TikTok?

Giles Derrington

We have a dedicated law enforcement team who spend a lot of time talking to them, not just about things that may happen but also to plan for things that are not there yet, look at trends, look at things on other platforms and try to prevent them coming on to the platform.

Chair

How often do you guys find yourselves in conversations with the police about these viral things?

Giles Derrington

I would be wary of giving a specific number because these things ebb and flow. We speak to them about a whole range of things, and some of those are very small, like individual direct harm incidents—actually, those are sometimes the things that are most important. There is a range of ways that we would speak to them.

Cameron Thomas (Liberal Democrat, Tewkesbury)

Rebecca, Instagram is huge. It also has a young audience. Why do you think we are having this conversation with Giles and not with you?

Rebecca Stimson

There might be a couple of reasons, although I would not want to give the impression that I am trying to tell you that nothing harmful ever happens on Instagram, because that would not be a genuine statement to make. Instagram is a personalised service that we provide, so you get fewer trends happening, if that makes sense. People come to Instagram to follow their friends, family and things they care about—their favourite football team, favourite creators or whatever it might be—and what they see on our platform is very much driven by that. Your Instagram feed is going to be very different from mine. It is driven less by what is trending and viral at the time and more by what you want to see.

Cameron Thomas (Liberal Democrat, Tewkesbury)

So TikTok is a platform—as far as you observe—curated specifically to create that virality.

Rebecca Stimson

I cannot speak to TikTok—Giles would have to give you a better idea of how that works—but that might be why there are definitely differences in what we see sometimes between the two platforms.

Cameron Thomas (Liberal Democrat, Tewkesbury)

Giles, do you have any thoughts on that?

Giles Derrington

You are right that TikTok does drive trends. For clarity, the vast majority of the content that we see trending on the platform is overwhelmingly positive, and we are talking about very small instances. If I think of something like the red versus blue issue, when we were first alerted to it by police, there were fewer than 50 videos on the platform in any way related to it. After reporting of it, that increased by about 19,000% in terms of the searches for it. People were actively looking for it.

Cameron Thomas (Liberal Democrat, Tewkesbury)

But it wouldn’t have happened the same way if it had been on Instagram.

Giles Derrington

I will let Instagram answer for exactly how things would appear on their platform, but we did see, and we do regularly see, these issues across all platforms. I don’t think it is a TikTok-specific issue.

Vicky Foxcroft (Labour, Lewisham North)

Giles, it is great to hear that it is your primary responsibility to ensure that you remove things and that you have teams looking for risks. On red versus blue, when did you first identify the problem and when did you ensure that it was properly excluded from the platform?

Giles Derrington

I don’t have the timeline directly in front of me. I am happy to come back. I think I am right in saying that we initially identified it around the same time the police first came to us, and we then moved to ban the initial accounts. As you would expect, we set up an escalation team to deal with the issue. As media coverage increased the visibility of the issue, we scaled our interventions, and as people started to search for it, we took action to remove those search terms.

Vicky Foxcroft (Labour, Lewisham North)

As the MP for Lewisham North, this was a very big issue in my constituency. It was raised with me on the Thursday. Our schools were writing to parents, and we had police at all the schools monitoring it. I do not believe that all content was taken down until the following Wednesday or Thursday. It would be really useful to have the actual timelines for when you went and barred the red versus blue war stuff on your platform, because this was terrifying for young people and families in my constituency.

Giles Derrington

I am happy to come back with a timeline. We acted the moment we heard from the police and had engagement about the type of content—

Vicky Foxcroft (Labour, Lewisham North)

You might have acted on the initial posts, but you did not act on all the additional posts. I think you have already alluded to that in your evidence.

Giles Derrington

We act immediately on anything we find. Again, I am happy to write to you with a detailed timeline of exactly what we took and when.

Natasha Irons (Labour, Croydon East)

Definitely, the commentary on it was left on your platform for quite some time. The initial posts were taken down but the thing that you said creates the hype—people talking about the thing—that stuff was left up for quite some time. I want to pick up on one thing very quickly. You talked about the accounts that posted this. Is the issue here account verification and making sure there are not bad actors basically hijacking your platform to create social unrest for no good reason other than creating mischief?

Giles Derrington

No, I don’t think that is quite right. We take a range of measures to try to stop bad actors appearing and getting on to the platform already. There is a whole range of things we do. I am not going to talk about the methods and practices in public, for fairly obvious reasons, but we spend an awful lot of time within our trust and safety team identifying content, because it can come in from anyone, and enforcing rules vigorously against them. I think that is the most appropriate way to deal with these issues.

Damian Hinds (Conservative, East Hampshire)

You were asked why it so often seems to be you in the frame. Right at the start of this session, you were at pains to stress that you have a different algorithm, which is not the classic Facebook, as was, social graph. You can be somebody who is instantly all over the internet in a way that you cannot even on other social media platforms, and God knows they have problems enough of their own. Isn’t that true?

Giles Derrington

The different ways the algorithms may work mean that we have different ways of protecting our users, and different approaches and priorities. That is absolutely correct, and you would expect that to be right. That changes the way we deal with these things. It does not necessarily change the desired outcomes, or indeed what we see. It is worth saying again that on all these issues, the co-ordination of this activity has not happened, and was not happening, on our platform. The endorsement of any form of violence is the thing we take most seriously and remove from the platform incredibly quickly. There is a challenge when it comes to things like people warning others—what we would call counter-speech—about how you deal with that, and whether that perpetuates fear or not. There is a genuine debate there, and I am interested in views on it. The way we approach it is different, rather than what we are seeking as an outcome.

Damian Hinds (Conservative, East Hampshire)

Indeed. I am going to ask Rebecca a similar question in a moment, but in the wake of red versus blue, have you done an analysis to see if anybody made money out of it—from the creator fund, for example?

Giles Derrington

We do look at monetisation as one of those escalation things when we are standing up teams, to make sure that no one can monetise off that. Obviously, we would not want to see it. Our creator fund works slightly differently from some other platforms. To be in the creator fund, you have to be abiding by the community guidelines anyway, so it becomes a very quick decision.

Damian Hinds (Conservative, East Hampshire)

But you are not immune to charity scams, for example, where people are presumably making money through the creator fund. “Unmute this video, play it for 30 seconds”—these are all ways to make money out of your platform.

Giles Derrington

Within the creator fund that would not be the case. We have very narrow and high-bar criteria, for those exact reasons.

Damian Hinds (Conservative, East Hampshire)

You are saying that you do not have charity scams, for example, on your platform.

Giles Derrington

I am saying that when it comes to monetisation through what is called our creator reward programme—those individuals who get money because of the content that they post—the way we set up the requirements means that they have to be posting a certain amount of videos, but they also have specific rules on being users in good standing, so we try to isolate them. We do not have, as some other platforms do, pre-roll advertising before videos, which allows that revenue share.

Damian Hinds (Conservative, East Hampshire)

Really quickly, can you talk us through the criteria or the triggers for getting payouts from the creator fund for a video? Things like unmutes, duration time and so on—what is the summary, very quickly?

Giles Derrington

I do not have the full list in front of me—I could happily get it for you—but effectively, you need a certain number of followers and to be getting a certain number of videos—

Damian Hinds (Conservative, East Hampshire)

That is to sign up. That was not the question. The question was: if you are in the programme, what are the triggers for being paid out on video watches?

Giles Derrington

The way it works is basically, on the number of watches you have, you will get a certain amount of money. The important thing—

Damian Hinds (Conservative, East Hampshire)

What counts as a watch? I realise that these are leading questions, but you could scroll past a video, you could dwell on it for two seconds, you could watch it for five seconds, you could unmute it, you could watch it for 20 seconds, and those do not all have the same effect on how much money you get, do they?

Giles Derrington

We describe a watch—and I would have to double-check what the duration is. I know it is not the full watch, but it is a period of it. To be clear, because of the way that we set up the criteria for that rewards programme, if that content were breaching our community guidelines, or if any of your content was breaching community guidelines, you are instantly no longer eligible for that rewards programme. That is one of the ways we can drive good behaviour within the rewards programme, because the risk of losing your ability to monetise is significant.

GD
Damian HindsConservative and Unionist PartyEast Hampshire28 words

Rebecca, how do you—or do you—analyse after there has been one of these crazes or viral challenges? Do you analyse if money has been made and by whom?

Rebecca Stimson105 words

It can be, yes. Sometimes that is true. There was an example recently of accounts on Instagram that were filming women in the street—some of you might remember that illicit filming—and then posting that and trying to monetise that content. Clearly, when we became aware of that, those accounts were removed and we did look into whether they had been able to use—I do not remember whether they had, but we apply stricter criteria if you are trying to monetise from our platform, and if we find you in breach of any of our rules, your account and your ability to use it is removed.

RS
Cameron ThomasLiberal DemocratsTewkesbury39 words

Giles, you told one of my colleagues that the TikTok algorithm protects users, but surely, driving virality as it does, it is not popular because it protects users; it is popular because it exposes them. Is that not right?

Giles Derrington177 words

One of the ways that the virality works—and it is worth exploring this with our trust and safety team—is that when any content is uploaded to our platform, it is reviewed by our trust and safety teams against our community guidelines, and they will remove things. As posts grow more viral on the platform—as they grow and more people want to look at them—because of that process that happens, they will then go up to two further rounds of moderation, the idea being that things that are most seen on the platform have been through significantly more checks, including human checks, to make sure they are abiding by our community guidelines. That growth, and the way the algorithm slowly grows content to make sure that the highest quality stuff comes to the top, means that there is that additional ability to check and make sure that the kind of content we want to see succeeding on the platform, whether that is from small businesses or individual creators, is the stuff that is getting most traction.

GD
Cameron ThomasLiberal DemocratsTewkesbury4 words

Like the skull-crusher challenge?

Giles Derrington30 words

As I said, these things have not been viral on our platform. That is not to say there has never been content, and I think there is a big difference—

GD
Cameron ThomasLiberal DemocratsTewkesbury10 words

There was another one where children were eating detergent tablets.

Giles Derrington69 words

Again, the question of virality and how popular these videos are is very different from some of the way that they have been talked about. That is the key difference here. We want to act to take things down regardless of whether they breach our community guidelines, but we are adamant that a lot of these challenges that people have talked about have never been viral on our platform.

GD
Cameron ThomasLiberal DemocratsTewkesbury50 words

Can I just round up my initial question and take you both back to 1967? After creating that environment where the children ended up attacking each other, that teacher, Jane Elliott, stepped back in, explained the experiment and regulated the scenario. Do you think she was right to do so?

Giles Derrington12 words

I don’t know enough about the experiment to know, to be honest.

GD
Cameron ThomasLiberal DemocratsTewkesbury27 words

You are not sure that it was right that the teacher, having created that environment, then did her job and protected those children by settling it down?

Giles Derrington15 words

I just don’t know enough about the experiment. I assume so, but I would not—

GD
Rebecca Stimson73 words

I couldn’t speak to that experiment, but the age limit for Instagram is 13, and we have created a different space for everyone from 13 to 18—teen accounts—where they are defaulted into a much more restricted experience. I think what you are driving at is: if you create a space for teens, you should have responsibility to engage and make sure that is as safe as possible, and my answer would be yes.

RS
Cameron ThomasLiberal DemocratsTewkesbury11 words

To regulate that area, yes. Giles, would you agree with that?

Giles Derrington1 words

Absolutely.

GD
Cameron ThomasLiberal DemocratsTewkesbury20 words

If she had not done so, who else ought to have stepped in to ensure that that environment was regulated?

Rebecca Stimson22 words

That is what Ofcom is trying to do now with user-to-user services in the UK that are providing their services to over-13s.

RS
Giles Derrington94 words

As Rebecca said, Ofcom as the regulator is the key element here, in the modern context. We speak a lot to parents and others about these kinds of issues, and things that people see on our platform, and one of the things we try to design our safety tools to do is to encourage that conversation between parents and teens, because we know that a lot of parents want to understand what they can be saying to give their teens the best information about being safe online. It is ultimately a whole society approach.

GD
Liz JarvisLiberal DemocratsEastleigh24 words

I want to focus on parental controls. What controls do your platforms have so that parents can control how their teenagers use the platforms?

Rebecca Stimson213 words

As I just mentioned, in 2024 we began rolling out teen accounts. That defaults all users between 13 and 18 into a much more restricted experience. It responds to what we hear from parents globally are their main concerns: who can contact their child, what content they are seeing, and screen time. I am happy to go into the list, but to your specific question about parental controls, it has a built-in link to allow parental supervision so that parents can entirely tailor it to their teen. It shows them the amount of time they are spending on it, and sends them notifications about who is contacting their teen and if their teen is looking for something like suicide or self-harm, for example. They can see who is contacting them. They can also reset their child’s algorithm if they have begun looking at something they do not want them to look at. There is a full suite of controls there for parents. In the big debate that everyone is having about wellbeing online, we are very keen to make it as easy as possible for parents. Rather than those features being off and the parents having to go in, it is all on by default and they have the ability to tailor it.

RS
Giles Derrington256 words

On TikTok there are about 50 different settings for teens, to enable them to change the way that the experience works on the platform. Most of those are built into the family pairing tools as well. We do make some very specific choices about areas that we think are potentially the most harmful; we do not allow that to be the parents’ choice. For example, direct messages, which we see as a potential serious risk of peer-to-peer harm, are turned off for under-16s. They are not allowed access to direct messages at all. We have also done some interventions on screen time such as a one-hour default cap, so that after an hour of scrolling people get a message saying, “You should come off,” and a 10 pm curfew, which will guide people through a mediation to bring them off the platform. One of the other interesting things we have done with family pairing is enable parents to not just set the amount of time that a teen may be using the platform but say there are specific times when they should not be allowed to use it at all. That might be during the school day—turning it off from 9 till 3, for example. We also allow them to set the session time, so they can say, “You can have 40 minutes of screen time a day, but we don’t want you spending more than 10 minutes in an individual period on the app.” That allows people to make very tailored decisions for their teens.

GD
Liz JarvisLiberal DemocratsEastleigh51 words

Giles, unsealed documents from the court case in Massachusetts suggest that TikTok knows adoption of these tools is remarkably low. Can you confirm exactly what percentage of minor accounts on TikTok in the UK are linked to a parent through family pairing, or how many parents are engaging with parental controls?

Giles Derrington79 words

When it comes to court documents, I am not going to be able to go into discussions there. What I would say is that we spend a lot of time making sure that parents are aware of the tools that exist on our platform. For example, for any user over the age of 35, we send a notification that says, “If you have children, there are these family pairing tools available.” We also know from speaking to parents directly—

GD
Chair8 words

Did you say under the age of 35?

C
Giles Derrington12 words

Over the age of 35, so people more likely to be parents.

GD

Is 35 not a bit high?

Damian HindsConservative and Unionist PartyEast Hampshire4 words

Sign of the times.

Giles Derrington58 words

You could go lower, but that would probably be a wider demographic question than one I am prepared to answer. The other thing that we know from speaking to parents directly is that an awful lot of them will not use our family pairing tools because they are using different tools, such as handset-level or app store-level tools.

GD
Damian HindsConservative and Unionist PartyEast Hampshire5 words

Or they don’t understand them.

Giles Derrington118 words

From conversations we have had directly with parents and data we have seen, they do understand them and view them as easy to use, but a lot of them prioritise having the conversation directly with their teens over setting particular screen time caps. I think our responsibility in this is to make sure that our parental controls are the best in class, and we have been at the forefront of innovating on things like session times, being able to see block lists and so on. We think that is the right thing to do—making the best solutions available to people—but ultimately it is going to be for parents to make the choices that are particularly right for them.

GD
Liz JarvisLiberal DemocratsEastleigh12 words

But you can’t give any numbers on how many parents use them.

Giles Derrington11 words

I am afraid I do not have them available to me.

GD
Liz JarvisLiberal DemocratsEastleigh41 words

Why did your own internal documents admit that these features only exist behind a somewhat hidden series of menus? Why did TikTok choose to hide these safety entry points rather than making them prominent to ensure parents could actually find them?

Giles Derrington23 words

When it comes to documents like that—and I do not want to go into details of court cases—we have constant conversations about how—

GD
Liz JarvisLiberal DemocratsEastleigh5 words

This is publicly available information.

Giles Derrington82 words

Conversations about how we can make things more prominent and more used are exactly the kind of conversations you would expect and want us to be having. Things like the notification we send to users over the age of 35, the work we do directly with both parents and teens through things like our youth advisory council to understand how they use the app and where they see these tools being best placed are the direct consequences of some of those conversations.

GD
Chair26 words

What is the take-up of your parental controls? What percentage of children under the age of 16, for example, have parental controls engaged on their account?

C
Giles Derrington7 words

I don’t have that available to me.

GD
Chair3 words

You don’t know?

C
Giles Derrington3 words

I don’t know.

GD
Chair32 words

Giles, you are here to talk to us about children’s TV and video content, and you do not know a statistic like how many kids have the parental control engaged on their—

C
Giles Derrington36 words

I am happy to come back to the Committee. What I am saying is that we know for a fact that an awful lot of the people who do not use them are using different tools.

GD
Chair8 words

How do you know that for a fact?

C
Giles Derrington8 words

Because we have researched that directly with parents.

GD
Natasha IronsLabour PartyCroydon East8 words

Then you should have the figure, shouldn’t you?

Giles Derrington6 words

I don’t have that specific number.

GD
Natasha IronsLabour PartyCroydon East12 words

Can I ask Rebecca the same question? Do you have that figure?

Rebecca Stimson63 words

We defaulted everyone in, and in surveys we have carried out, 97% of parents have stayed in that experience, and a similar proportion—94% or 95%—said it is extremely helpful. Part of our intention was to default on. I cannot tell you the percentage of people, but the survey work we have done with parents whose children were defaulted into that is very positive.

RS
Liz JarvisLiberal DemocratsEastleigh24 words

Giles, can these tools be deactivated by the child themselves, and are they able to bypass restrictions by, for example, using a web browser?

Giles Derrington15 words

No. If their account is linked to their family pairing account, they cannot deactivate it.

GD
Liz JarvisLiberal DemocratsEastleigh24 words

Can I ask both of you how you proactively consult with parents in the UK to hear their concerns about teenagers using your platforms?

Giles Derrington100 words

As I alluded to, we have a number of ways of doing that. We have dedicated teams within our trust and safety organisation that will go out and speak directly to parents and users. We also have things like our youth advisory council, which is a global council made up of teens that also engages with their parents to get direct, more detailed feedback. We also speak to a huge number of NGOs and work directly with them, whether it is Internet Matters or others, to utilise their expertise, experts and understanding of parents, and feed that through as well.

GD
Rebecca Stimson72 words

My answer is very similar. We work with NGOs—Parent Zone, Internet Matters, and others—but we also have a policy forum where we work with groups around the world. We ask our users as well. We ask and engage with parents. Before we developed and rolled out teen accounts, we worked a lot, directly asking users who are parents on our platforms what their concerns were and how we could best address them.

RS
Liz JarvisLiberal DemocratsEastleigh6 words

Are parental controls bad for business?

Rebecca Stimson69 words

No, they are better for business, because we want our platform to be as positive and engaging for people as possible. That is what brings us users and keeps the advertisers who pay for it happy. It is entirely in line with our business incentives to have that be good. The more parents feel happy and the better experience children are having, that is in our interests as well.

RS
Giles Derrington49 words

I would entirely agree with that. Ultimately, TikTok is intended to be an entertainment platform. We are very clear that you cannot be entertained unless you are safe, and that means that feeling safe on the platform is part of enjoying and using it. So yes, I entirely agree.

GD
Liz JarvisLiberal DemocratsEastleigh23 words

Presumably, you want parents to think that you are a responsible platform. Do you think that parents think you are a responsible platform?

Giles Derrington32 words

When we speak to our users and their parents, there is a lot of positivity about these things. We are a young platform and there is absolutely education work to be done.

GD
Rebecca Stimson96 words

There are clearly concerns about young people online and what they might encounter and experience. Sadly, that is going to be a conversation that is never finished, because the threats change and the technology changes. As I said, we have had a very positive response to teen accounts and what we have done there, but I do not think that has mitigated entirely people’s concerns, and nor should it. We all need to be alive to the risks and constantly trying to look for ways to ensure that we are doing our best in that space.

RS
Damian HindsConservative and Unionist PartyEast Hampshire33 words

Rebecca, you spoke earlier about the proportion of teenagers on Instagram whose parents were linked into their accounts. What can you tell us about the average number of Instagram accounts per Instagram user?

Rebecca Stimson41 words

We work hard to ensure that each account is authentic. We do not want people to have fake profiles, because usually that is very directly correlated to harmful content. I do not have a number for whether people have—

RS
Damian HindsConservative and Unionist PartyEast Hampshire8 words

Other people estimate it. You have no estimates?

Rebecca Stimson10 words

I am happy to, but I just do not have—

RS
Damian HindsConservative and Unionist PartyEast Hampshire21 words

I think these are dated terms these days, but you are familiar, I take it, with the terms finsta and rinsta.

Rebecca Stimson5 words

Finsta and—no, maybe I’m not.

RS
Damian HindsConservative and Unionist PartyEast Hampshire30 words

That is a fake Instagram account and a real Instagram account. Those are old words now. Are you honestly telling me those are not terms that you are familiar with?

Rebecca Stimson49 words

It may just be that I am not familiar with those terms. If you are asking for an average of how many accounts a user might have, I am happy to go and try to find that number. I do not happen to know that in front of me.

RS
Damian HindsConservative and Unionist PartyEast Hampshire2 words

Wow. Okay.

Natasha IronsLabour PartyCroydon East10 words

Are your platforms safe for children aged 13 and above?

Giles Derrington44 words

We spend over £2 billion a year trying to make sure that that is absolutely the case. Yes, we want people to have an entertaining experience. That is in our interest and it is in our users’ interests; it is also in advertisers’ interests.

GD
Rebecca Stimson85 words

I don’t think we would ever get to that point. We work hard to make it as safe as possible, but we absolutely recognise that there are evolving risks and existing risks. I do not think it is a piece of work that will ever be done, and that is the way in which we approach it. A claim that it is safe would perhaps give parents and other people using the platform a false sense of confidence. We should all be vigilant about risks.

RS
Natasha IronsLabour PartyCroydon East33 words

So at least for Meta, the platforms are not safe for children aged 13 and above, as opposed to TikTok, which is trying to make it safe for children of 13 and up.

Giles Derrington120 words

That is absolutely our ambition. As Rebecca says, we know when we engage on bad-actor issues that bad actors evolve their mechanisms, and it is incumbent on us to keep up with the changing ways that danger may appear. That is not just a question for us as a platform. As I say, a lot of the co-ordination that happens elsewhere online is an important part of this, and we have been speaking to people about that to try to make sure that this important part of society is safe. In the same way as for the wider society, there are lots of risks and we are all working collectively to make sure that we are as safe as possible.

GD
Natasha IronsLabour PartyCroydon East159 words

This is slightly different from, say, walking across the street when somebody is speeding. In other areas of media—we had Sky and Paramount in earlier today—as a parent making decisions about where your children consume media, you are able to control by default what they see and how they consume that media, whereas yours is more wild west and perhaps we can put in some parental controls. I don’t think it is just an accepted thing that we are trying to make it safe. The question should be, how do we make it safe and just make it safe? Both of your companies have been involved in this lawsuit in America. I think TikTok settled and Meta has been found guilty of creating a platform that is intentionally addictive. What are your initial thoughts on that? I appreciate that Meta might be pushing back on the ruling, but TikTok settled. How will you make your platforms not intentionally addictive?

Rebecca Stimson177 words

We do not agree with that finding and we are in an ongoing legal process, so I am afraid I am very restricted in what I can talk about. I am happy to talk about what we do on the platform to try to keep users between 13 and 18, and everyone, as safe as possible. That is what we are focused on doing. If you look at what we have done over the years, with long investment—Facebook, in particular, is more than 20 years old and has had rules around content, safety features, parental controls and all those kinds of things that have evolved over a long time. We have 40,000 people working in safety and security. We spend multiple billions of dollars every year. I can never sit here and say it is all fine and everything is perfect. Sadly, this is an adversarial space that changes all the time. I encourage you to look at what we have done and the evidence of how it is working, but I cannot go specifically into that.

RS
Natasha IronsLabour PartyCroydon East25 words

I suppose the question is, given everything you have done—and, as you acknowledge, it is still not safe—why are people still trying to sue you?

Rebecca Stimson6 words

I cannot talk about the lawsuits.

RS
Natasha IronsLabour PartyCroydon East43 words

Not that one, but the other ones—the other people who are coming forward with other evidence. If you are doing all this and it is supposed to be making an impact, why are people still trying to sue Meta for having addictive platforms?

Rebecca Stimson11 words

I can’t speak for why people bring forward lawsuits against us.

RS
Chair23 words

Let us talk about your staff, though. The Observer reported that staff at Meta called Instagram a “drug”. What were they referring to?

C
Rebecca Stimson93 words

Those are leaked documents, mostly from quite some years ago. They do not really reflect the reality of our platforms as they are now. I cannot tell you what they were referring to. I would just encourage you, as I have just been saying, to look at what we have done and how effective that is on the platform. We have made great strides and, as I said, the comments and reflections we get back from parents support that. This is obviously a piece of work that is never going to be done.

RS
Chair37 words

Tomorrow in Parliament we will be voting on whether to ban social media for under-16s. If everything is going in the right direction, why would we be in the position that we have to consider that vote?

C
Rebecca Stimson143 words

We don’t think a ban is the right way forward. We don’t think it will be effective, and we think there is a very strong chance that it will risk promising parents something that cannot be delivered. The same warnings were given in Australia. We can see that they are having quite a bit of difficulty over there, for the same reasons everybody said. I understand that a lot of it is being driven by people’s genuine concerns about children growing up online and what are the right spaces and the age-appropriate experiences. I understand why the conversation is happening. I am glad to see that the Government are consulting on and thinking through features and functions and where children spend their time and the evidence of the risks, rather than reaching for a ban, which we don’t think is the right approach.

RS
Chair11 words

Why do you say that in Australia it cannot be delivered?

C
Rebecca Stimson177 words

One challenge around delivering it is the technical limitations on age assurance. No one will claim that age assurance is a perfect science; it is not. We have a whole range of ways in which we try to accurately identify how old people are, and we are part of lots of partnerships to try to lead the way on technology that can make that easier. One difficulty about age assurance at the moment is that apps are currently asked to do it app by app. The average teen has about 40 apps on their phone. We don’t think that is helpful for parents. We think there is a much better app store-level age solution, which would be one gateway that would be able to verify someone’s age. That would then benefit all of us as apps. That is not to say that we would then stop doing what we do, but it would be a missing piece of the jigsaw. The limitations around age are one thing that makes a ban like that quite difficult to deliver.

RS
Chair47 words

One of the findings from Australia—which, as you know, has a ban for under-16s—is that over half of the children who previously had TikTok, YouTube or Instagram accounts are still able to access their accounts on those platforms. Why do your platforms, in particular, struggle to implement—

C
Rebecca Stimson118 words

I am not sure it is us; I think that is universal. This applies to all companies that are in scope. I think there are several reasons. We have seen marketplaces spring up in Australia for people selling their ID or to lend you ID if you are trying to get back online. We have also seen what we call logged-out browsing, where they no longer have an account but they search for an Instagram video and then can view it. One of the things that is most disadvantageous about that is that they are then no longer in any of the protections I was just talking about; they are accessing it in a different way, without an account.

RS
Chair5 words

How are they accessing it?

C
Rebecca Stimson34 words

If you do not have an account, you can search for a particular creator through Google search and watch videos in what we call a logged-out experience. You are not logged into our service.

RS
Chair6 words

Is this a problem with Google?

C
Rebecca Stimson26 words

That is for Google, I am afraid, but it is one way that they can circumvent the ban that we are seeing spring up in Australia.

RS
Natasha IronsLabour PartyCroydon East58 words

TikTok settled, so maybe—I don’t know—there is an admission that TikTok has an addictive algorithm, but isn’t the behaviour that we are talking about, with young people trying to get around this, demonstrating addiction? Is it not proving the point that your services are not particularly good for children’s wellbeing if they are trying to get around it?

Rebecca Stimson142 words

We have seen, and you are probably aware, that a lot of younger people are also bringing action against the ban in Australia, because it has removed them—particularly more marginalised teens, for example from LGBT+ communities or diasporas of different people living in Australia who were contacting each other. A lot of it is driven by the fact that they feel like they have lost their communities and they have lost their way. Because it is being implemented unevenly, unfortunately—despite best efforts to comply, of course—some of their friends are still on it and some of them are not. I think it is more a reflection that these platforms play an important role in young people’s lives and identities, and that if you just take it away they will try to find a way back. I don’t think we can automatically assume—

RS
Chair17 words

Whose fault is that? Is it the fault of the platforms, of the Government or of parents?

C
Rebecca Stimson2 words

What fault?

RS
Chair35 words

The Government have banned social media for under-16s, and yet you say and statistics say that a big chunk of them are still accessing it in Australia. Whose fault is that—parents, Government or platforms?

C
Rebecca Stimson6 words

These are the risks of a—

RS
Chair16 words

A one-word answer will be sufficient. We all have other things to do this afternoon. Which—

C
Rebecca Stimson42 words

These risks were flagged before. As we have seen, 40-plus UK safety organisations wrote to the Government flagging similar issues and why the UK should not proceed with a ban. The Australian authorities will have to speak for why they moved ahead.

RS
Chair11 words

What is your view? The Government, parents, platforms—who is at fault?

C
Rebecca Stimson18 words

It is unwise to pass a law that ignores the technical limitations of what is deliverable for parents.

RS
Chair6 words

So you are saying the Government.

C
Rebecca Stimson16 words

I cannot speak for how the Australians came to that decision, but that is my view.

RS
Chair6 words

The Government are at fault. Okay.

C
Anneliese MidgleyLabour PartyKnowsley139 words

As MPs we all go into schools and colleges a lot, and this is a live topic that is up for discussion. Quite often, I will be asked what I think about the ban on social media platforms for under-16s, and I ask those young people the question. This is anecdotal—it is my experience—but the majority of those young people do think that there should be more protections and that there is an issue with accessing social media. I note, Ian, that you said that yours was an entertainment platform, but I think they would include TikTok within that. Why do you think that young people themselves are saying that, and that they feel like they have had bad experiences on there and that online is becoming a bigger part of their lives? Why would they be saying that?

Giles Derrington138 words

If I may, on the ban specifically, there is probably a slight difference in approach here. We think the debate about whether there should be a 16-year-old level versus a 13-year-old level is one, ultimately, for policymakers. As the Government consultation fully recognises, there is a huge amount of complex and competing evidence, and there are direct competing interests. We definitely see that huge numbers of teens derive real value and real community, whether that is LGBT kids or kids who are interested in science, arts, music or culture who are being exposed to a huge range of exciting creators in those spaces. There is real value there, but there are absolutely trade-offs and challenges. We ultimately do think it is a question for policymakers to look at the whole of the evidence base and make policy-based decisions.

GD
Anneliese MidgleyLabour PartyKnowsley69 words

Of course there are young people who go on and create amazing content, and we should celebrate and champion those skills and the opportunities that that might give them, but they are also saying that they feel like they have had bad experiences online. What is your starting point? You are talking like it is a bed of roses. When you are talking about policymakers, it is their lives.

Giles Derrington122 words

Where we see our role and responsibility in this—again, the ultimate choice about where you set the age is for policymakers to determine, not for platforms. What we spend our time doing is focusing on speaking to those teens about the trade-off that they see and the benefits they see in some of the areas where they want us to do a bit more, and then seeking to consistently be at the forefront of innovation in that space. For example, one thing we have heard from teens is that putting the phone down at the end of the day can be hard, which is why we introduced a 10 pm curfew, which helps guide people off the platform and enables them to—

GD
Anneliese MidgleyLabour PartyKnowsley20 words

Do you feel any sense of responsibility that these young people feel that they have bad experiences on your platform?

Giles Derrington123 words

I want to be clear: the vast majority of the people we speak to, and the teens we speak to, have an overwhelmingly good experience. But yes, absolutely, and that is why we are constantly innovating in those spaces, whether it is the hour-long cap or the 10 pm curfew. We think one benefit of the way the Government are approaching this and the consultation is to look at some of that best practice and make decisions. There are other areas—I alluded to them before—where we have actively made the choice that we do not think these things can be safe for teens. For example, we have turned direct messages off for under-16s specifically because we do see the direct harm vectors there.

Natasha Irons (Labour, Croydon East)

You are talking about things that you have actively taken off, like direct messaging. What about infinite scrolling? Can you just stop doing that? Would that be helpful?

Giles Derrington

Again, it depends how—there is a lot of complexity, and the consultation talks about this, about how you define some of those terms. We have, after 30 minutes, a prompt that will say, “Hey, you should stop watching.” That is a—

Natasha Irons (Labour, Croydon East)

That is a very long time to be sat—

Giles Derrington

True, but that is why we give people the ability to set individual session times as well. People will make individual choices. We see that the value of the way our platform works is that discovery and window into things that you may be interested in. We will regularly speak to teens who say, “I’m vaguely interested in science and maths, and because of that, I’ve stumbled across a creator like Big Manny,” who does teaching of science content and does science experiments in a very accessible way to teens, “and I’m really engaged in that.” We have seen—you alluded to it previously—that people will then take action off the platform as well. They will say, “I have had a bit of exposure to this short-form content about education. I’m now going to go and find a course and find a way to engage with that longer term.” I think there is a benefit of that discovery mechanism.

Natasha Irons (Labour, Croydon East)

That is the point, isn’t it? Again, this is about young people’s content, where they find it and where they migrate to consume it. We can all accept that there are benefits to that and that groups can form, but we are talking about how we keep them safe and prevent the harms. I couldn’t really get to this point: are your parental controls by default, or do parents have to opt into them?

Giles Derrington

You have to link an account, because we do not necessarily know who someone’s parent is by dint of them being on the platform—

Natasha Irons (Labour, Croydon East)

Who am I to tell you how to do anything in your business, but surely part of the negotiation with a teenager to have an account could be, “You need to have this linked to a parent.” Thirty-five might be a bit too high, but surely you can do things to ensure that it is not basically like sticking seatbelts in your car after you have bought it, but designed in in the first place. Why would you not do that by default?

Giles Derrington

This is one of the benefits of the consultation. It is looking at some of these things in the round. On the idea of a default requirement for parental control, plenty of parents say, “Actually, that’s not how I want to parent my child, and I want to be able to make that choice.” There are also edge cases. If you are a young LGBT kid from a devoutly religious family, there may be challenges with that as well. We do have to think about these things in the quantum, and that is what the consultation is driving at. Our responsibility within that consultation is to share some of the best practice we see and, ultimately, allow policymakers to make choices about where you draw those lines, because there will always be benefits and downsides to any choice in this space.

Natasha Irons (Labour, Croydon East)

So you are outsourcing that to us to decide where we will draw that line for you. Will you be able to produce the data that we need to make those decisions in the best possible way?

Giles Derrington

It is quite right that the Government and Parliament have decision-making power over the way platforms work. That is what the Online Safety Act was there to create on the content side when it comes to some of these issues. That is why we are genuinely open to the conversation and for policymakers to make their choices. Absolutely, we will engage with that. We speak directly to Ofcom, and have done since we first existed, about some of that data, to allow them to build the best possible evidence base.

Chair

We need to make a bit of progress; otherwise, we will never get out of this room. I would like to conclude by 1.15 pm at the latest.

Vicky Foxcroft (Labour, Lewisham North)

The discussion on social media and the challenges with being able to ensure that young people under 16 are banned from it is quite timely. It links to what you are doing to make sure that under-13s are not on social media, as both your platforms say they should not be. How many do you think are getting through who should not be getting through?

Giles Derrington

As we have talked about, yes, under-13s are not allowed on the platform. It is not designed as an experience for under-13s. We are still among the only platforms that publish transparently every quarter how many under-13 accounts we remove. We spend a lot of time thinking about new ways to—

Vicky Foxcroft (Labour, Lewisham North)

How many do you remove?

Giles Derrington

I have it somewhere. Bear with me. I will find it for you in two moments.

Vicky Foxcroft (Labour, Lewisham North)

And has the number of removals been increasing?

Giles Derrington

Forgive me. Globally, we removed in the last quarter 13.7 million attempts by under-13s to get access to our platform. That is 7.8% of all the removals for all kinds of reasons that we do on the platform. I could not give you a definitive answer on whether that has gone slightly up or down. I think, off the top of my head, it has broadly remained static.

Vicky Foxcroft (Labour, Lewisham North)

What is that number as a percentage of the number of users?

Giles Derrington

I don’t think I would be able to work that out on the fly, but we can certainly—

Vicky Foxcroft (Labour, Lewisham North)

Could you send that in writing?

Giles Derrington

Yes, broadly speaking, certainly from the UK side.

Chair

To help you out, Giles, Internet Matters says that 32% of children aged nine to 12 use TikTok. Do you recognise that number?

Giles Derrington

I don’t think it is a number I recognise. Again, our view on this is that we are transparently removing every under-13 account that we find. We do not want any on the platform.

Chair

Sorry, Vicky, I apologise for interrupting you. I am just trying to get a handle on this number. Internet Matters says 32% of children aged nine to 12 use TikTok. What would you put that percentage as?

Giles Derrington

Again, because we are seeking to remove every single account we find, there are not accounts that we know of who are under 13 on the platform because, when we find them, we remove them.

Chair

You say there are no accounts under 13.

Giles Derrington

I wouldn’t say that. I am saying that when we find them, we remove them.

Chair

When you find them, you get rid of them, so there can’t be that many.

Giles Derrington

I wouldn’t want to hazard a guess. Again, we are constantly looking at ways, through age assurance and other mechanisms, to improve those numbers. It is also worth saying that some of that data gets a little bit messy. We know, for example, of under-13s who will engage with their parents and watch TikTok on their parents’ account with their parent there, which is a slightly different thing but may turn up in external data as users of the platform.

Vicky Foxcroft (Labour, Lewisham North)

What are you doing to make sure that under-13s are not using it?

Giles Derrington

We have a range of mechanisms. If you will forgive me, I will not go into the detail of them, for the fairly obvious reason that we do not want people to try to get around them, but every account that goes on to the platform we will check against a range of mechanisms to be able to identify, and then as people engage in content, look at content or post, all those things will factor into making that assessment. It is also worth saying that we start from the basis, before we are sure, of putting people in an age-appropriate experience anyway, regardless of known age.

Vicky Foxcroft (Labour, Lewisham North)

I understand why you do not want to describe the range of mechanisms, but would you share that with the Committee in private? Is that something we are allowed to ask for?

Giles Derrington

I think there is certainly some of it we could do. It is a very long list of things we do, and some of those will be IP technology, but I am happy to go away and ask what we can provide.

Vicky Foxcroft (Labour, Lewisham North)

Fantastic. Go on, Rebecca.

Rebecca Stimson

This all comes down to age assurance. I have said it already. No matter what any of us thinks is the best way forward on online safety, if we cannot improve the accuracy of finding out how old people are, it is difficult to make it work. We have done a few things. One thing we think about with age assurance is that there is no one foolproof way and so you need a toolkit. We are using quite a lot of AI detection. It will look at an account, and it will be able to observe things about what is happening. If someone has put in an age saying they are an adult, it is getting quite good—it is not fully there yet—at saying, “No, all of their friends are this. There seems to be a birthday there that does not look right,” and giving us a signal. We do a couple of things with Yoti about facial age estimation. If we have a clear signal that you are under 13, the account is found and removed immediately. If there is a bit of uncertainty, we can do facial age estimation through Yoti with a video selfie, and their technology can say whether it thinks that person is under 13 or not. They also do document verification, so asking for formal ID from people. We take a proportionate approach to that, because there are some questions about us having that data—that is why we do it through a third party—and the Electoral Commission estimates that 3 million people in the UK do not have any formal ID. The last one, which probably is the most exciting, is that we are a founder of a new non-profit called the OpenAge Initiative. This goes a little bit to the app store comment I made earlier. It is looking at ways that you might be able to verify your age officially on your device, no matter what you do from that point—what apps you are downloading and so on. That has to be easier.
We hear loud and clear from parents, as they move around dozens of apps and we all work in different ways, that it is very complicated, so we think that has quite a lot of promise as part of how we are trying not only to accurately identify people who are legitimately using our platform, but also to keep under-13s off it.

Chair

We need to keep answers nice and short and pithy now because we do not have much time; otherwise, we will be here all day.

Natasha Irons (Labour, Croydon East)

We have already touched a bit on the social media consultation, but the other side of this coin is the benefits of social media, and the benefits of content on social media. You talked about educational things. We are thinking about things like public service broadcasters, given that we, as the public, have all paid for that content. What kinds of things would you consider, or have you considered, around prioritising that content over other content? Would your platforms consider doing that?

Giles Derrington

If we think about something like the BBC, we see ourselves, as I described, as a discovery platform. We do something fundamentally quite different from what the PSBs generally do in their traditional content, in terms of it being vertical video and short-form content. We see incredible value in the partnership that we have with the likes of the BBC to do basically two things. The first is to act as a signpost and a window for the things that they are doing. For example, we developed a tool called Spotlight. When there are big watercooler moments on the BBC—for example, we launched this with “Celebrity Traitors”—we know that a lot of people on the platform will make content and will be really excited about “Celebrity Traitors”. We have enabled the BBC to tag that content, which will then provide a direct link to iPlayer to watch the show. We see that hitting millions and millions of video views. The data suggests that over 50% of the people using that to go to iPlayer are under 25.

Natasha Irons (Labour, Croydon East)

Does that cost the BBC more money to use?

Giles Derrington

No, these are tools that we developed for them. Fiona Campbell, the commissioner for youth audiences, said that that relationship is the future of television. The other thing that is important to talk about is that we also see ourselves as a pipeline for the talent that comes to some of these organisations, from which an awful lot of people traditionally would have been excluded. Shini, one of the new “Blue Peter” presenters, started on TikTok. She grew her account on TikTok, and she learned the skills of presenting on the platform. Because of her profile on the platform, she was then able to be spotted and talent-scouted and moved through. We see a number of examples of that. Another is a comedian called Henry Rowley. We took him to the Edinburgh festival. He was a big presence on the platform but had never done comedy in public. We took him and a number of other creators to the Edinburgh festival. They did their first ever stand-up shows. Because of that, he has directly translated and is now in Paramount’s “Robin Hood”. Part of our responsibility is that pipeline.

Natasha Irons (Labour, Croydon East)

There is a lot of talk, especially around children’s content—and when we talk about children, we are talking about 13-year-olds for your platform. Especially when it comes to things like YouTube, in the context of a framework for what we consider high-quality content, we have picked up in this inquiry that there are a lot of short, snappy things that are not helping our children’s brains develop well. Is there any scope within your platforms or your community guidelines or the frameworks that you can put in place to promote someone—a PSB or a user themselves—who is creating content that is of high quality and good for wellbeing, development or education? Is there a way of promoting that above everything else, or would you consider doing that?

Rebecca Stimson

That is what our recommender system is designed to do. It is designed to reward high-quality original content from authentic sources. As I said at the very beginning, it is also a personalised experience. Once we have found and removed all the content that is harmful and violates our rules, the whole system is designed to do exactly what you want. We are really pleased that all of the UK PSBs have an enormous presence on Instagram and Facebook, which we provide for free and which drives huge traffic and engagement to their content. Our system is designed that way already.

Giles Derrington

I would say two things, very quickly. First, we spend a lot of time speaking to traditional content makers about the tools we can create to help them get in that shop window. Things like Spotlight are part of that discussion. That is real value for us. The other thing is that the reason users come to TikTok is that it is a user-to-user service with user-generated content. Often things boil up through that which may not be the things that would be assumed by traditional commissioners to get through. For example, classical music has exploded on TikTok over the last couple of years. We saw that trend happening. Because of that, we were able to go out to South Bank University and launch a programme called Crescendo, which will help 10 young female classical musicians begin and solidify their careers in the classical music industry. I would worry that, if you prioritised traditional content, you would lose some of that user-generated value.

Natasha Irons (Labour, Croydon East)

I broadened that out to include other content creators who are making higher-quality content. If you have content creators on your platform who you deem to be making higher-quality content, as Instagram does, are you promoting them—or will you in the future—above other stuff that perhaps is not as high-quality or is not helping our children’s wellbeing and development?

Giles Derrington

The creator rewards programme is entirely designed to support those creators to produce more and to continue to produce. Because of the meritocracy element of the platform, that stuff is rising to the top anyway. There are certain elements where we will support directly—for example, sports verticals, particularly around big, seminal moments. For example, with the World Cup coming up, how can we support those creators to know what is going on, know what the trends are and then create content that will succeed? That is where we see the value of our interventions and support.

Anneliese Midgley (Labour, Knowsley)

How do you ensure, in this wild west of the algorithm, that public service broadcasting—as Natasha said, we are talking about taxpayers’ money—is promoted more than AI slop of freak people or, worse, inappropriate content or content that could be seen as dangerous or not good? What is the starting point here? It sounds like it happens and then you move backwards rather than having the starting point as something that is high-quality and moving forward from there.

Giles Derrington

From a TikTok point of view, I would not quite agree with that characterisation. When we speak to those traditional broadcasters, of which the PSBs will be part, we see it as our role to help and work with them to identify what tools are available, build out new tools and direct our support. Spotlight is a great way of doing that. It has seen significant success. The BBC has talked about how important that has been in driving an audience that has moved away from the PSBs back towards those seminal shows. They are succeeding on their merits. The BBC has over 45 different accounts on TikTok, serving a whole range of purposes, and they will regularly be at the top of searches for a whole range of content, and regularly and prominently within people’s feeds organically. When it comes to certain high-risk events—Iran and so on—we will work to make sure that if people are searching for news on the platform, they are directly getting as top results those high-quality, fact-checked broadcasters. In the general sense of the entertainment sector, they see the value as well in being alongside those genuine user-to-user creators who also get prominence, not least because they are the presenters, broadcasters and creators of the future, and they want them to succeed and be able to engage with them as well.

Anneliese Midgley (Labour, Knowsley)

How is that defined against misinformation? As soon as the public service broadcasting has gone away, the next thing that comes up could be a narrative that is presented as fact but is false.

Giles Derrington

We are coming back to the safety argument. It is worth saying that we do not allow misinformation on the platform. We continue to use fact-checkers directly who will fact-check content on the platform. During those high-profile events when we know people will be searching for correct information, we will also be making sure that they see the traditional, most trusted news broadcasters highly prominent on searches and in feeds as well.

Natasha Irons (Labour, Croydon East)

Who determines what is misinformation on your platform?

Giles Derrington

We have a number of ways. There are some things we know are misinformation. We have a misinformation bank of things that we know not to be true that moderators can directly engage against. We work with fact-checkers around the world, such as Reuters in the UK. When we see new claims, our moderators can send something to Reuters, which can make three decisions: “This is true and should be allowed”; “This is untrue and should be removed from the platform”; or—and there are plenty of cases where it is not entirely clear—we can make a choice to remove that from the recommended feed. Effectively, it will not be pushed to people, but it still exists on the basis of free speech until those claims are verified.

Natasha Irons (Labour, Croydon East)

What about a crazy AI video of something that did not happen but is not necessarily a news report? Who is checking that?

Giles Derrington

All AI on the platform has to be labelled regardless, but if it is misinformation, it does not matter whether it is AI or not: it will be removed from the platform. It will be fact-checked and removed.

Chair

Has either of you had any conversations with Ofcom about prominence for news on social media?

Rebecca Stimson

Our algorithm is designed to deliver a personalised experience. If you want to follow lots of news platforms—and we have hundreds—their content would be prominent in your newsfeed. We have not had conversations about a more blanket approach to prominence for news.

Chair

When it comes to the news content that is suggested over and above what people have clicked to follow, is there any prominence for trusted PSBs?

Rebecca Stimson

Yes. PSBs would be verified. They come with a blue tick; they are very clearly what you are looking at. As your colleagues have just been alluding to, where it is something false, misleading or poor quality, it is either removed from the platform or is downranked so that it has no engagement. That is why I am really pleased that PSBs are using our tools so successfully.

Giles Derrington

I am not aware of having any direct conversations with Ofcom on that, although I will check with colleagues. When we talk about news and some of the Ofcom data, it is worth understanding what we mean by news. An awful lot of the news content that people are looking for on TikTok—this is from Ofcom’s data—is celebrity or sports content as opposed to political news, for example. There is a slight disparity in some of those issues. As I say, the PSBs are highly prominent on the platform. The BBC News account has about 15 million followers currently, and they have a number of other accounts that will serve news as well. Again, they are verified, but also, when we see those high-risk events, we will actively ensure that people see them first.

Chair

Rebecca, you just said that the BBC is verified, so it will have prominence, but is it right that influencers like Jake Paul or streamers like Adin Ross, who has previously been banned from platforms for antisemitism, are given the same verified status as trusted news providers? Is that true? A verified status is a verified status, and that does not differentiate between a super-trusted PSB news provider and—

Rebecca Stimson

They can serve slightly different purposes. Verification on authoritative, formal, recognised institutions and organisations is one thing. Then, often, individual creators want to use it because it helps reduce impersonation of their accounts. Obviously if you are banned from our platform or you have had previous content violations, you would not be able to use any of those tools or our monetisation tools.

Chair

Would there be a difference in the algorithmic prominence between those different verified accounts?

Rebecca Stimson

As I said, the algorithm is designed to find and remove harmful content and to amplify best-quality, original content to people, and after that it is driven by what you in particular want to see. Yes, in the sense that if you are an unverified account promoting loads of clickbait, spammy, terrible content, it will be significantly downranked, if not removed if it violates rules, whereas we are incentivising the best possible creators and content to reach the top of that newsfeed.

Chair

To be clear, there would be no difference in algorithmic push between something like BBC News and someone like Adin Ross?

Rebecca Stimson

It is tailored to each individual person. At a broad level, verified, authentic accounts that are meeting our rules will be higher up than ones that are not, but what you might see as an individual will be tailored to you specifically.

Giles, TikTok has talked about helping teenagers engage with educational and informative content. Can you make an estimate of what proportion of the content on TikTok you could describe as that?

Giles Derrington

It is hard to define that in the full panoply of the millions and millions of videos. “Education” can mean a range of things to a range of people, so there is a genuine challenge there. What I can share—this is data that we will publish in a couple of months’ time, so it is a sneak peek for the Committee—is that we are doing some work with Public First, which suggests that about 18 million hours’ worth of educational content is being watched on the platform in the UK each year. A significant amount of what is being watched is educational. We also have a direct STEM feed, and we know that about a third of under-18s use that dedicated feed of educational content once a week.

Did you say 18 million?

Giles Derrington

I think it is 18.1 million hours, yes.

How can you quality assure that educational content? How do parents know it is decent content?

Giles Derrington

With the STEM feed, creators can seek to opt into that, but all that content is then checked to make sure it is high quality. When we get into the wider definition of education, from survey work we have done, we know that learning how to tie a tie, learning how to cook a recipe, and maths might all be considered education, so it is a little harder to find a pure metric of fact-checking quality.

You talked earlier on about pointing users to iPlayer. I do not know if you know about the BBC initiative, “Other Side of the Story”.

Giles Derrington

A little bit, yes.

What can TikTok do to promote that initiative, which helps young audiences develop critical thinking, identify misinformation and that kind of thing?

Giles Derrington

There are a couple of things that we do and can do. First, we work a lot, both speaking to our users and to the BBC and others, on how news is evolving and how young people might want to engage with news. One really interesting thing that may be worth noting is that we are increasingly seeing younger audiences say, “We want our news served to us in a more benevolent way”—a little bit more compassionate, a little bit less straight facts, because of the way they have engaged with society more generally. That can help shape the way that broadcasters like the BBC choose to present things to that new audience they are trying to reach. I think things like “Other Side of the Story” are part of an indication of that response to audience demands. That has happened over the years. Going back to the 1960s, BBC broadcasts were very different from what we see today. Secondly, as I say, in those high-profile events, making sure that when people are searching, they are finding that content first is important. Finally, we will do a whole load of work directly with the BBC and others on media training in partnership. It is not PSBs or broadcasters but, with the elections going on at the moment, we have our election hub. On any piece of content that has “#election” or a whole range of other hashtags, there will be a signpost to get the facts about the election. That will take you to a site that has been written with the Electoral Commission and, not to betray confidences too much—it is currently in the process of making them—will then have videos directly from the Electoral Commission on it, so that people can get facts. In those circumstances, we have often taken information from BBC fact-checking about how to get media literacy and put that on those pages as well, so people are being directly driven to that content in those instances.

Chair

You have been very patient with us today. Thank you for giving us so much of your time. That concludes today’s session.
