What is Skibidi Toilet? The horrific YouTube movies giving parents nightmares


The first episode, posted in February 2023, was Crazy Frog via David Cronenberg: a disembodied head singing a nonsense song (“Skibidi dop dop dop yes yes”, and so on) while leering menacingly from a dirty latrine. But in the months since, creator Alexey Gerasimov has expanded the project into a grindingly bleak dystopian saga, in which hordes of sentient toilet-people battle an army of humans with CCTV cameras for heads on a smouldering wasteland. 

In one recent instalment, a giant camera-headed humanoid plunges a circular saw into one of his enemies’ faces, sending blood pluming into the air. Whatever your views on cartoon violence, Tom and Jerry this is not. Yet thanks to a combination of algorithmic insistence and the natural viral forces of primary-school chatter, it’s become an enormous playground craze.

Even as someone who writes about moving images for a living, I hear about all of these fads from my own primary-age children: the school run is now regularly followed by a hasty Googling session, during which I work out what gibberish (or worse) is currently on trend, and decide whether or not it’s allowed. (For what it’s worth, nothing mentioned in this piece is, though before bedtime yesterday, they did both beg to help me research.)

Yet change may be at hand. One feature of the Online Safety Act, passed towards the end of last year, is a new mandatory code of practice for online platforms around protecting children not only from illegal content, but material deemed unsuitable too. This is due to come into force by spring 2025, after more urgent concerns like videos which promote suicide and self-harm have been addressed.

For YouTube, explains Rani Govender, the NSPCC’s senior policy officer for child safety online, this will almost certainly entail the introduction of “age assurance” measures – mechanisms which prevent younger users from accessing material that falls outside the Ofcom-approved standard for their demographic bracket. Sites which fail in this regard will be liable for fines of up to £18 million or 10 per cent of their annual turnover – whichever sum is higher – and can also be blocked by Ofcom outright.

“Getting this right,” says Govender, “is the key to getting everything right. You can’t protect children online unless you can identify who the children actually are. So we would expect any platform which hosts this sort of material to put robust age assurance measures in place.”

The algorithms pushing certain types of content over others will also be scrutinised. Currently, anyone with a passing knowledge of the platform will know that it tends to shepherd viewers towards increasingly extreme content in order to hold their attention.

