It only takes a few days for something like this to become old news, but I couldn't let it pass without comment. It remains to be seen how effective it will be, and I can't help but think it underestimates how deep this problem runs, and what exactly is happening.
The changes won’t prevent people from uploading offensive content to YouTube, which hoovers up hundreds of hours of new video per minute. But they are meant to make it hard for the people who upload that stuff to make money from it. And they are an important symbolic change for YouTube, which was founded on the idea that anyone can use the platform, and has spent years trying to entice video makers to find audiences and create careers on the site.
What YouTube, and tech companies in general, can't seem to grasp is the social aspect of technology. One of the more thoughtful pieces I have read on certain fringe workings of YouTube was written by James Bridle at the end of last year. That article addressed concerns I could identify with personally. The ability to combine automation with disturbing, attention-grabbing techniques aimed at children has, over time, made some of the most bizarre and unsettling elements of that platform some of the most lucrative. It might be enabled by technology, but this is a social issue. As Bridle writes,
I’m trying to understand why, as plainly and simply troubling as it is, this is not a simple matter of “won’t somebody think of the children” hand-wringing. Obviously this content is inappropriate, obviously there are bad actors out there, obviously some of these videos should be removed. Obviously too this raises questions of fair use, appropriation, free speech and so on. But reports which simply understand the problem through this lens fail to fully grasp the mechanisms being deployed, and thus are incapable of thinking its implications in totality, and responding accordingly.
The first is the level of horror and violence on display. Some of it is troll-y gross-out stuff; most of it seems deeper, and more unconscious than that. The internet has a way of amplifying and enabling many of our latent desires; in fact, it’s what it seems to do best. I spend a lot of time arguing for this tendency, with regards to human sexual freedom, individual identity, and other issues. Here, and overwhelmingly it sometimes feels, that tendency is itself a violent and destructive one.
The second is the levels of exploitation, not of children because they are children but of children because they are powerless. Automated reward systems like YouTube algorithms necessitate exploitation in the same way that capitalism necessitates exploitation, and if you’re someone who bristles at the second half of that equation then maybe this should be what convinces you of its truth. Exploitation is encoded into the systems we are building, making it harder to see, harder to think and explain, harder to counter and defend against. Not in a future of AI overlords and robots in the factories, but here, now, on your screen, in your living room and in your pocket.