Recode | YouTube is kicking “tens of thousands” of video-makers out of its advertising program

It only takes a few days for something like this to become old news, but I couldn't let it pass without comment. It remains to be seen how effective it will be, and I can't help but think it underestimates how deep this problem runs, and what exactly is happening.

From Recode:

The changes won’t prevent people from uploading offensive content to YouTube, which hoovers up hundreds of hours of new video per minute. But they are meant to make it hard for the people who upload that stuff to make money from it. And they are an important symbolic change for YouTube, which was founded on the idea that anyone can use the platform, and has spent years trying to entice video makers to find audiences and create careers on the site.

What YouTube, and tech companies in general, can't seem to grasp is the social aspect of technology. One of the more thoughtful pieces I have read on certain fringe workings of YouTube was written by James Bridle at the end of last year. That article addressed concerns I could identify with personally. The ability to combine automation with disturbing, attention-grabbing techniques aimed at children has, over time, made some of the most bizarre and unsettling elements of that platform some of the most lucrative. It might be enabled by technology, but this is a social issue. As Bridle writes,

I’m trying to understand why, as plainly and simply troubling as it is, this is not a simple matter of “won’t somebody think of the children” hand-wringing. Obviously this content is inappropriate, obviously there are bad actors out there, obviously some of these videos should be removed. Obviously too this raises questions of fair use, appropriation, free speech and so on. But reports which simply understand the problem through this lens fail to fully grasp the mechanisms being deployed, and thus are incapable of thinking its implications in totality, and responding accordingly.

The first is the level of horror and violence on display. Some of the times it’s troll-y gross-out stuff; most of the time it seems deeper, and more unconscious than that. The internet has a way of amplifying and enabling many of our latent desires; in fact, it’s what it seems to do best. I spend a lot of time arguing for this tendency, with regards to human sexual freedom, individual identity, and other issues. Here, and overwhelmingly it sometimes feels, that tendency is itself a violent and destructive one.

The second is the levels of exploitation, not of children because they are children but of children because they are powerless. Automated reward systems like YouTube algorithms necessitate exploitation in the same way that capitalism necessitates exploitation, and if you’re someone who bristles at the second half of that equation then maybe this should be what convinces you of its truth. Exploitation is encoded into the systems we are building, making it harder to see, harder to think and explain, harder to counter and defend against. Not in a future of AI overlords and robots in the factories, but right here, now, on your screen, in your living room and in your pocket.

Addressing the monetisation is a start, but as Bridle was apt to point out, these are big problems built right into the infrastructure. And not just the technical infrastructure. Whether you want to believe it or not, technology is developed by people who make decisions, and is thereby coded with intentionality. Kids like Logan Paul — and he really is still a kid — have been socialised by this media. There is no exceptionalism here. Bridle goes on,

And right now, right here, YouTube and Google are complicit in that system. The architecture they have built to extract the maximum revenue from online video is being hacked by persons unknown to abuse children, perhaps not even deliberately, but at a massive scale. I believe they have an absolute responsibility to deal with this, just as they have a responsibility to deal with the radicalisation of (mostly) young (mostly) men via extremist videos — of any political persuasion. They have so far showed absolutely no inclination to do this, which is in itself despicable. However, a huge part of my troubled response to this issue is that I have no idea how they can respond without shutting down the service itself, and most systems which resemble it. We have built a world which operates at scale, where human oversight is simply impossible, and no manner of inhuman oversight will counter most of the examples I’ve used in this essay. The asides I’ve kept in parentheses throughout, if expanded upon, would allow one with minimal effort to rewrite everything I’ve said, with very little effort, to be not about child abuse, but about white nationalism, about violent religious ideologies, about fake news, about climate denialism, about 9/11 conspiracies.

Before the yelling of ‘keep your politics out of technology’ starts, I’ll nix it up front by pointing out that that notion is, in itself, political. I recommend reading Bridle’s essay, whether or not you have children of your own. You can find it here.