
Came across yet another crop of articles recently which painfully highlight the problems with having social-media companies manage their own moderation. This one <https://arstechnica.com/gaming/2022/09/youtube-age-restriction-quagmire-exposed-by-78-minute-mega-man-documentary/> shows the inconsistencies in the criteria that YouTube uses to apply age restrictions to uploaded videos. Not only are the criteria mysterious, but the appeal process is opaque, and too often the only way to get results is by appealing to a popular fanbase. Assuming you have one.

Relatedly, Facebook earlier banned any promotion on its platform of the re-release of a movie called “Beautiful Blue Eyes” <https://arstechnica.com/tech-policy/2022/09/holocaust-filmmaker-says-meta-did-not-completely-reverse-ad-ban/>, seemingly not because of its content (partly set during the Nazi Holocaust), but because of its title. An appeal was rejected with “our decision is final”. Only for the ban to be lifted later; absolutely nothing to do with any public outcry, you can take their word on that.

Then there is this coronial inquest <https://arstechnica.com/tech-policy/2022/09/coroner-lists-instagram-algorithm-as-contributing-cause-of-uk-teens-death/> into the self-inflicted death of a young teen who had been viewing way too many posts on Pinterest and Instagram promoting self-harm. Pinterest at least had the humility to admit that its moderation system was not satisfactory and needed to be improved. But Meta (parent company of Instagram and Facebook) sent its “head of health and well-being” to testify that the content was classified as “safe” according to its standards. Free speech between mature adults is one thing; keeping impressionable youngsters safe is quite another.