Quis Custodiet Ipsos Custodes?

Came across yet another crop of articles recently which painfully highlight the problems with having social-media companies manage their own moderation.

This one <https://arstechnica.com/gaming/2022/09/youtube-age-restriction-quagmire-exposed-by-78-minute-mega-man-documentary/> shows the inconsistencies in the criteria that YouTube uses to apply age restrictions to uploaded videos. Not only are the criteria mysterious, but the appeal process is opaque, and too often the only way to get results is by appealing to a popular fanbase. Assuming you have one.

Related to that, Facebook earlier banned any promotion on its platform of the re-release of a movie called “Beautiful Blue Eyes” <https://arstechnica.com/tech-policy/2022/09/holocaust-filmmaker-says-meta-did-not-completely-reverse-ad-ban/>, seemingly not because of its content (partly set during the Nazi Holocaust), but because of its title. An appeal was rejected with “our decision is final”. Only for the ban to be lifted later; absolutely nothing to do with any public outcry, you can take our word on that.

Then there is this coronial inquest <https://arstechnica.com/tech-policy/2022/09/coroner-lists-instagram-algorithm-as-contributing-cause-of-uk-teens-death/> into the self-inflicted death of a young teen who had been viewing far too many posts on Pinterest and Instagram promoting self-harm. Pinterest at least had the humility to admit that its moderation system was not satisfactory and needed to be improved. But Meta (parent company of Instagram and Facebook) sent its “head of health and well-being” to testify that the content was classified as “safe” according to its standards.

Free speech between mature adults is one thing; keeping impressionable youngsters safe is quite another.

On 01/10/2022 12.00, Lawrence D'Oliveiro wrote:
Came across yet another crop of articles recently which painfully highlight the problems with having social-media companies manage their own moderation.
It has been a while since I studied Latin (the pain has (almost) receded), but surely describing, or even imagining, such organisations as watchmen/guards/custodians/caretakers is the larger part of the problem - yet this is exactly our government's attitude: 'we can trust them to do right...'!

See also:
- New Zealanders' personal data held by government agencies is quite possibly stored 'in the cloud' off-shore, by organisations primarily subject to other jurisdictions rather than New Zealand's Privacy or other applicable legislation.
- school children whose 'education' is being tracked, similarly with no opportunity to opt out.

Question: Do these corporates' operations better align with 'the American Dream' or the Kiwi sense of 'fairness'? Of the two, who is 'dreaming' most/worst?

--
Regards =dn

On Sat, 1 Oct 2022 12:22:09 +1300, DL Neil wrote:
... surely describing, or even imagining, such organisations as watchmen/guards/custodians/caretakers, is the larger part of the problem - yet this is exactly our government's attitude: 'we can trust them to do right...'!
There is something to be said for regulating with a light-touch, or taking a hands-off attitude, in the early days of some new industry. I think even ten years ago, the social-media platforms could claim the benefit of the doubt in terms of their impact (positive or negative) on society. But those days are past.
- New Zealanders' personal data held by government agencies is quite possibly stored 'in the cloud' off-shore, by organisations primarily subject to other jurisdictions rather than New Zealand's Privacy or other applicable legislation.
- school children whose 'education' is being tracked, similarly with no opportunity to opt out.
We need our own Max Schrems ...