Mass shooting in New Zealand spurs calls for moderation on social media

A series of anti-Muslim terror attacks in Christchurch, New Zealand has horrified the world today, in no small part because the terrorists tailor-made the atrocity to go viral across multiple social media platforms. In the aftermath of the tragedy, big tech needs to examine whether it is doing enough to limit the spread of hate online.

The terrorists targeted mosques in a series of shootings that left 49 dead and many more wounded. Several people have already been arrested, and one man has been charged with murder. A gunman identifying himself as an Australian and espousing racist anti-immigrant views live-streamed the massacre at the Al Noor mosque on Facebook, and video of the shootings was also posted to YouTube, Twitter, and Instagram.

The videos, along with a manifesto that appeared on the online message board 8chan (itself an infamous anything-goes hangout for right-wing extremists), displayed both a deep knowledge of right-wing extremist Internet culture and a social media savviness calculated to attract as much attention online as possible.

A shooting for social media

The shooter's video, for example, has him remind viewers, "Remember, lads, subscribe to PewDiePie," referring to megastar YouTuber Felix Kjellberg, who has drawn criticism (and praise from right-wing extremists) for promoting anti-Semitism in his videos. Kjellberg has galvanized his 89 million followers to evangelize his channel so that it stays the most popular on YouTube, and because the killer mentioned him, he has been forced to condemn the attack publicly, spreading the news to his huge following, many of whom are children.

The manifesto attached to the shootings also name-drops other cultural phenomena designed to appeal to Internet-savvy youth; even Fortnite gets a shout-out. More significantly, it is chock-full of extreme right-wing racist talking points, referencing "white genocide" and the "14 words," a neo-Nazi mantra, and it expresses admiration for other white supremacist killers. Anyone with a passing familiarity with online communities or political pundits may recognize these talking points: buzzwords that are common in the 'alt-right' videos all too likely to show up in your YouTube recommendations, and in the same hate communities that plague Twitter, toxic subreddits, and 8chan's infamous /pol/ board.

The title of the manifesto, 'The Great Replacement', itself references a YouTube video by right-wing pundit Lauren Southern. In the aftermath of the shooting, Southern's video was taken down for a few hours before being made public again.

The responsibility of big tech

The shooter or shooters wanted to stay in the news, and they could not have done a better job if they had their own SEO team. Both the video and the manifesto have spread far and wide online, and despite efforts from online platforms to suppress them, they keep popping up again and again. YouTube and Twitter host multiple copies of the video, while file-sharing sites with little to no oversight host the manifesto.

There's no way to highlight this without first discussing the atrocity itself, but the phenomenon is illustrative of a criticism long leveled at big tech platforms: moderation is woefully inadequate, to the point of being harmful. A video uploaded to Facebook or Twitter auto-plays by default, something advertisers love but which left people involuntarily exposed to this horror and helped the terrorists achieve their goal of attention. Once the material is out there, there are no controls to stop people from uploading it again and again. And algorithms designed to help content go viral and reach as many people as possible (again, for the benefit of advertisers) can be exploited and hijacked to spread hate.

YouTube has had a problem with hosting and amplifying racist content for years, but it has never had a good moderation policy, and its algorithms have previously come under fire for facilitating child exploitation.

Facebook, on the other hand, under-pays, under-informs, and over-works its moderators to the point of complete ineffectiveness. On Twitter, the attacks were praised by white supremacist accounts that persist on the platform despite repeated calls for the social media giant to ban neo-Nazis. The frantic game of whack-a-mole the social media giants are now playing to suppress these videos would not be necessary if they had not prioritized growth and attention above all else.

Social media platforms are finely tuned social engineering machines designed to focus mass attention for profit, but we are increasingly shown how these systems can be exploited for hatred, abuse, and murder. The major platforms (Facebook, Twitter, and Google) hold such a monopoly on the spread of information in our connected world that they cannot easily wash their hands of this. It's not just about controlling specific pieces of media like today's video: the enabling of the hate groups these terrorists came from and are calling upon has to stop.

The New Zealand Council of Victim Support Groups has created a relief fund for the victims of the Christchurch shootings. You can donate here.
