
Saturday, 3 November 2018

A human being at Facebook manually approved the idea of targeting ads to users interested in "white genocide"

A year ago, Facebook apologized for allowing advertisers to target its users based on their status as "Jew haters" and blamed an algorithmic system that automatically picked up on the most popular discussions on the platform and turned them into ad-targeting segments.

At the time, Facebook promised to put humans in the loop, creating human-AI centaurs: the AI's lightning-speed assessments and its vast capacity to read and analyze the conversations of billions of people would be subject to oversight by sensible human beings, who would not allow outright fascism and fascist-adjacent categories to surface as ad categories for violent extremists to target for recruitment.

They kept their promise: humans now have to sign off on every ad category that Facebook generates. So when the ad-category AI looked at the myriad Facebook groups devoted to "white genocide" (the conspiracy theory that holds that non-white people are "outbreeding" white people through a mix of unchecked fertility and "interbreeding") -- groups such as "Stop White South African Genocide," "White Genocide Watch" and "The last days of the white man" -- it automatically created a "white genocide" ad-targeting niche.

And then a human Facebook worker approved this category and made it available for use by anyone with an ad purchasing account on the platform.
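The pipeline being described here is simple enough to sketch. Below is a toy version in Python (every name in it is hypothetical; Facebook's actual system is not public): an algorithm mines recurring phrases out of group titles and proposes them as ad categories, and a human reviewer is the only thing standing between a proposal and a live targeting segment.

```python
# A toy sketch of an "algorithm proposes, human disposes" ad-category pipeline.
# Every name here is hypothetical; Facebook's real system is not public.
from collections import Counter
from dataclasses import dataclass

@dataclass
class AdCategory:
    name: str               # a phrase mined from group titles/discussions
    mentions: int           # how often the phrase recurred
    approved: bool = False  # may only be flipped by a human reviewer

def propose_categories(group_titles, min_mentions=2):
    """Turn phrases that recur across group titles into candidate ad categories."""
    counts = Counter()
    for title in group_titles:
        words = title.lower().split()
        # count every single word and two-word phrase in the title
        for n in (1, 2):
            for gram in zip(*(words[i:] for i in range(n))):
                counts[" ".join(gram)] += 1
    return [AdCategory(p, c) for p, c in counts.items() if c >= min_mentions]

def human_review(category, reviewer_approves):
    """The 'guard-rail': nothing goes live unless a person signs off."""
    category.approved = reviewer_approves
    return category

if __name__ == "__main__":
    titles = ["Stop White South African Genocide",
              "White Genocide Watch",
              "The last days of the white man"]
    for cat in propose_categories(titles):
        # A rubber-stamping reviewer reproduces the failure described above:
        # the gate is only as good as the attention of the person running it.
        print(human_review(cat, reviewer_approves=True))
```

The human sign-off in a design like this is load-bearing: if the reviewer rubber-stamps, the "guard-rail" is decorative.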

What's more, Facebook's algorithm was smart enough to suggest some related keywords that someone who wants to reach "white genocide" fans could use: "RedState," "Daily Caller," and a thoroughly debunked conspiracy theory about the plight of white South African farmers that was greatly favored by Robert Bowers, the man who opened fire in a Pittsburgh synagogue last week.
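That "related keywords" feature is a standard affinity-mining trick: if you know which interests co-occur in the same users' profiles and group memberships, you can rank the neighbors of any seed keyword. A toy co-occurrence version (my illustration, with invented data; not Facebook's actual algorithm) looks like this:

```python
# A toy co-occurrence version of "related keyword" suggestion.
# Illustrates the general technique only; not Facebook's actual algorithm.
from collections import defaultdict

def related_keywords(interest_sets, seed, top_n=3):
    """Rank keywords by how often they co-occur with `seed` across users."""
    cooc = defaultdict(int)
    for interests in interest_sets:
        if seed in interests:
            for kw in interests - {seed}:
                cooc[kw] += 1
    return sorted(cooc, key=cooc.get, reverse=True)[:top_n]

if __name__ == "__main__":
    # Hypothetical per-user interest sets, invented for the example.
    users = [
        {"white genocide", "RedState", "Daily Caller"},
        {"white genocide", "Daily Caller"},
        {"gardening", "RedState"},
    ]
    print(related_keywords(users, "white genocide"))
    # -> ['Daily Caller', 'RedState']
```

The hard part, in other words, is not finding the affinities; it's deciding what to do with them.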

After The Intercept bought some ads targeted at "white genocide" users, Facebook finally suspended the category, confirming that "marketers" had used the category to target the group with "news coverage," and also that a human being had signed off on the category's creation to begin with.

There are lots of conclusions to draw from this. Here are a few:

1. Facebook is undertraining and underresourcing the humans who are supposed to serve as "guard-rails" on the algorithmic creation of ad categories. They need to spend more money on this. Possibly a lot more.

2. Facebook's affinity-tracing algorithm is smart enough to identify the places where fascists and white supremacists lurk on its platform: the fact that it knew that Tucker Carlson's Daily Caller was also a haven for white supremacy tells us that Facebook could, if it wanted to, point the human beings whose job it is to monitor discussions for terms-of-service violations straight at those places.

Facebook draws a distinction between the hate-based categories ProPublica discovered, which were based on terms users entered into their own profiles, and the “white genocide conspiracy theory” category, which Facebook itself created via algorithm. The company says that it’s taken steps to make sure the former is no longer possible, although this clearly did nothing to deter the latter. Interestingly, Facebook said that technically the white genocide ad buy didn’t violate its ad policies, because it was based on a category Facebook itself created. However, this doesn’t square with the automated email The Intercept received a day after the ad buy was approved, informing us that “We have reviewed some of your ads more closely and have determined they don’t comply with our Advertising Policies.”

Still, the company conceded that such ad buys should never have been possible in the first place. Vice News and Business Insider also bought Facebook ads this week to make a different point about a related problem: that Facebook does not properly verify the identities of people who take out political ads. It’s unclear whether the “guardrails” that Facebook ads executive Rob Leathern spoke of a year ago will simply take more time to construct, or whether Facebook’s heavy reliance on algorithmic judgment simply careened through them.

Facebook Allowed Advertisers to Target Users Interested in “White Genocide” — Even in Wake of Pittsburgh Massacre [Sam Biddle/The Intercept]

(via /.)
