Facebook isn't very good at selling you things on behalf of its advertisers, so the company gathers as much data on you as possible and uses it to keep you clicking as much as possible, in the hopes of eventually scoring a hit with its targeting system. That means it often commits unwitting, but utterly predictable, acts of algorithmic cruelty.
The best example of this is Facebook's "memories" and "celebration" tools, which memorialize the anniversaries of content that attracted a lot of attention from the people in your social graph. Often this is stuff that's legitimately good news, like graduations and other moments of personal triumph, but inevitably, it also includes the most tragic moments in your life. In 2014, Eric Meyer described how Facebook greeted him on the anniversary of his young daughter's death with a picture of her surrounded by dancing figures. Facebook promised to fix it, and never did: last year, the company woke up Patrick Gerard on the anniversary of his mother's funeral with an animation of cartoon characters literally dancing on her grave.
But it's not just grief that inappropriately triggers Facebook's celebration/memories algorithms. Terror recruiting groups that have scored major PR wins (often because they've won a key battle by murdering their enemies, or because they've posted a particularly grisly atrocity video) are getting extra mileage out of their victories, courtesy of Facebook's algorithms.
According to a five-month-long, 3,000-page National Whistleblowers Center study of terror groups on Facebook, the celebration/memories algorithm is auto-generating anthology pages that celebrate and frame the most effective terror messages created by extremists, giving them much-needed help in promoting their message to their base.
The National Whistleblowers Center said it filed a complaint with the US Securities and Exchange Commission on behalf of a source who preferred to remain anonymous.
"Facebook's efforts to stamp out terror content have been weak and ineffectual," read an executive summary of the 48-page document shared by the center.
"Of even greater concern, Facebook itself has been creating and promoting terror content with its auto-generate technology."
Survey results shared in the complaint indicated that Facebook was not delivering on its claims about eliminating extremist posts or accounts.
Whistleblower Says Facebook Generating Terror Content [AFP/Security Week]
(via /.)