Recently I wrote you with some serious concerns about your company’s unclear, frequently unfair, and entirely unfortunate practice of flagging promoted posts on the grounds that they are “political.” Here’s some of what I said:
Once Facebook deems your post to be “political” in nature, they will not let your promoted post run until you agree, through an automated authorization form, to register as a political organization. The form has proven to be easily exploitable, and what’s more, Facebook’s decision to err on the side of over-labeling content as “political” has led to many organizations being asked to classify themselves as something they are not.
I haven’t heard from you, so I doubt you read it. That’s understandable. I know you’re a busy guy, and you’ve got a lot of fires to put out these days. But a lot of other people did read it, and they had a lot to say about it.
In fact, nothing we’ve ever published here at CreativeFuture has received as much engagement on Facebook as our letter about political ad filtering… on Facebook.
Clearly, we’ve struck a nerve with Facebook users, and it’s not difficult to see why.
Rather than develop a filtration policy that targets truly bad political actors – rogue foreign governments, Nazi sympathizers, racist organizations, and the like – with nuance and precision, you have elected to double down on a semi-automated catch-all that is suppressing legitimate voices… including ours!
Your community of more than two billion people increasingly needs you to use a scalpel to carve out systemic problems that threaten us all – yet you keep hitting us over the head with a hammer. It is not a good look, and it is eroding our already iffy trust in your company.
But this letter is not to rehash my arguments. More than 700 people (and counting) saw our piece about the tenuous line between content moderation and censorship and decided to join the conversation. Since you haven’t joined it yet, I thought I might let you know what they’ve been saying.
You might want to know, for instance, that our follower Donny Lombardi said, “I use Facebook but I don’t trust it,” and that follower Winnie Hancock reflected, “Zuckerberg & FB have only one allegiance: their own self-interest. Not leaning left or right. Simply inward.” And, I should tell you that Vicki Shea thinks that “Facebook censors everyone” and that, generally speaking, she and her fellow members “have no voice anymore” on your platform.
Ouch. But then, that’s the impression you give when you censor, say, paid ads because they somehow pertain to vague “issues of national importance” such as “crime,” “energy,” and “health.” Yes, every decent person wants you to rid the platform of hate mongers and election meddlers – but no, we don’t want you to throw legitimate content out the window because it somehow got caught by some half-assed algorithm. That’s when people like Vicki don’t feel like they have a voice anymore – and they are right.
Our follower Elwood Wells told us he had a “photo tagged as inappropriate. It was a picture of snow with the line, ‘remember this when complaining about the heat.’” Why was this tagged as inappropriate? Our best guess is that your cyborg moderator saw the words “complaining” and “heat” in Elwood’s caption and assumed he was sparking some controversial discussion about climate change. That would be incorrect – but who gets to argue with a cyborg?
Such are the perils of a vague, poorly executed, largely human-free content moderation strategy! Clarity takes a backseat to confusion. As our follower Dean Fiora put it, “Facebook taught me that nipples are worse than Nazis.” Linda Gardner added that “There is no rhyme or reason to any of FB’s policing behavior,” while Ray Ganong wrote, “How do you [moderate content at] a company serving 3 billion users posting 19-20 items each a day? The company is too big.”
We can’t attest to the accuracy of Ray’s posting statistics, but his sentiment is spot-on. A lot of people are saying you’re “too big.” Of course, you still haven’t hired the number of content moderators that such a humongous platform needs. But why not? It’s not as if you guys aren’t hugely profitable!
“Obviously,” commented our follower Kathleen Marie Cook, all this unfair censoring of ads and other content “could have been avoided with a little human interaction and solution-oriented customer service.”
We couldn’t agree more! A little human interaction and solution-oriented customer service is really not too much to ask from one of the world’s most profitable companies. But is it impossible? We hope not – because the most frustrating part about all this is that when our content does get wrongly flagged, there’s nothing we can do about it! There is nobody we can talk to, no one who will help us figure out why our content was caught in Facebook’s fishing net.
The closest thing we’ve gotten to an answer is to be told that if we want to run “political” ads, we should “become authorized” by registering on Facebook as a “political” organization. Well, we aren’t running political ads, and we’re not a political organization, and neither are millions of others who, like us, have seen their otherwise legitimate posts blocked on political grounds.
Sigh. Maybe another of our followers, Mike Hatcher, had it right when he commented, “When you can lie under oath and not be charged, odds are it doesn’t matter how you treat your customers.” He may be referring to your Congressional hearing back in April of last year, when, it’s been alleged, you made false claims under oath about things like, oh, how much control users really have over their data.
But who’s going to hold Facebook accountable? The government doesn’t seem to be able to figure out how to do it. But how about your advertisers?
Just look at what Marie Barile Matzek had to say: “The sheer amount of ads [on Facebook] doesn’t make me want to buy stuff… just the opposite.” Or Keith Vincent: “I purchase nothing I see on Facebook. If I can’t trust the platform to have integrity, I can’t trust the advertisers.”
Yikes! If users feel this way, it sure doesn’t seem like advertisers are going to have confidence in your platform going forward, does it?
Of course, even as more and more people lose trust in you, you keep raking in the profits. Even scandal upon scandal hasn’t kept your immense market value down for long. But read the tea leaves.
We’re advertisers, too, of course, through our “promoted posts.” And since you operate the dominant social platform, we may have the same choices for the moment that other advertisers have – i.e., none. But I know CreativeFuture sometimes wishes we could spend our ad dollars elsewhere, and I can’t imagine we’re alone.
I’ll leave you with one last comment, Mark, from one Tricia Ward: “I use Facebook for dog rescue,” she told us. “So many lives have been saved this way.”
I just wanted to end on a positive note, a reminder that it’s not all rotten out there in Facebook land. Unlike certain content moderation systems, we’re trying not to lose sight of the good things as we seek to fix the bad. But the bad things have got to be brought under control!
Header Photo: Frederic Legrand – COMEO / Shutterstock.com / CreativeFuture