Facebook has a growing PR problem caused by its many irresponsible business practices. So they are responding boldly – by declaring they are for “regulation of the internet.” Their “Born in ’96” ad campaign is intended to convince all of us that they are a responsible company that supports wise regulatory policies to address the worst problems on the internet.
Have you seen their ads? Difficult to miss if you spend any time online.
Do you believe them? We don’t!
Steering wide to avoid the specifics of their record of scandal and strife, Facebook’s ads are cast with three different users. Each of them was born in 1996, the same year the Communications Decency Act (CDA) went into effect. And each one makes what sounds like a commonsense argument: since the internet has changed dramatically during their lifetime, the laws governing it need to be updated. If only Congress would act, the ads suggest, Facebook and the rest of the internet could clean up their act.
Facebook says that Congress should legislate on user privacy, election interference, data portability, and Section 230 of the Communications Decency Act. Let’s take a look at Facebook’s behavior in each of these areas.
Facebook says it wants “consistent data protection standards that work for everyone,” but its concerns for user privacy are just a façade for its main business model: greedily mining personal data to place targeted ads.
Ad sales account for 98% of Facebook’s revenue. While its service appears “free” to users, the personal information it collects is enormously valuable in the internet economy. In the first half of 2021, targeted ads brought Facebook over $54 billion, according to its first and second quarterly financial reports.
Those are much bigger bucks than what Facebook was pulling in back in 2018, at which time Members of Congress were already looking askance. In Senate testimony, Zuckerberg insisted, “… you control and own the information and content that you put on Facebook.” But Senator Tester (D-MT) had his number: “You’re making about 40 billion bucks a year on the data. I’m not making any money on it. It feels like you own the data.”
In other words, Facebook’s making an awful lot of money by controlling and using valuable property that, they insist, still belongs to you. And they haven’t been such great custodians of it, either…
When your property is stolen from someone else’s possession, you expect to be informed. So, did Facebook let 87 million users know when it learned their data was wrongfully obtained by Cambridge Analytica in 2015?
Nope. News broke in 2018 that Cambridge Analytica paid $800,000 to buy user data from a third-party app. When Facebook found out in 2015, it merely told Cambridge Analytica to delete the data – and pretended to believe they would. Meanwhile, the precious data was being weaponized to influence the 2016 U.S. election.
Afterward, Facebook became a proud supporter of the Honest Ads Act, which would entitle users to know the funding sources of political ads. But Facebook still had a massive political misinformation problem in 2020.
At a March 2021 Congressional hearing, House Energy and Commerce Committee Chairman Frank Pallone drew the inevitable conclusion about Big Tech: “The dirty truth is that they are relying on algorithms to purposefully promote conspiratorial, divisive, or extremist content so they can rake in the ad dollars.”
Facebook says it cares about protecting democratic elections, but (in)actions speak louder than words. Repeatedly, Facebook shows that it only wants to generate as much ad revenue as possible, as quickly as possible, for as long as possible.
It’s not as eager to give up its dirty tricks as it would like to appear.
Facebook says it wants to promote data portability. That sounds nice … except that Facebook is actually trying to deflect growing scrutiny on its business as an illegal monopoly.
If you own something – like, say, your data – then you should be able to take it with you wherever you go. Ever so respectful of the law, Facebook agrees in its pro-regulation campaign: “If you share data with one service, you should be able to move it to another. This gives people choice and enables developers to innovate.”
There’s one major problem: as of today, there are no other social media services comparable to Facebook, especially since Facebook’s size allows it to consume or defeat potential competitors. Senator Lindsey Graham (R-SC) suggested as much in 2018, when he asked Mark Zuckerberg (with a timbre of doubt), “Is there an alternative to Facebook in the private sector?”
Zuckerberg could only answer evasively.
In a 2020 report, the House Antitrust Committee found damning evidence that Facebook had long ago cornered the market in social media. According to one of the social media giant’s internal documents, it surpassed Myspace in 2009. By 2011, Facebook commanded 95% of the time that people were spending on social media.

Facebook pleading with the government for regulation is hollow altruism. It knows full well that if data portability were actually made a reality, countless alternative social media platforms competing for our data would appear overnight. And if we had true choice of where we took our friends and our photos and everything else locked behind Facebook’s gates, wouldn’t many of us make a move to a less toxic platform?
While Zuckerberg likes to list YouTube and Twitter among his competitors, they do not represent viable alternatives to Facebook. As the House Antitrust Committee concludes, none of Zuckerberg’s so-called competitors truly offers “the core functionality of Facebook or its family of products.” And we at CreativeFuture can attest to this from our everyday online activities – there is simply no alternative to Facebook.
Monopoly, in real life, is not a game, but Facebook plays the deflection card like it is. Pleading for data portability is a brilliant tactic – but we’re not buying its sincerity.
In this badly bungled show of virtue, Facebook is actually calling for revisions to the Communications Decency Act. These unspecified changes would, in Facebook’s words, “ensure that tech companies are held accountable for combatting child exploitation, opioid abuse, and other types of illegal activity.”
Even when it became law back in 1996, in the earliest days of the public internet, Section 230 was never meant to permit the heinous acts that Facebook allows to flourish on its platform. We know it, and Facebook knows it. Moreover, Facebook knows we know it – but they keep lying to us anyway.
If Facebook wanted illegal activities on its services to stop, it would stop them. But the ad revenue generated from illegal activities would stop, too. So, how would Zuckerberg then be able to fill his Scrooge McDuck swimming pool?
Facebook vs. Us
Supposedly, the purpose of Zuckerberg’s “free” service is “to try to help connect everyone around the world and to bring the world closer together,” as he told senators in 2018. It has become abundantly clear that Facebook’s true purpose is to generate as much profit as possible, regardless of who it harms.
Virtuously calling for regulation while doing little to change their behavior is a win-win strategy for Facebook. While regulations are delayed, Facebook continues to profit from illegal activity on its platforms. When regulations come, as appears increasingly likely, Facebook can continue to pretend it is part of the solution – instead of the source of the problem.
We’re fairly certain that Facebook has no interest in changing its bad behavior. We’ve been waiting a long time to be proven wrong. Since Facebook says regulations would help, let’s hear from Facebook what issues – specifically – they think could be addressed through regulations, and which new rules they would actually support. One big issue we would nominate for their repair list: the continued rampant piracy of creative content on their platform.
But in the meantime, Zuck, there’s no reason to wait for new laws that may never come (since you and your Big Tech colleagues push back against any meaningful regulation). Put your (unfathomably large amounts of) money where your mouth is, and do something that doesn’t harm our society for once. It’s long past due.