OPINION | Zuckerberg's response to criticism of Facebook falls short
- Former employee turned whistle-blower, Frances Haugen, says Facebook knowingly harmed its users.
- Mark Zuckerberg, Facebook co-founder and CEO, denies that it did this.
- Haugen's revelations come amid growing calls to break up the social media group.
At any other time, dealing with a major outage would have been Mark Zuckerberg's biggest problem.
Aside from getting its 2.9 billion users back online after a six-hour outage, the co-founder and CEO of the world’s largest social media group also had to deal with the revelations of former employee turned whistle-blower Frances Haugen, who, in a series of interviews and an appearance before the US Congress, said that Facebook knowingly harmed its users.
Haugen not only testified; she also brought internal documents to prove her case that, when given the choice, Facebook chose profits over child safety and over curbing the spread of misinformation.
“The choices being made inside of Facebook are disastrous for our children or our public safety for privacy and for our democracy. And that is why we must demand Facebook changes,” she told a US Senate subcommittee.
As witnesses go, Haugen is not what Facebook would have wanted. She was able to explain the inner workings of Facebook and expose its shortfalls in a way that was easy to follow.
It was the kind of testimony that could turn notions of breaking up Facebook into action.
Haugen's appearance should not be seen in isolation. Both the left and the right in US politics are losing patience with the group’s ability to sway discourse. It also follows the release of the Netflix documentary The Social Dilemma, which brought to light the negative impact social media is having on society.
If Facebook were to be broken up, last week's events would be central to the story of why it happened. Haugen's testimony would have played a part, but my guess is that how Facebook and Zuckerberg responded to her revelations would have played as big a role.
A force for good
Zuckerberg sensed the group's vulnerability. In response, he made public a note he sent to staff arguing that its research had been “mischaracterised”, that it did care about the welfare of children, and that it was a force for good in the world.
To prove that Facebook was acting in good faith, he said that it was not only committed to doing more research, it was also going to make more of its research publicly available.
He also stressed that the advertising group was not only willing to work with authorities, it was also urging the US Congress to update the legislation that governs the internet.
“I don't believe private companies should make all of the decisions on their own.”
There was also some introspection on the part of Zuckerberg. “I've spent a lot of time reflecting on the kinds of experiences I want my kids and others to have online, and it's very important to me that everything we build is safe and good for kids.”
Though he stresses that he wants his children to be safe, saying so is not the same as asking whether his critics are right.
In essence, Zuckerberg's response to his critics was just a broad explanation of how he and Facebook see themselves.
Looking at what he had to say, I wonder if the decision makers at Facebook have fallen prey to the very thing they have often been accused of doing: creating communities where their opinions are reflected back at them. This is where, instead of encouraging dialogue with opponents, the opponents are turned into the “other” and their arguments are ignored or derided.
“Groupthink” in any organisation is dangerous but also quite natural. In newsrooms for instance, when an outsider questions why something was covered in a certain way, it’s not uncommon to hear a journalist respond to this criticism by saying, “people just don’t get how media works.”
The right questions
In showing Facebook’s research, Zuckerberg is hoping it can get its critics to understand how it works. But maybe that’s the problem. Has Facebook become blind to understanding how it actually works?
To me, the real question is not the bona fides of its research, but rather how ethics are built into Facebook’s decision making. Are there, for instance, any ethical guiderails that push it away from harming society? Are dissenting voices encouraged? Is there an ethics panel that has the power to overrule managerial decisions?
Put another way, what’s the danger of it acting like the space agency, Nasa, where its growing tolerance of risk led to two of its space shuttles being destroyed in flight?
Nasa’s mission is to send people to space and bring them back safely. At its height there were 8,000 people employed in the shuttle programme, all of whom were committed to doing just this.
Despite the amount of resources and people invested in the programme, it was a growing blindness to issues around safety that led to the destruction of the shuttles.
Zuckerberg says Facebook has employed people specifically to fight the spread of harmful content, but like Nasa before it, simply committing people to deal with safety concerns is not enough. Safety in social media is not just about the resources put into limiting the spread of dangerous content; it also requires an examination of possible ethical blind spots.
For Facebook to demonstrate that it takes seriously the troubling concerns that have been raised over the past few years, it must do more than say its detractors are wrong. It must make a real attempt to answer the hard questions posed to it.