Facebook Battles Election Interference, Internal Criticism
Departing CSO Urges Major Culture Change - More Transparency, Storing Less Data
Editor's note: On July 26, Facebook held an earnings call in which it offered greater detail on its security and privacy plans and investments, and previewed their impact on the company's long-term profitability (see Facebook's Security and Privacy Overhaul Comes at a Price).
Facebook has been rocked by revelations that its platform was abused by nation-states and others as part of an effort to interfere in elections and political discourse around the world.
In response, the company has promised to hone its "news feed" to flag fake news and apply artificial intelligence to the problem of nation-state actors. But at least one insider suggests that wholesale cultural changes, especially involving data privacy practices, will be necessary for Facebook to remain viable.
In recent months, Facebook has been the focus of multiple federal probes - as well as an investigation by the U.K. data privacy watchdog - over its mishandling of tens of millions of users' private details, which ultimately ended up in the hands of Cambridge Analytica, among others (see Facebook: 87M Accounts May Have Been Sent To Cambridge Analytica).
In the wake of the scandal over Cambridge Analytica and others obtaining Facebook users' personal data and potentially using it to target them with advertising and disinformation campaigns, Facebook CEO Mark Zuckerberg appeared before Congress in April to answer questions. But many privacy watchers have suggested that Congress is unlikely to issue any new laws as a result (see Senators Raise Issue of Regulating Facebook).
Nevertheless, Facebook faces ongoing pressure to ensure that its platforms are no longer weaponized against its users. In Europe, Facebook must also continue to prove that it is sufficiently safeguarding users' privacy and complying with the EU's General Data Protection Regulation, which mandates that companies not store personal information for longer than necessary.
Facebook Promises Machine Learning
In a Tuesday call with reporters, Facebook officials said the company is using a variety of techniques, including applying artificial intelligence, to battle efforts by the Russian government and others to use the social network as a platform for information warfare, including election interference campaigns, Reuters reports.
But it remains to be seen whether artificial intelligence - in this case, better described as machine learning - will be effective at countering fake news and information warfare campaigns being run by nation-states.
On the Tuesday call, Tessa Lyons, manager of Facebook's core "news feed," told reporters that the company is continuing to refine its fact-checking process. Once a story is labeled as false, she said, users see a warning if they try to share it, which has led to an 80 percent decrease in the distribution of such stories, Reuters reported.
No 'Mass Exodus' Predicted
Despite the ongoing Cambridge Analytica data scandal and information warfare reports, users don't appear to have exited the social network. In March, market researcher Forrester said that based on its consumer research, it predicted there would be no "mass exodus" by either users or advertisers, especially as Facebook came into compliance with GDPR, which has been enforced since May.
Forrester noted that users are alert to social networks being used to convey fake news. And from a business standpoint, "Facebook's family of Instagram, FB Messenger, and WhatsApp is alive and well," Forrester said. "In the long term, we aren't saying it's business as usual. Facebook will have to clamp down on user data privacy - and with new GDPR parameters."
CSO: 'Be Willing to Pick Sides'
Some senior officials inside Facebook have been urging CEO Mark Zuckerberg to overhaul the company's culture.
CSO Alex Stamos, in an internal memo posted on March 23, just days after the Cambridge Analytica scandal broke, said "tens of thousands of small decisions made over the last decade" had left the company in its current position.
"We need to build a user experience that conveys honesty and respect, not one optimized to get people to click yes to giving us more access," Stamos wrote in his memo, which was first published on Tuesday by BuzzFeed. "We need to intentionally not collect data where possible, and to keep it only as long as we are using it to serve people."
Stamos said Facebook needed to get better at listening - not only to users but also its own employees - especially "when they tell us a feature is creepy or point out a negative impact we are having in the world."
The CSO, who is set to depart Facebook later this year, said that the privacy enhancements needed to be made at the expense of short-term profit gain and that executives needed to be willing to take a clear moral stand. "We need to be willing to pick sides when there are clear moral or humanitarian issues. And we need to be open, honest and transparent about our challenges and what we are doing to fix them," Stamos said.
Zuckerberg Sparks Further Controversy
When it comes to combating fake news and hate speech, it's still not clear how proactive Facebook might be, what rules it has set for itself or whether they will be broadly effective.
Last week, for example, in an interview with Recode, Zuckerberg sparked controversy after suggesting that posts denying the Holocaust shouldn't be removed from the site.
"I'm Jewish and there's a set of people who deny that the Holocaust happened," he told Recode. "I find it deeply offensive. But at the end of the day, I don't believe that our platform should take that down because I think there are things that different people get wrong. I don't think that they're intentionally getting it wrong. ... Everyone gets things wrong, and if we were taking down people's accounts when they got a few things wrong, then that would be a hard world for giving people a voice and saying that you care about that."
After Recode's report ran, Zuckerberg attempted to clarify his remarks. "Our goal with fake news is not to prevent anyone from saying something untrue - but to stop fake news and misinformation spreading across our services," he told Recode. "And of course if a post crossed the line into advocating for violence or hate against a particular group, it would be removed."
Chaos in Myanmar
Facebook, however, has its work cut out for it as it attempts to battle abuse of its platforms around the world.
Intelligence agencies in the Czech Republic, France, Germany, the Netherlands and other European countries have been warning of Russian efforts to use Facebook - among other platforms - as an outlet for propaganda and targeted campaigns designed to sow chaos and undermine their political systems.
Facebook has also been a vehicle for hate speech in other parts of the world, including in Myanmar - aka Burma - surrounding the mass migration known as the Rohingya crisis. Researchers said they found extensive anti-Muslim campaigns being run via the site and blamed the hate speech and fake news for stoking "chaos" and confusion in the country, the Guardian reported.
Scant Time Before US Midterm Elections
The U.S. midterm elections in November, meanwhile, are rapidly approaching. The country's intelligence chiefs have warned that not only did Russia attempt to interfere in the 2016 elections, but its efforts never stopped. So with only about three months to go until the elections, Facebook's efforts may be too little, too late.
And while the Department of Homeland Security has begun briefing states on attack trends and Congress has recently allocated funding to help, some states say that the assistance is insufficient and arriving too slowly. They also say that they're getting no guidance on countering information warfare operations by Russia (see Will Congress Lose Midterm Elections to Hackers?).
Top Facebook Attorney to Exit
Stamos isn't the only senior executive inside Facebook who's announced his exit.
On Tuesday, Facebook's top attorney, Colin Stretch, announced in a Facebook post that he'll be leaving the company, which he joined in 2010, by the end of this year.
Stretch said he'd relocated with his family to Washington several years ago, but suggested that the changes being put in place by Zuckerberg require the head of the company's legal department to be based at headquarters.
"As Facebook embraces the broader responsibility Mark has discussed in recent months, I've concluded that the company and the legal team need sustained leadership in Menlo Park," Stretch said.
Stretch has been an unusually visible general counsel. Last fall, he was the only representative from Facebook to appear before congressional committees that were investigating alleged Russian interference in the 2016 U.S. elections and beyond (see Senate Grills Tech Giants Over Russian Fake News).
Stretch's departure announcement follows that of Elliot Schrage, who headed Facebook's public policy and communications efforts until he announced his departure last month.