(Reuters) - Facebook Inc's (FB.O) new content oversight board will include a former prime minister, a Nobel Peace Prize laureate and several constitutional law experts and rights advocates among its first 20 members, the company announced on Wednesday.
The independent board, which some have dubbed Facebook’s “Supreme Court,” will be able to overturn decisions by the company and Chief Executive Mark Zuckerberg on whether individual pieces of content should be allowed on Facebook and Instagram.
Facebook has long faced criticism over high-profile content moderation decisions, ranging from the temporary removal of a famous Vietnam War-era photo of a naked girl fleeing a napalm attack to its failure to combat hate speech in Myanmar against the Rohingya and other Muslims.
The oversight board will focus on a small slice of challenging content issues, including hate speech, harassment and people’s safety.
Facebook said the board’s members have lived in 27 countries and speak at least 29 languages, though a quarter of the group and two of the four co-chairs are from the United States, where the company is headquartered.
The co-chairs, who selected the other members jointly with Facebook, are former U.S. federal circuit judge and religious freedom expert Michael McConnell, constitutional law expert Jamal Greene, Colombian attorney Catalina Botero-Marino and former Danish Prime Minister Helle Thorning-Schmidt.
Among the initial cohort are: former European Court of Human Rights judge András Sajó, Internet Sans Frontières Executive Director Julie Owono, Yemeni activist and Nobel Peace Prize laureate Tawakkol Karman, former editor-in-chief of the Guardian Alan Rusbridger, and Pakistani digital rights advocate Nighat Dad.
Nick Clegg, Facebook’s head of global affairs, told Reuters in a Skype interview the board’s composition was important but that its credibility would be earned over time.
“I don’t expect people to say, ‘Oh hallelujah, these are great people, this is going to be a great success’ - there’s no reason anyone should believe that this is going to be a great success until it really starts hearing difficult cases in the months and indeed years to come,” he said.
The board will start work immediately and Clegg said it would begin hearing cases this summer.
The board, which will grow to about 40 members and which Facebook has pledged $130 million to fund for at least six years, will make public, binding decisions on controversial cases where users have exhausted Facebook’s usual appeals process.
The company can also refer significant decisions to the board, including on ads or on Facebook groups. The board can make policy recommendations to Facebook based on case decisions, to which the company will publicly respond.
Initially, the board will focus on cases where content was removed, and Facebook expects it to take on only “dozens” of cases at first, a small percentage of the thousands it expects will be brought to the board.
“We are not the internet police, don’t think of us as sort of a fast-action group that’s going to swoop in and deal with rapidly moving problems,” co-chair McConnell said on a conference call.
The board’s case decisions must be made and implemented within 90 days, though Facebook can ask for a 30-day review for exceptional cases.
“We’re not working for Facebook, we’re trying to pressure Facebook to improve its policies and its processes to better respect human rights. That’s the job,” board member and internet governance researcher Nicolas Suzor told Reuters. “I’m not so naive that I think that that’s going to be a very easy job.”
He said board members had differing views on freedom of expression and when it can legitimately be curtailed.
Board member John Samples, a vice president of the libertarian Cato Institute, has praised Facebook’s decision not to remove a doctored video of U.S. House Speaker Nancy Pelosi. Sajó has cautioned against allowing the “offended” to have too much influence in the debate around online expression.
Some free speech and internet governance experts told Reuters the board’s first members were a diverse, impressive group, though a few were concerned it was too heavy on U.S. members. Facebook said one reason for that was that some of its hardest decisions or appeals in recent years had begun in America.
“I don’t feel like they made any daring choices,” said Jillian C. York, the Electronic Frontier Foundation’s director of international freedom of expression.
Jes Kaliebe Petersen, CEO of Myanmar tech-focused civil society organization Phandeeyar, said he hoped the board would apply more “depth” to moderation issues, compared with Facebook’s universal set of community standards.
David Kaye, U.N. special rapporteur on freedom of opinion and expression, said the board’s efficacy would be shown when it started hearing cases.
“The big question,” he said, “will be, are they taking questions that might result in decisions, or judgments as this is a court, that go against Facebook’s business interests?”
Reporting by Elizabeth Culliford in Birmingham, England; Editing by Tom Brown and Matthew Lewis