
AI can't fix Facebook

BERLIN — Facebook is failing to contain hate speech. Nowhere is this more apparent than in Myanmar, where the company's systems miss thousands of posts that incite violence against the Rohingya community, which was the target of acts of genocide last year.

Facebook's struggles in conflict countries are the result of its own business model and highlight the limits of applying tech solutions to societal problems. The company's unwillingness to pay attention to the political context of the countries in which it operates, and its insistence on artificial intelligence where it should rely on human solutions, can only end badly.

When violence broke out in Myanmar last August, Democracy Reporting International monitored an influential Facebook group run by supporters of a well-known extremist Buddhist group called Ma Ba Tha. The group had been relatively quiet for some time, but sprang into action within hours of attacks perpetrated by the Arakan Rohingya Salvation Army (ARSA). The group's Facebook posting patterns — and the 470,000 shares of posts that either explicitly or implicitly called for the expulsion of Rohingya people from Myanmar — suggested that it was using the attacks as an opportunity to spread hatred online against the Rohingya, who soon began fleeing the country.

Basic knowledge of Myanmar, a country with a long history of ethnic and religious conflict, would have made it obvious that the group should be monitored. Surprisingly, this was not obvious to Facebook, which shut down the group only eight months later, in April.

Facebook has a track record of moving aggressively into developing countries and offering a stripped-down web service through a limited Facebook app. This was its strategy in Myanmar until 2017, and it was so successful that for many people Facebook and the open internet became indistinguishable. But what looked like a commercial dream quickly became a nightmare in a country divided along many ethnic lines and with little experience of free speech after a decades-long dictatorship.


The problem is that Facebook has not opened offices in many of the developing countries where it dominates the public sphere. It has no presence in Myanmar, where it boasts more than 14 million users, and there's no one on the ground to notice when things go wrong. Nor does Facebook listen to those who might help. When activists warned the company of problems with its product in Myanmar well ahead of the military's “clearance operations” last year, the company waited until Western media highlighted the same issues before taking action.

The problem is compounded by Facebook's belief that every problem can be solved with a purely technological solution. But just as these solutions failed in Myanmar, they are likely to fail in many conflicts to come.

Facebook relies on artificial intelligence to find and root out hate speech that violates its so-called community standards. This approach may seem fair, because it is blind to who says what and simply looks for certain words and word combinations. It can work in certain circumstances. But in Myanmar, Facebook has already discovered the limits of this approach.

When it removed posts that contained a specific word, the result was a lot of needless collateral censorship. A vast amount of hate speech also fell through the cracks because of technical issues: Facebook users in Myanmar posted in two different fonts, for example, each with its own text encoding, and shared hateful, dehumanizing images and videos, which remain beyond the reach of automated detection.
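To see why keyword matching breaks down across encodings, here is a minimal sketch, assuming a naive substring-based blocklist. This is illustrative only, not Facebook's actual system, and the strings are hypothetical placeholders for the two code-point sequences that render as the same Burmese word in the legacy Zawgyi font and in standard Unicode.

```python
# Illustrative sketch only; not Facebook's actual pipeline.
# The point: a keyword filter keyed to one text encoding misses the identical
# word typed in another. UNICODE_FORM and ZAWGYI_FORM are hypothetical
# stand-ins for two different code-point sequences that render as the same
# Burmese word on screen.

UNICODE_FORM = "term-as-unicode-codepoints"   # placeholder for the Unicode spelling
ZAWGYI_FORM = "term-as-zawgyi-codepoints"     # placeholder for the legacy Zawgyi spelling

BLOCKLIST = {UNICODE_FORM}  # blocklist built from reports written in Unicode only


def flags_post(post: str) -> bool:
    """Naive substring check against the blocklist."""
    return any(term in post for term in BLOCKLIST)


print(flags_post(f"... {UNICODE_FORM} ..."))  # True  - caught by the filter
print(flags_post(f"... {ZAWGYI_FORM} ..."))   # False - reads the same to a person, invisible to the filter
```

Images and video sidestep text matching entirely, which is why they slip past this kind of filter altogether.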


An actor-blind approach is clearly attractive to Facebook because it means the company cannot be accused of political bias. The promise is that AI will treat all offenses equally. But this simply does not work in practice.

The United Nations has adopted binding guidance on businesses' responsibility to respect human rights and has made clear that companies must track the human rights impact of their products and remedy serious problems as quickly as possible. Facebook CEO Mark Zuckerberg told the U.S. Congress that artificial intelligence would solve the problem of accurately tracking hate speech in five to 10 years. But that is not a speedy remedy — it is an admission of prolonged corporate negligence.

There may be tech solutions to these issues, but as long as they don't work, Facebook and other platforms have a moral — as well as legal — obligation to invest in human solutions.

Facebook should open offices in every country where it has a significant operation, and should invest in classical conflict and political analysis. It should move quickly to shut down groups or pages that manifestly overstep community standards and fuel violent conflict, and do more to support local ecosystems of fact-checkers and social media monitors to reduce the possibility of abuse on its own platform.

The tech giant likes to say it is in the business of creating a better world, but it is a profit-making operation like any other. It does not need to become an NGO. It simply needs to accept the responsibilities of any large business in a volatile and violent world.

Ray Serrato and Michael Meyer-Resende work for Democracy Reporting International, a Berlin-based NGO supporting political participation.
