The Oversight Board says Facebook “lost” an important rule for three years


Facebook “lost” an important piece of internal guidance for three years and only noticed after the Oversight Board took up the issue, according to the board’s latest decision. In the decision, the board questioned Facebook’s internal processes and said the company should be more transparent about whether other key policies may also have been “lost.”

The underlying case stems from an Instagram post about Abdullah Öcalan in which the poster “encouraged readers to talk about Öcalan’s incarceration and the inhuman nature of solitary confinement.” (As the Oversight Board notes, Öcalan is a founding member of the Kurdistan Workers’ Party, which Facebook officially designates a “dangerous organization.”)

Facebook initially removed the post because its rules prohibit users from praising or expressing support for dangerous organizations or individuals. However, Facebook also had “internal guidelines,” some of which arose from earlier discussions about Öcalan’s imprisonment, that “enable discussions about the conditions of detention for persons classified as dangerous.” That rule was never applied, even after the user’s initial appeal. Facebook told the board it “accidentally failed to carry over” this part of its policy when it moved to a new review system in 2018.

Although Facebook had already acknowledged the mistake and restored the post, the board said it was “concerned” by how the case was handled and that a significant policy exception had effectively fallen through the cracks for three years.

“The board is concerned that Facebook lost specific guidance on an important exemption for three years,” the group wrote. “Facebook’s policy of removing content that includes ‘support’ for certain people, while hiding important exceptions from the public, has allowed this error to go unnoticed for an extended period. Facebook only learned that this guidance was not being applied because the user decided to appeal the company’s decision to the Oversight Board.”

The board also faulted Facebook for not being transparent about how many other users may have been affected by the same issue. Facebook told the board it was “technically not feasible” to determine how many other posts were removed in error. “The actions taken by Facebook in this case indicate that the company does not respect the right to remedy and violates its corporate human rights policy,” the board wrote.

The case shows how Facebook’s complicated rulebook is often shaped by internal guidance that users cannot see, and the Oversight Board has repeatedly urged the company to make all of its policies clearer to users.

Although it has taken up only a handful of cases so far, the Oversight Board has repeatedly criticized Facebook for failing to follow its own rules. “You can’t just invent new unwritten rules when it suits you,” board co-chair Helle Thorning-Schmidt told reporters after the board ruled that Facebook was wrong to impose an “indefinite” ban on Donald Trump. The board has also criticized Facebook for failing to tell users about key parts of its policies, such as its “satire exception,” and has pushed the company to clarify its rules on hate speech and how it handles speech from politicians and other high-profile figures.

Facebook now has 30 days to respond to the Oversight Board’s decision, which includes several recommendations to further refine its “Dangerous Individuals and Organizations” policy and to update its transparency reporting.

