How War in Ukraine Roiled Facebook and Instagram

Meta, which owns Facebook and Instagram, took an unusual step last week: it suspended some of the quality controls that ensure posts from users in Russia, Ukraine and other Eastern European countries comply with its rules.

Under the change, Meta temporarily stopped tracking whether its workers who monitor Facebook and Instagram posts from those areas were accurately enforcing its content guidelines, said six people familiar with the situation. That was because the workers could not keep up with the shifting rules about what kinds of posts were allowed regarding the war in Ukraine, they said.

Meta has revised more than half a dozen content policies since Russia invaded Ukraine last month. The company has approved posts about the conflict that it would normally remove – including some calling for the death of President Vladimir V. Putin of Russia and for violence against Russian troops – before later reversing course or drawing up new guidelines, the people said.

The result has been internal confusion, especially among the content moderators who patrol Facebook and Instagram for text and images containing gore, hate speech and incitements to violence. Meta has sometimes changed its rules on a daily basis, causing whiplash, said the people, who were not authorized to speak publicly.

The confusion over the content guidelines is just one way that Meta has been roiled by the war in Ukraine. The company has also clashed with Russian and Ukrainian authorities over the information battle about the conflict. And internally, it has dealt with discontent over its decisions, including from Russian employees concerned for their safety and Ukrainian workers who want the company to crack down harder on Kremlin-affiliated organizations online, three people said.

Meta has navigated international strife before – including the genocide of the Muslim minority in Myanmar over the past decade and clashes between India and Pakistan – with varying degrees of success. Now the biggest conflict on the European continent since World War II has become a litmus test of whether the company has learned to police its platforms during a major global crisis – and so far, that effort appears to be a work in progress.

“All the elements of the Russia-Ukraine conflict have long been around: calls for violence, disinformation, propaganda through the state media,” said David Kaye, a law professor at the University of California, Irvine, and a former United Nations special rapporteur. “The mystery to me is that they had no game plan to deal with it.”

Dani Lever, a Meta spokeswoman, declined to comment directly on how the company handled content decisions and employee concerns during the war.

After Russia invaded Ukraine, Meta said it had set up a round-the-clock special operations team staffed by employees who are native Russian and Ukrainian speakers. It also updated its products to aid civilians in the war, including features that direct Ukrainians toward reliable, verified information to find housing and refugee assistance.

Mark Zuckerberg, Meta’s chief executive, and Sheryl Sandberg, its chief operating officer, have been directly involved in the response to the war, said two people familiar with the effort. But as Mr. Zuckerberg focuses on transforming Meta into a company that will lead the digital world of the so-called metaverse, many of the responsibilities around the conflict have fallen – at least publicly – to Nick Clegg, the president of global affairs.

Last month, Mr. Clegg announced that Meta would block the pages of Russia Today and Sputnik, the Russian state-controlled media outlets, within the European Union, following requests from Ukraine and other European governments. Russia retaliated by cutting off access to Facebook inside the country, claiming the company discriminated against Russian media, and later blocking Instagram.

This month, President Volodymyr Zelensky of Ukraine praised Meta for moving quickly to limit Russian war propaganda on its platforms. Meta also acted swiftly to remove a doctored “deepfake” video from its platforms that appeared to show Mr. Zelensky yielding to Russian forces.

The company has also made high-profile mistakes. It allowed a group called the Ukrainian Legion to run ads on its platforms this month recruiting “foreigners” for the Ukrainian army, a violation of international laws. It later removed the ads – which were shown to people in the United States, Ireland, Germany and elsewhere – because the group may have misrepresented its ties to the Ukrainian government, Meta said.

Internally, Meta also began changing its content policies to deal with the fast-moving nature of posts about the war. The company has long banned posts that could incite violence. But on Feb. 26, two days after Russia invaded Ukraine, Meta informed its content moderators – who are typically contractors – that it would allow calls for Mr. Putin’s death and “calls for violence against Russians and Russian troops in the context of the Ukraine invasion,” according to the policy changes, which were reviewed by The New York Times.

This month, Reuters reported on Meta’s shifts with a headline that suggested posts calling for violence against all Russians would be tolerated. In response, Russian authorities labeled Meta’s activities “extremist.”

Shortly afterward, Meta reversed course and said it would not let its users call for the deaths of heads of state.

“Circumstances in Ukraine are fast-moving,” Mr. Clegg wrote in an internal memo that was reviewed by The Times and first reported by Bloomberg. “We try to think through all the consequences, and we keep our guidance under constant review because the context is always evolving.”

Meta has adjusted other policies as well. This month, it made a temporary exception to its hate speech guidelines so users could post about the “removal of Russians” and “explicit exclusion against Russians” in 12 Eastern European countries, according to internal documents. But within a week, Meta tweaked the rule to note that it applied only to users in Ukraine.

The constant adjustments have confused the moderators who oversee users in Central and Eastern European countries, said the six people familiar with the situation.

The policy changes were onerous because moderators were generally given less than 90 seconds to decide whether images of corpses, videos of mutilation or outright calls to violence violated Meta’s rules, they said. In some instances, they added, moderators were shown posts about the war in Chechen, Kazakh or Kyrgyz, despite not knowing those languages.

Ms. Lever declined to say whether Meta had hired content moderators who specialize in those languages.

Emerson T. Brooking, a senior fellow at the Atlantic Council’s Digital Forensic Research Lab who studies the spread of online disinformation, said Meta had been struggling to handle content related to the war.

“In general, content moderation policy is intended to limit violent content,” he said. “But war is an exercise in violence. There is no way to sanitize war or to pretend that it is anything different.”

Meta has also faced employee complaints about its policy shifts. At a meeting this month for workers with ties to Ukraine, employees asked why the company had waited until the war to take action against Russia Today and Sputnik, two attendees said. Russian state activity was at the heart of Facebook’s failure to protect the 2016 U.S. presidential election, they said, and it made little sense that those outlets had continued to operate on Meta’s platforms.

While Meta has no employees in Russia, the company held a separate meeting this month for workers with Russian connections. Those employees said they were concerned that Moscow’s action against the company would affect them, according to an internal document.

In discussions on Meta’s internal forums, which were viewed by The Times, some Russian employees said they had removed their employer from their online profiles. Others wondered what would happen if they worked in the company’s offices in places with extradition treaties with Russia, and “what kind of risks would be posed by working at Meta not only for us but for our families.”

“Our hearts go out to all of our employees who have been affected by the war in Ukraine, and our teams are working to make sure they and their families have the support they need,” Ms. Lever said.

At a separate company meeting this month, some employees voiced unhappiness with the changes made to the speech policies during the war, according to an internal poll. Some questioned whether the new rules were necessary, calling the changes “a slippery slope” that was “used as proof that the West hates the Russians.”

Others asked about the impact on Meta’s business. “Will Russian sanctions affect our revenue for the quarter? Future quarters?” one question read. “What is our recovery strategy?”
