
Facebook admits "serious mistake" after Edwin Tong questions failure to remove hate speech post in Sri Lanka




LONDON: A Facebook executive has acknowledged that the company "made a mistake" in failing to remove a post inciting hatred in Sri Lanka, at an international hearing on fake news and disinformation held in London on Tuesday (Nov 27).

Mr Richard Allan, Facebook's Vice President of Policy Solutions, was questioned by Singapore MP Edwin Tong over a post written in Sinhalese in March, which called for all Muslims to be killed. Mr Tong asked if the post violated the social media company's terms of use.

Mr Allan agreed that it did.

"It was put up at a time when there were significant tensions between the Sri Lankan people and Muslims, which caused damage to property, deaths as well, damage to mosques, and eventually resulted in the Sri Lankan government declaring a state of emergency. Would you agree?" Mr Tong said.

Mr Allan replied, "Yes."

"Would you agree that, in the context of the kind of tensions occurring in Sri Lanka, leaving such a post up would fan those tensions, inflame them even further and divide society?" Mr Tong asked.

Mr Allan replied: "Yes, it's high priority content for us to remove."

When Mr Tong asked why Facebook had declined to take down the post, even after it was flagged by Sri Lanka's communications minister, Mr Allan said it was a "simple mistake" by a Facebook employee.

Mr Tong pressed the point, arguing that it was no mistake, as Facebook had responded to a user that the post did not violate its community standards.

Mr Allan disagreed. "It was a mistake," he said. "I just want to be clear that someone made a mistake in the review."

He also disagreed with Mr Tong's follow-up question of whether this case showed that Facebook "cannot be relied upon to make the right assessment" about what can be displayed on the platform.

"We make mistakes … serious mistakes; our responsibility is to reduce the number of mistakes," he said.

"We are investing heavily in artificial intelligence, precisely so that we can build a dictionary of hate speech terms in all languages."

He added: "The best way to solve this is a dictionary of hate speech terms in Sinhalese, which is surfaced to a Sinhalese-speaking reviewer, who can make sure we do the job properly."

Mr Tong replied: "Mr Allan, in this case, while one excuse may be that your users or reviewers do not understand Sinhalese, you have the communications minister of Sri Lanka telling you that this is hate speech and asking you to take it down. It was your people reviewing it - you said hundreds of thousands of people are reviewing it - but they do not seem to follow the same philosophy you have expressed in your own policy."

The post only stopped circulating after the Sri Lankan government blocked Facebook.

When Mr Tong asked if governments would have to resort to such measures to address the problem of deliberate online falsehoods, Mr Allan said Facebook would "prefer not".

"This is where I think the openness has to be … and I hope you have a constructive relationship with my colleagues in Singapore who work on these issues," said Mr Allan.

"I want us to be in a position where we share the good and the bad, about how we think we are doing, in full expectation that you will always push us for better."

To this, Mr Tong said: "We look forward to that, because what happened, for example in Sri Lanka - and there are several other examples too - should never be allowed to happen again."

Mr Allan said: "No, and as an employee of Facebook, I am ashamed that things like this happen - and they do, and they should not."

ON FACEBOOK AND ELECTIONS

Mr Tong was joined at the London hearing by two other parliamentarians from Singapore, Mr Pritam Singh and Ms Sun Xueling. Mr Singh asked Mr Allan what Facebook was doing to counter the prospect of elections being "manipulated".

Mr Allan said that "for every important election", Facebook now creates a "war room" - a task force of specialists whose job is to understand the risks around that election and deploy the necessary tools and technologies to address those risks.

Mr Singh asked if elections in smaller countries would also be covered by Facebook's "war room" concept.

"In an ideal world, it would be every election, everywhere, all the time. Our current resources, I believe, allow us to look at all national elections," said Mr Allan.

"So if there is a national election in Singapore, it would be covered, for example."

He added: "We had a similar task force around the Latvian election. So we look at every election, whether the country is large or small, at the national level. The question then is whether we can expand this to regional and local elections."

Mr Singh also asked if Facebook would consider working with local electoral authorities and political party representatives to remove or flag posts that could undermine the political process.

"We think it's important. And again I want to repeat … the people who decide whether the election is free and fair are you, your authorities and the political parties," said Mr Allan.

"So we want to do what is necessary for everyone to have confidence that the election is free and fair - and we cannot do that on our own.

"We can build tools, we can work with you, but in the end we must engage with you to meet the shared goal that we are a positive, and not a negative, influence on elections in your country."
