
Facebook minimized internal initiatives designed to weaken extremist content


Doing good by connecting people with one another. Between that stated principle and the reality created by Facebook's algorithms, the gap is wide and deep. Studies conducted in-house around 2017 and 2018 demonstrated this starkly to the social network's leaders. But they minimized the scope of the proposed solutions, when they did not reject them outright, reports the Wall Street Journal. Enough to undermine Donald Trump's recent attacks claiming that Facebook, and social networks in general, are biased against Republican content.

"The most persistent myth about Facebook is that the social network was unaware of the harm it creates," New York Times columnist Kevin Roose commented on Twitter. "In reality, it always knew what that harm was and what damage it could do to society." And for good reason: the network had all the data on its users' behavior at its disposal, unlike researchers outside the company.

Indeed, a sociologist employed at Facebook saw as early as 2016 that there was a problem. She had found that a third of the large groups devoted to German politics were broadcasting racist, conspiratorial, and pro-Russian content. These groups wielded disproportionate influence thanks to a handful of hyperactive members. The study even showed that "64% of people who joined extremist groups did so because of our recommendation algorithms." In conclusion, the sociologist warned that "our recommendation system feeds the problem."

Between 2017 and 2018, a team of engineers and researchers was set up internally to address this problem. It was called "Common Ground." Other entities called "Integrity Teams" were also put in place. The proposed policy was not to push people to change their opinions, but to promote content that creates "empathy and understanding."

The problem: the suggested improvements, such as widening the circle of recommendations, would have decreased user "engagement." In other words, users would have been less active on the social network. The "Common Ground" team accepted this trade-off, describing its own proposals as "anti-growth" and calling on Facebook to "adopt a moral stance."

Fear of criticism from the right

Another discovery by Facebook's researchers: far-right publications had a far more imposing presence on the network than far-left groups. In that context, measures to reduce "clickbait" headlines would have disadvantaged those ultraconservative sites. The "Common Ground" team's proposals were rejected, with Facebook's leaders fearing accusations of censorship from the right.

One proposal was partially followed, however: slightly reducing the influence of hyperactive users. Indeed, some profiles, probably fake accounts used for political propaganda, spent around 20 hours a day on the network and were thus favored by Facebook's algorithms. Pushed to act, notably by the Cambridge Analytica scandal, the American giant has since taken other steps, including promoting content shared by a broad user base rather than only by activists, and, conversely, demoting online sources that spread false news.

"We are no longer the same company"

"We are no longer the same company," argued a Facebook spokesperson quoted by the Wall Street Journal. "We have learned a lot since 2016, created a large 'integrity' team, and strengthened our policies to limit the spread of hateful content." In February, Facebook also launched a $2 million fund to finance academic studies on Facebook's effects on the polarization of society.
