General Discussion
How Facebook got addicted to spreading misinformation
This article provides a serious discussion of why Facebook continues to be a major conduit for misinformation. It also explains why Republicans are trying to redirect the conversation from combatting misinformation to combatting "bias." Of course, a focus on bias tends to treat facts and "alternative facts" as having equal weight, thus promoting a false equivalency under which misinformation can thrive.
https://www.technologyreview.com/2021/03/11/1020600/facebook-responsible-ai-misinformation/
The reason is simple. Everything the company does and chooses not to do flows from a single motivation: Zuckerberg's relentless desire for growth. Quiñonero's AI expertise supercharged that growth. His team got pigeonholed into targeting AI bias, as I learned in my reporting, because preventing such bias helps the company avoid proposed regulation that might, if passed, hamper that growth. Facebook leadership has also repeatedly weakened or halted many initiatives meant to clean up misinformation on the platform because doing so would undermine that growth.
In other words, the Responsible AI team's work, whatever its merits on the specific problem of tackling AI bias, is essentially irrelevant to fixing the bigger problems of misinformation, extremism, and political polarization. And it's all of us who pay the price.
"When you're in the business of maximizing engagement, you're not interested in truth. You're not interested in harm, divisiveness, conspiracy. In fact, those are your friends," says Hany Farid, a professor at the University of California, Berkeley, who collaborates with Facebook to understand image- and video-based misinformation on the platform.
teach1st (5,964 posts)

Thanks for posting this, TomCADem!
But this approach soon caused issues. The models that maximize engagement also favor controversy, misinformation, and extremism: put simply, people just like outrageous stuff. Sometimes this inflames existing political tensions. The most devastating example to date is the case of Myanmar, where viral fake news and hate speech about the Rohingya Muslim minority escalated the country's religious conflict into a full-blown genocide. Facebook admitted in 2018, after years of downplaying its role, that it had not done enough "to help prevent our platform from being used to foment division and incite offline violence."
While Facebook may have been oblivious to these consequences in the beginning, it was studying them by 2016. In an internal presentation from that year, reviewed by the Wall Street Journal, a company researcher, Monica Lee, found that Facebook was not only hosting a large number of extremist groups but also promoting them to its users: "64% of all extremist group joins are due to our recommendation tools," the presentation said, predominantly thanks to the models behind the "Groups You Should Join" and "Discover" features.
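To make the dynamic in that excerpt concrete, here's a minimal toy sketch. This is my illustration, not Facebook's actual system; every post, score, and weight in it is made up. It just shows why a feed ranked purely on predicted engagement surfaces outrageous content whenever outrage and engagement are correlated, as the article's sources describe:

# Toy illustration (hypothetical data, not Facebook's code): rank a feed
# purely by predicted engagement when outrage and engagement are correlated.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    outrage: float        # made-up 0-1 "outrageousness" score
    base_interest: float  # made-up 0-1 topical interest score

def predicted_engagement(post: Post) -> float:
    # Stand-in for a learned model: engagement rises with topical interest,
    # but outrageous content gets a disproportionate boost ("people just
    # like outrageous stuff"). The 0.4/0.6 weights are assumptions.
    return 0.4 * post.base_interest + 0.6 * post.outrage

feed = [
    Post("Local library expands hours", outrage=0.05, base_interest=0.7),
    Post("City council passes budget", outrage=0.10, base_interest=0.8),
    Post("THEY are coming for your freedom", outrage=0.95, base_interest=0.3),
    Post("Shocking conspiracy EXPOSED", outrage=0.90, base_interest=0.2),
]

# With no penalty for misinformation or harm, the two outrage posts rank
# first (0.69 and 0.62) despite being the least topically relevant.
for post in sorted(feed, key=predicted_engagement, reverse=True):
    print(f"{predicted_engagement(post):.2f}  {post.title}")

The point of the sketch is that nothing in the objective has to mention outrage at all; as long as the learned engagement signal correlates with it, a pure engagement ranker amplifies it automatically.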
Mosby (17,218 posts)

Good find.
Stevano (24 posts)

Because right-wing nuts use Facebook more, and Facebook gives them the fake news they crave.
Johnny2X2X (21,357 posts)

Sure, the right falls for more fake news, but Dems are not immune either. Another important aspect is that even smart people fall for it. This isn't just idiots being duped; it works on the most educated and intelligent too.