
Sun Mar 14, 2021, 09:07 PM

How Facebook got addicted to spreading misinformation

This article provides a serious discussion of why Facebook continues to be a major conduit for spreading misinformation. It also explains why Republicans are trying to redirect the conversation from combating misinformation to combating "bias." Of course, a focus on bias tends to treat facts and "alternative facts" as having equal weight, promoting a false equivalency under which misinformation can thrive.

https://www.technologyreview.com/2021/03/11/1020600/facebook-responsible-ai-misinformation/

By the time thousands of rioters stormed the US Capitol in January, organized in part on Facebook and fueled by the lies about a stolen election that had fanned out across the platform, it was clear from my conversations that the Responsible AI team had failed to make headway against misinformation and hate speech because it had never made those problems its main focus. More important, I realized, if it tried to, it would be set up for failure.

The reason is simple. Everything the company does and chooses not to do flows from a single motivation: Zuckerberg’s relentless desire for growth. Quiñonero’s AI expertise supercharged that growth. His team got pigeonholed into targeting AI bias, as I learned in my reporting, because preventing such bias helps the company avoid proposed regulation that might, if passed, hamper that growth. Facebook leadership has also repeatedly weakened or halted many initiatives meant to clean up misinformation on the platform because doing so would undermine that growth.

In other words, the Responsible AI team’s work—whatever its merits on the specific problem of tackling AI bias—is essentially irrelevant to fixing the bigger problems of misinformation, extremism, and political polarization. And it’s all of us who pay the price.

“When you’re in the business of maximizing engagement, you’re not interested in truth. You’re not interested in harm, divisiveness, conspiracy. In fact, those are your friends,” says Hany Farid, a professor at the University of California, Berkeley, who collaborates with Facebook to understand image- and video-based misinformation on the platform.

4 replies, 912 views

Replies to this discussion thread
4 replies
Original post: How Facebook got addicted to spreading misinformation, TomCADem, Mar 2021
Reply #1: teach1st, Mar 2021
Reply #2: Mosby, Mar 2021
Reply #3: Stevano, Mar 2021
Reply #4: Johnny2X2X, Mar 2021

Response to TomCADem (Original post)

Sun Mar 14, 2021, 09:52 PM

1. This is an important article

Thanks for posting this, TomCADem!

If a model reduces engagement too much, it’s discarded. Otherwise, it’s deployed and continually monitored. On Twitter, Gade explained that his engineers would get notifications every few days when metrics such as likes or comments were down. Then they’d decipher what had caused the problem and whether any models needed retraining.

But this approach soon caused issues. The models that maximize engagement also favor controversy, misinformation, and extremism: put simply, people just like outrageous stuff. Sometimes this inflames existing political tensions. The most devastating example to date is the case of Myanmar, where viral fake news and hate speech about the Rohingya Muslim minority escalated the country’s religious conflict into a full-blown genocide. Facebook admitted in 2018, after years of downplaying its role, that it had not done enough “to help prevent our platform from being used to foment division and incite offline violence.”

While Facebook may have been oblivious to these consequences in the beginning, it was studying them by 2016. In an internal presentation from that year, reviewed by the Wall Street Journal, a company researcher, Monica Lee, found that Facebook was not only hosting a large number of extremist groups but also promoting them to its users: “64% of all extremist group joins are due to our recommendation tools,” the presentation said, predominantly thanks to the models behind the “Groups You Should Join” and “Discover” features.
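To make the deploy-and-monitor loop described above a bit more concrete, here is a minimal sketch in Python. It is purely illustrative, not Facebook's or Twitter's actual pipeline; the threshold, the metric, and the helper functions (measure_engagement, notify_engineers, retrain) are all hypothetical stand-ins for whatever the real systems use.

import random

# Hypothetical numbers for illustration only.
ENGAGEMENT_FLOOR = 0.95      # a candidate must keep at least 95% of baseline engagement
BASELINE_ENGAGEMENT = 1.0    # normalized engagement of the current production model

def measure_engagement(model_name):
    # Stand-in for real metrics such as likes, comments, and shares.
    return random.uniform(0.90, 1.10)

def evaluate_candidate(model_name):
    """Discard a model that reduces engagement too much; otherwise deploy it."""
    engagement = measure_engagement(model_name)
    if engagement < ENGAGEMENT_FLOOR * BASELINE_ENGAGEMENT:
        print(f"{model_name}: engagement {engagement:.2f} below floor, discarded")
        return False
    print(f"{model_name}: engagement {engagement:.2f}, deployed")
    return True

def monitoring_check(model_name):
    """Run every few days: flag the model when likes or comments dip below baseline."""
    engagement = measure_engagement(model_name)
    if engagement < BASELINE_ENGAGEMENT:
        notify_engineers(model_name, engagement)
        retrain(model_name)

def notify_engineers(model_name, engagement):
    print(f"Alert: {model_name} engagement down to {engagement:.2f}; investigating cause")

def retrain(model_name):
    print(f"Retraining {model_name} if the drop traces back to the model")

if __name__ == "__main__":
    if evaluate_candidate("ranking_model_v2"):
        monitoring_check("ranking_model_v2")

Note that the only gate in this sketch is engagement: nothing in the loop asks whether the content being promoted is true or harmful, which is exactly the problem the article identifies.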

Response to TomCADem (Original post)

Sun Mar 14, 2021, 11:05 PM

2. Worth reading the whole thing.

Good find.

Response to TomCADem (Original post)

Mon Mar 15, 2021, 09:58 AM

3. Fake news is profitable to Facebook

 

Because right-wing nuts use Facebook more when it gives them the fake news they crave.

Response to Stevano (Reply #3)

Mon Mar 15, 2021, 10:04 AM

4. Works on lefties too

Sure, the right falls for more fake news, but Dems are not immune either. Another important aspect is that even smart people fall for it. This isn't just idiots being duped; it works on the most educated and intelligent, too.
