MIT Technology Review offers a look inside Facebook.
Facebook uses algorithms and machine-learning models to maximize user engagement, that is, to keep users hooked. People appear to engage most with divisive content and rumors, so Facebook ends up promoting exactly that kind of socially damaging material.
In 2017, Chris Cox, Facebook’s longtime chief product officer, formed a new task force to understand whether maximizing user engagement on Facebook was contributing to political polarization. It found that there was indeed a correlation, and that reducing polarization would mean taking a hit on engagement. In a mid-2018 document reviewed by the Journal, the task force proposed several potential fixes, such as tweaking the recommendation algorithms to suggest a more diverse range of groups for people to join. But it acknowledged that some of the ideas were “antigrowth.” Most of the proposals didn’t move forward, and the task force disbanded.

[A] former employee…no longer lets his daughter use Facebook.