Facebook is making changes to give users more control over what posts they see in their news feeds, as the social media company defends itself against accusations that it fuels extremism and political polarization.
The changes, announced Wednesday, include making it easier for people to switch their feeds to a “Latest” mode, where the most recent posts appear first, and letting users select up to 30 friends or pages to prioritize. Users can also now restrict who can comment on their posts.
The goal is to “give people real transparency in how the systems work and let people pull the levers,” Nick Clegg, Facebook’s vice president of global affairs and communications, told NPR’s Morning Edition. “You can actually override the algorithm and curate your own news feed.”
Facebook has faced escalating scrutiny over the platform’s impact on society since the Jan. 6 attack on the U.S. Capitol by a pro-Trump mob, which was planned and documented on social media, including Facebook.
Many critics have focused on the role of Facebook’s algorithms, which determine which posts users see and which groups and accounts they are recommended to join or follow, and on how those algorithms can push people toward more inflammatory content.
In the interview with NPR and in a roughly 5,000-word post published Wednesday on Medium, Clegg pushed back on that criticism.
“Central to many of the charges by Facebook’s critics is the idea that its algorithmic systems actively encourage the sharing of sensational content and are designed to keep people scrolling endlessly,” Clegg wrote in the Medium post.
He acknowledged that “content that evokes strong emotions will always be shared,” but said that is a matter of “human nature,” not Facebook’s algorithms.
“Facebook systems are not designed to reward provocative content. In fact, key parts of these systems are designed to do the opposite,” he wrote.
Clegg disputed allegations that social media contributes to political polarization, saying academic research on the question has been “mixed.” He also defended the benefits of social media, from personalized advertising to “a dramatic and historic democratization of speech.”
Clegg told NPR that his intention was not to blame Facebook users for divisions on the platform, but to highlight the “complex” interactions between humans and technology.
“It’s simplistic to say it’s all the users’ fault, but it’s equally wrong to say it’s somehow the fault of a faceless machine,” he said.
“People want simple answers to what are complex questions. I still urge us to try to deal with the complexity of this, and not … reduce it to a faceless machine to blame for things that sometimes run deep within society itself.”
Facebook CEO Mark Zuckerberg gave a similar defense of the platform last week at a congressional hearing on the spread of extremism and misinformation on social media.
When lawmakers pressed him on whether Facebook bore responsibility for the January 6 attack, Zuckerberg blamed the rioters and former President Donald Trump.
“I think the responsibility lies with the people who took the action to break the law and riot,” he said. “And secondarily with the people who spread that content.”
Facebook’s intensified efforts to defend the platform and to promise users greater transparency and control come as the prospect of new internet regulations grows.
Several bills are circulating that would seek to hold companies, including Facebook, more accountable for the content users post and for the real-world consequences of online activity.
Facebook itself is calling for regulatory reform in a wide-ranging advertising campaign, and Zuckerberg laid out his vision for updated regulations in his testimony last week.
“Everyone agrees that new rules of the road need to be written,” Clegg told NPR.
Editor’s note: Facebook is among NPR’s financial supporters.