U.S. President Joe Biden believes social media platforms have a responsibility to “stop amplifying untrustworthy content,” the White House said, even as it declined to comment directly on a decision by Facebook Inc’s oversight board to uphold the suspension of former President Donald Trump.
According to a Reuters report, Facebook Inc’s oversight board said on Wednesday that the company did not answer questions about whether its algorithms amplified inflammatory posts by then-U.S. President Donald Trump and contributed to the deadly siege on the Capitol in January.
“The president’s view is that the major platforms have a responsibility related to the health and safety of all Americans to stop amplifying untrustworthy content, disinformation and misinformation, especially related to COVID-19, vaccinations and elections,” White House spokeswoman Jen Psaki told reporters on Wednesday.
Facebook’s independent oversight board had recommended the company review how it might have contributed to the violence and the false narrative of election fraud.
Many Democrats and other critics have said Trump’s posts helped fuel the attack, which led to five deaths. Facebook indefinitely banned Trump from posting in the wake of the violence and asked for further guidance from the oversight board, a 20-person panel funded by the company to review content moderation decisions.
The board found Facebook’s indefinite suspension of Trump’s account was arbitrary because it did not follow a clear published procedure. It called on the company to develop new rules within six months that would lead to either Trump’s reinstatement or some other penalty.
The board’s decision to kick the question of what to do with Trump back to Facebook drew condemnation from both sides of the political spectrum.
The company’s role in promoting posts by Trump is important to understand, the board said, because measures short of banning his account could have been enough to limit the risk of violence.
Facebook for years has been criticized for designing its News Feed algorithms in ways that promote divisive and inflammatory content.
But the question of whether any internal analysis had been conducted since January was among seven of the board’s 46 questions that Facebook declined to answer.
“This makes it difficult for the Board to assess whether less severe measures, taken earlier, may have been sufficient to protect the rights of others,” the decision stated.