Why it matters

Facebook is set to deploy more restrictive algorithmic tools to limit the spread of viral content on its platform. This is according to a new report by the Wall Street Journal that highlights measures already tested during recent elections, including in Sri Lanka and Myanmar, to mitigate disinformation campaigns.

The move is in preparation for the November U.S. presidential election, which is set to pose some unique challenges. The company believes that there’s tremendous potential for upheaval spurred on by the spread of disinformation during election season.

In a recent blog post, Facebook CEO Mark Zuckerberg said the country is deeply divided along political lines. Zuckerberg raised concerns about the recent surge in politically motivated acts of violence as well as misinformed statements by public figures, alluding to these as indicators that democracy could be undermined if information is not regulated.

Andy Stone, the company’s Policy Communications Director, issued the following statement on the matter: “While I will intentionally not link to the New York Post, I want to be clear that this story is eligible to be fact checked by Facebook’s third-party fact checking partners. In the meantime, we are reducing its distribution on our platform.” Facebook currently relies on a network of third-party fact-checking organizations to authenticate such information, including the Associated Press, the Daily Caller, and Agence France-Presse.