YouTube is pulling back the election misinformation policy it introduced in late 2020, shortly after the results of the US Presidential election. The company says the change will enable open political discussion and debate on the platform during the ongoing election season. The updated policy is already in effect.

YouTube says the misinformation policy didn’t make a meaningful impact

Introduced in December 2020, YouTube’s election misinformation policy allowed the company to remove videos that promoted lies or false information about the 2020 US Presidential election. The policy followed controversies around the results: there were unproven allegations of voter fraud, and YouTube was among the most popular platforms used to spread such claims. The company said it would not allow videos that misled users about the outcome of the election.

Over the past two years, YouTube has removed “tens of thousands” of such videos from the platform. While this did curb the spread of misinformation, the company says the policy had an unintended effect: it reduced political speech “without meaningfully reducing the risk of violence or other real-world harm”. As such, YouTube is walking the policy back, allowing creators to share political content that may be factually incorrect and potentially misleading to viewers.

“With 2024 campaigns well underway, we will stop removing content that advances false claims that widespread fraud, errors, or glitches occurred in the 2020 and other past US Presidential elections,” YouTube said in a blog post. “The ability to openly debate political ideas, even those that are controversial or based on disproven assumptions, is core to a functioning democratic society – especially in the midst of election season.”


YouTube isn’t changing most other election-related policies

While the election misinformation policy may be going away, YouTube will keep all its other election-related policies intact. The company states that it will not tolerate content “aiming to mislead voters about the time, place, means, or eligibility requirements for voting”. YouTube will also remove videos that make false claims intended to discourage voting, spread misinformation about the validity of voting by mail, or encourage others to “interfere with democratic processes”. This is in addition to other policies against hate speech, harassment, and incitement to violence.

Additionally, YouTube will ensure that viewers see election-related news and information from “authoritative sources” prominently in search and recommendations. The company says such sources are usually more popular among viewers. “Our 2020 election information panels, with relevant context from voting locations, to live election results, were collectively shown over 4.5 billion times,” the blog post added. “We’ll have more details to share about our approach towards the 2024 election in the months to come.”
