YouTube is taking steps to reduce the spread of content that promotes conspiracy theories and other fringe content.
In a blog post Friday, the company said it will work to reduce recommendations of such videos, though it will not remove them from the platform.
“[YouTube will be] taking a closer look at how we can reduce the spread of content that comes close to — but doesn’t quite cross the line — of violating our Community Guidelines,” the blog post says. “To that end, we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways — such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”
YouTube says the shift will affect only around 1% of all videos on the platform, though with more than 300 hours of video uploaded every minute, that may still represent a significant amount of content.
The company will use a combination of machine learning and human reviewers to make the change, which will roll out gradually, beginning in the U.S.