YouTube is improving its recommendations to tackle clickbait and borderline content, using better analysis of users' viewing habits and reducing the spread of videos that violate its Community Guidelines.
The company discussed these efforts in a blog post.
When recommendations are at their best, they help users find a new song to fall in love with, discover their next favorite creator, or learn that great paella recipe. That's why YouTube updates their recommendations system all the time: they want to make sure they're suggesting videos that people actually want to watch.
You might remember that a few years ago, viewers were getting frustrated with clickbaity videos with misleading titles and descriptions ("You won't believe what happens next!"). YouTube responded by updating their system to focus on viewer satisfaction instead of views, measuring likes, dislikes, surveys, and time well spent, and recommending clickbait videos less often. More recently, people said they were getting too many similar recommendations, like seeing endless cookie videos after watching just one recipe for snickerdoodles. YouTube now pulls in recommendations from a wider set of topics: on any given day, more than 200 million videos are recommended on the homepage alone. In fact, in the last year alone, they've made hundreds of changes to improve the quality of recommendations for users on YouTube.
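To picture what "satisfaction instead of views" could look like in practice, here is a minimal, purely illustrative sketch: combine signals like survey responses, like/dislike balance, and watch completion into one score and rank candidates by it. The signal names, weights, and formula are assumptions for illustration, not YouTube's actual system.

```python
# Hypothetical satisfaction-weighted ranking; all weights and field
# names are illustrative assumptions, not YouTube's real formula.
def satisfaction_score(video, weights=None):
    """Combine viewer-satisfaction signals into a single ranking score."""
    weights = weights or {"survey": 0.5, "likes": 0.3, "watch_time": 0.2}
    likes_ratio = video["likes"] / max(video["likes"] + video["dislikes"], 1)
    return (weights["survey"] * video["survey_score"]        # 0..1 from user surveys
            + weights["likes"] * likes_ratio                 # like/dislike balance
            + weights["watch_time"] * video["completion"])   # fraction of video watched

candidates = [
    {"id": "a", "survey_score": 0.9, "likes": 80, "dislikes": 5, "completion": 0.7},
    {"id": "b", "survey_score": 0.4, "likes": 900, "dislikes": 700, "completion": 0.3},
]
ranked = sorted(candidates, key=satisfaction_score, reverse=True)
print([v["id"] for v in ranked])  # prints ['a', 'b']: high satisfaction beats raw engagement
```

Note how video "b" has far more raw likes, yet "a" ranks first because satisfaction signals, not volume, drive the score.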
They’ll continue that work this year, including taking a closer look at how they can reduce the spread of content that comes close to—but doesn’t quite cross the line of—violating Community Guidelines. To that end, they’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.
While this shift will apply to less than one percent of the content on YouTube, they believe that limiting the recommendation of these types of videos will mean a better experience for the YouTube community. To be clear, this will only affect recommendations of what videos to watch, not whether a video is available on YouTube. As always, people can still access all videos that comply with Community Guidelines and, when relevant, these videos may appear in recommendations for channel subscribers and in search results. They think this change strikes a balance between maintaining a platform for free speech and living up to their responsibility to users.
This change relies on a combination of machine learning and real people. They work with human evaluators and experts from all over the United States to help train the machine learning systems that generate recommendations. These evaluators are trained using public guidelines and provide critical input on the quality of a video.
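The human-in-the-loop pipeline described above can be sketched in toy form: evaluator-labeled examples train a model, and the model then down-ranks (rather than removes) borderline candidates. Everything here, the keyword features, the labels, and the classifier itself, is a simplified assumption for illustration, not YouTube's actual machinery.

```python
# Toy sketch of a human-in-the-loop pipeline: evaluator labels train a
# classifier, which down-ranks (not removes) borderline candidates.
# Features, labels, and model are illustrative assumptions.
from collections import Counter

def train_keyword_model(labeled_examples):
    """Learn per-keyword borderline rates from evaluator-labeled titles."""
    counts, borderline = Counter(), Counter()
    for title, is_borderline in labeled_examples:
        for word in title.lower().split():
            counts[word] += 1
            if is_borderline:
                borderline[word] += 1
    return {w: borderline[w] / counts[w] for w in counts}

def borderline_score(model, title):
    """Average learned borderline rate over the title's words."""
    words = title.lower().split()
    return sum(model.get(w, 0.0) for w in words) / max(len(words), 1)

# Hypothetical evaluator-labeled training data.
labels = [
    ("miracle cure revealed", True),
    ("flat earth proof", True),
    ("paella recipe tutorial", False),
    ("guitar lesson basics", False),
]
model = train_keyword_model(labels)

# Down-rank, don't delete: borderline candidates sort to the bottom.
candidates = ["easy paella recipe", "miracle cure update"]
ranked = sorted(candidates, key=lambda t: borderline_score(model, t))
print(ranked)  # prints ['easy paella recipe', 'miracle cure update']
```

The key design point matches the post: borderline content is still present in the candidate list, it is simply recommended less prominently.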
This will be a gradual change that initially affects recommendations for only a very small set of videos in the United States. Over time, as their systems become more accurate, they'll roll the change out to more countries. It's just another step in an ongoing process, but it reflects their commitment and sense of responsibility to improve the recommendations experience on YouTube.