YouTube’s ‘Dislike’ button barely works, according to new study on recommendations | Engadget

If you’ve ever felt it’s hard to “untrain” YouTube’s algorithm from suggesting a certain type of video once it has slipped into your recommendations, you’re not alone. In fact, it can be harder than you think to get YouTube to accurately understand your preferences. One major issue, according to new research from Mozilla, is that YouTube’s in-app controls, like the “Dislike” button, are largely ineffective as tools for controlling suggested content. According to the report, these buttons “prevent less than half of the algorithm’s unwanted recommendations.”

Mozilla researchers used data collected from RegretsReporter, a browser extension that allows people to “donate” their recommendations data for use in studies like this one. Overall, the report was based on millions of recommended videos, as well as anecdotal reports from thousands of people.
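
As a rough illustration of how a data-donation extension like RegretsReporter can observe what YouTube recommends, here is a minimal content-script sketch in TypeScript. The selectors, message shape, and function names are assumptions for illustration only, not Mozilla’s actual implementation:

```typescript
// Hypothetical sketch of a RegretsReporter-style content script that
// collects sidebar recommendations on a YouTube watch page. All selectors
// and the message format are assumptions, not Mozilla's real code.

interface DonatedRecommendation {
  videoTitle: string;
  channelName: string;
  collectedAt: string;
}

function collectSidebarRecommendations(): DonatedRecommendation[] {
  // Assumes YouTube's current markup for sidebar suggestions; any
  // change to the page structure would break these selectors.
  const items = document.querySelectorAll("ytd-compact-video-renderer");
  return Array.from(items).map((item) => ({
    videoTitle: item.querySelector("#video-title")?.textContent?.trim() ?? "",
    channelName: item.querySelector("#channel-name")?.textContent?.trim() ?? "",
    collectedAt: new Date().toISOString(),
  }));
}

// In a WebExtension, the content script forwards the batch to a background
// script, which would upload it only with the user's explicit consent.
// (The `browser` global comes from the WebExtensions API in Firefox.)
browser.runtime.sendMessage({
  type: "donate-recommendations",
  payload: collectSidebarRecommendations(),
});
```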

Mozilla tested the effectiveness of four different controls: the “Dislike” button, “Not interested,” “Don’t recommend channel,” and “Remove from watch history.” The researchers found that these had varying degrees of effectiveness, but that the overall effect was “small and insufficient.”

Of the four controls, the most effective was “Don’t recommend channel,” which prevented 43 percent of unwanted recommendations, while “Not interested” was the least effective, preventing only about 11 percent of unwanted suggestions. The “Dislike” button performed about the same at 12 percent, and “Remove from watch history” prevented about 29 percent.
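
For readers wondering what “prevented 43 percent” means in practice: the study measured effectiveness relative to what comparable users saw without pressing the control. A minimal sketch of that arithmetic, with invented counts (the numbers below are made up, only the 43 percent figure comes from the report):

```typescript
// Purely illustrative arithmetic; the counts are invented. A prevention
// rate compares unwanted recommendations seen by users who pressed a
// control against a baseline group that did not.

function preventionRate(baselineBad: number, treatedBad: number): number {
  // Fraction of unwanted recommendations the control prevented.
  return 1 - treatedBad / baselineBad;
}

// If a baseline group averaged 100 unwanted suggestions and the
// "Don't recommend channel" group averaged 57, the rate is 0.43.
console.log(preventionRate(100, 57)); // 0.43
```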

In their report, the Mozilla researchers note the significant lengths study participants said they would sometimes go to in order to block unwanted recommendations, such as watching videos while logged out or while connected to a VPN. The researchers say the study highlights the need for YouTube to better explain its controls to users, and to give people more proactive ways to define what they want to see.

“The way YouTube and many platforms work is that they rely on a lot of passive data collection in order to infer what your preferences are,” says Becca Ricks, a senior researcher at Mozilla who co-wrote the report. “But it’s kind of a paternalistic way of operating, where you’re making choices on behalf of people. You could be asking people what they want to do on the platform versus just watching what they do.”

Mozilla’s research comes amid growing calls for major platforms to make their algorithms more transparent. In the United States, lawmakers have proposed bills to curtail “opaque” recommendation algorithms and hold companies accountable for algorithmic bias. The European Union is even further along: the recently passed Digital Services Act will require platforms to explain how their recommendation algorithms work and open them up to outside researchers.
