Report: YouTube's algorithm doesn't care if you 'dislike' videos

Screenshot of YouTube with the mouse hovering over the dislike button.

YouTube has already stopped showing how many dislikes a video has received, but disliking a video doesn't seem to change how many similar videos the platform recommends.
Photo: Xiu (Shutterstock)

My YouTube recommendations are full of old reruns of Gordon Ramsay's Kitchen Nightmares. That might be my own fault for getting drunk one night and watching an entire episode. But let me tell you, if there's one thing I don't want in my feed anymore, it's the famous Brit tearing into yet another chef while the most obnoxious sound effects in the world blare in the background. I've disliked plenty of these videos, yet Hell's Kitchen keeps popping up on my page, and I feel more and more like one of the 'raw' steaks Ramsay pokes and prods at.

But evidently I'm not alone in my YouTube recommendation woes. A report released Monday by the Mozilla Foundation claims, based on survey and crowdsourced data, that YouTube's feedback tools, such as the "Dislike" button and the "Don't recommend channel" option, do little to actually change video recommendations.

There are really two points here. One is that users consistently feel the controls provided by Google-owned YouTube don't actually make a difference. Second, based on the user data, the controls have little measurable impact on recommendations, meaning most unwanted videos still slip through.

The organization relied on data from its own RegretsReporter browser extension, which lets users block specific YouTube videos from appearing in their feed. The report says its analysis is based on 2,757 survey respondents and 22,722 users who allowed Mozilla access to more than 567 million video recommendations collected from late 2021 through June 2022.

Although the researchers acknowledge that the survey participants are not a representative sample of YouTube's wide and diverse audience, a third of those surveyed said that using YouTube's controls did not seem to change their video recommendations at all. One user told Mozilla they would report videos as misleading or spam only to see them return to their feed later. Respondents often noted that blocking one channel simply led to recommendations from similar channels.

YouTube's algorithm regularly recommends videos users don't want to watch, and often far worse fare than old Ramsay cable reruns. A 2021 Mozilla report claimed, based on crowdsourced user data, that people browsing the platform are routinely recommended violent content, hate speech, and political misinformation.

In this latest report, Mozilla's researchers found that rejecting a video, such as a Tucker Carlson screed, often just resulted in another Fox News YouTube video being recommended. Based on a review of 40,000 pairs of videos, when one channel was blocked the algorithm frequently recommended very similar videos from similar channels. Using the "Dislike" or "Not interested" buttons prevented only 12% and 11% of unwanted recommendations, respectively, compared with a control group. The "Don't recommend channel" and "Remove from watch history" buttons were the most effective at correcting users' feeds, but still prevented only 43% and 29% of bad recommendations, respectively.
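For intuition, here's a minimal sketch of how a "percent of unwanted recommendations prevented, compared to a control group" figure can be computed as a relative rate reduction. This is not Mozilla's published methodology; the function name and the rates below are invented for illustration:

```python
# Hypothetical illustration: "percent of bad recommendations prevented,"
# read as the relative reduction versus a control group that used no
# feedback controls. The rates here are made up for the example.

def percent_prevented(control_rate: float, treatment_rate: float) -> float:
    """Relative reduction in the unwanted-recommendation rate vs. control."""
    return (control_rate - treatment_rate) / control_rate * 100

# Suppose 20% of control-group recommendations were unwanted, versus
# 17.6% for users who hit "Dislike" on similar videos:
print(f"{percent_prevented(0.20, 0.176):.0f}% prevented")  # -> 12% prevented
```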

“In our data analysis, we determined that YouTube user control mechanisms are insufficient as tools to prevent unwanted recommendations,” the Mozilla researchers wrote in their study.

"Our controls do not filter out entire topics or viewpoints, as this could have negative effects on viewers, such as creating echo chambers," YouTube spokesperson Elena Hernandez told Gizmodo in an emailed statement. The company says it doesn't try to block all content on a related topic from being recommended, but it also claims to promote "trusted" content while suppressing "borderline" videos that come close to violating its content moderation policies.

In a 2021 blog post, Cristos Goodrow, YouTube's Vice President of Engineering, wrote that the system is "constantly evolving," but that providing transparency into the algorithm "isn't as simple as listing a formula for recommendations," since its systems take into account clicks, watch time, survey responses, engagement, likes, and dislikes.
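To make that concrete, here is a deliberately toy sketch of why a multi-signal recommender can't be summarized by a single formula. Everything below is hypothetical: YouTube has never published its model, and its real recommender is a learned system, not a hand-weighted score like this one.

```python
# Toy illustration only: a hand-weighted linear score over the kinds of
# signals Goodrow lists. The signal names and weights are invented.
from dataclasses import dataclass

@dataclass
class VideoSignals:
    clicks: float        # normalized click-through rate
    watch_time: float    # normalized minutes watched
    survey_score: float  # user satisfaction survey responses
    engagement: float    # shares/comments, normalized
    likes: float
    dislikes: float

def toy_score(s: VideoSignals) -> float:
    # Invented weights; a dislike pushes the score down, but notice how
    # easily it can be outweighed by strong watch-time signals.
    return (0.5 * s.clicks + 2.0 * s.watch_time + 1.0 * s.survey_score
            + 0.8 * s.engagement + 0.5 * s.likes - 0.7 * s.dislikes)

# Even with a maximal dislike, high watch time keeps the score positive,
# which is one (toy) way a disliked topic could keep getting recommended.
print(toy_score(VideoSignals(0.6, 0.9, 0.5, 0.4, 0.3, 1.0)))
```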

Of course, like every other social media platform, YouTube has struggled to build systems that can battle the full range of bad and even predatory content uploaded to the site. One upcoming book, shared exclusively with Gizmodo, said YouTube once came close to putting billions of dollars in advertising revenue at risk in order to deal with the strange and disturbing videos being recommended to children.

While Hernandez claimed that the company has expanded access to its API data, the spokesperson added: "The Mozilla report doesn't take into account how our systems actually work, so it's difficult for us to gather many insights."

But that's a criticism Mozilla also lays at Google's feet, saying the company doesn't provide enough access for researchers to assess what goes into YouTube's secret sauce, a.k.a. its algorithms.
