
YouTube’s change to the ‘dislike’ button doesn’t work


When YouTube changed its implementation of the dislike button last year, most people who consume content on the platform weren't happy. The dislike button itself remained, but viewers could no longer see how many dislikes a video had, or what its ratio of likes to dislikes was. Many users relied on that information to decide what to watch, and it was suddenly gone. Now a new study by Mozilla has found that even when users tell YouTube they aren't interested in certain types of videos, similar recommendations keep coming.


Using video recommendation data from more than 20,000 YouTube users, Mozilla researchers found that controls like "not interested," "dislike," "stop recommending channel," and "remove from watch history" are largely ineffective at preventing similar content from being recommended. Some of these controls work better than others (although it isn't clear what drives the difference in performance), but even in the best case more than half of subsequent recommendations were still similar to the videos a user had said they weren't interested in.

The Mozilla researchers recruited volunteers through RegretsReporter, a Mozilla browser extension that overlays a general "stop recommending" button on YouTube videos viewed by participants. Users were randomly assigned to different groups, and each group's clicks on the Mozilla-placed button sent a different signal to YouTube: dislike, not interested, don't recommend channel, or remove from history, while a control group sent no feedback to the platform at all.
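Purely as an illustration of that setup, a minimal sketch of the arm assignment and click handling might look like the following. This is not RegretsReporter's actual code; the arm names, function names, and uniform random assignment are assumptions made here for clarity.

import random

EXPERIMENT_ARMS = [
    "dislike",
    "not_interested",
    "dont_recommend_channel",
    "remove_from_history",
    "control",  # control group: no feedback is sent to YouTube
]

def assign_arm(user_id):
    """Randomly assign a participant to one experiment arm (uniform, for illustration)."""
    rng = random.Random(user_id)  # seeded per user so the assignment stays stable
    return rng.choice(EXPERIMENT_ARMS)

def on_stop_recommending_click(user_id, video_id):
    """Return the feedback signal this user's click would send, or None for the control group."""
    arm = assign_arm(user_id)
    if arm == "control":
        return None  # the click is recorded, but nothing is sent to the platform
    return arm  # e.g. "dislike" or "not_interested"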

Data from more than 500 million video recommendations was gathered, from which the researchers created over 44,000 pairs of videos: one "rejected" video, plus a video subsequently recommended by YouTube. The researchers then assessed the pairs manually or used machine learning to decide whether the recommendation was too similar to the video the user had rejected.
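To make that pairing concrete, here is a minimal illustrative sketch, not Mozilla's actual pipeline: each rejected video is paired with later recommendations, and a pair is flagged when the two look too similar. The similarity_score() helper is a hypothetical stand-in for the manual review and machine-learning classification the researchers describe, and the 0.7 threshold is an arbitrary choice for the example.

from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    title: str
    channel: str

def similarity_score(rejected, recommended):
    """Hypothetical stand-in for the study's human review / ML classifier:
    a crude same-channel or shared-title-words heuristic, for illustration only."""
    if rejected.channel == recommended.channel:
        return 1.0
    shared = set(rejected.title.lower().split()) & set(recommended.title.lower().split())
    return min(1.0, len(shared) / 5)

def build_pairs(rejected_videos, later_recommendations):
    """Pair each rejected video with every recommendation shown afterwards."""
    return [(rej, rec) for rej in rejected_videos for rec in later_recommendations]

def bad_recommendation_rate(pairs, threshold=0.7):
    """Share of pairs where the recommendation is judged too similar to a rejected video."""
    if not pairs:
        return 0.0
    too_similar = sum(1 for rej, rec in pairs if similarity_score(rej, rec) >= threshold)
    return too_similar / len(pairs)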

“YouTube should respect the feedback users share about their experience, treating them as meaningful signals about how people want to spend their time on the platform,” the researchers write.

“Importantly, our controls do not filter out entire topics or viewpoints, as this could have negative effects for viewers, like creating echo chambers,” YouTube spokesperson Elena Hernandez said in response to the findings. “We welcome academic research on our platform, which is why we recently expanded Data API access through our YouTube Researcher Program. Mozilla’s report doesn’t take into account how our systems actually work, and therefore it’s difficult for us to glean many insights.”


