Mozilla has published 28 stories from anonymous YouTube users who were left distraught by content that was recommended to them or someone they knew.
The publication is an attempt to get Google to be more transparent about the YouTube algorithm it uses to identify which content a user will likely want to watch next.
Guillaume Chaslot, a Mozilla fellow and former YouTube engineer, has investigated YouTube’s recommendation system.
He said that it is unclear whether the issues that arise stem from the algorithm itself or from users’ actions.
“We can see that there are problems, but we have no idea if the problem is from people being people or from algorithms,” Chaslot stated.
Graphic content and conspiracy videos
One YouTube user told a story about watching a video of a rescued street dog and afterward receiving recommendations for videos of abused dogs, with gratuitous thumbnails covering the homepage.
Another story tells of a parent who let their child watch “Thomas the Tank Engine” episodes, which eventually led to YouTube recommending videos of graphic train wrecks.
Several of the anecdotes describe seemingly mundane searches whose recommendations quickly descended into conspiracy-theory videos.
Mozilla’s vice president of advocacy, Ashley Boyd, said the company is trying to show YouTube that the current system worries users.
“What we’re trying to do is provide a window into the consumer concern about this,” Boyd stated.
She also said that the fact that people don’t shun YouTube over these suggestions does not necessarily mean there are no problems.
“There’s a question about whether people care about this problem because they haven’t left the platform. We don’t think that’s a good measure of whether people care,” said Boyd.
According to CNET, YouTube attracts 2 billion users every month, with 70% of time spent viewing on the platform attributed to recommendations.