The YouTube Rabbit Hole Is Nuanced

Maybe you have an image in your mind of people who have been brainwashed by YouTube.

Picture your cousin who loves to watch videos of cuddly animals. Then, out of the blue, YouTube’s algorithm plops a terrorist recruitment video at the top of the app and keeps suggesting ever more extreme videos until he is persuaded to take up arms.

A new analysis adds nuance to our understanding of YouTube’s role in spreading beliefs that are far outside the mainstream.

A group of academics found that YouTube rarely suggests videos featuring conspiracy theories, extreme bigotry or quack science to people who have shown little interest in such material. And those people are unlikely to follow such computerized recommendations when they are offered. The kittens-to-terrorist pipeline is extremely uncommon.

That is not to say YouTube is not a force to be reckoned with. The paper also found that research volunteers who already held extreme views, or who followed YouTube channels that frequently feature fringe beliefs, were far more likely to seek out, or be steered toward, more videos along the same lines.

The findings suggest that policymakers, internet executives and the public should focus less on the potential danger of an unwitting person being led into extremist ideology on YouTube, and more on the ways that YouTube may help validate and harden the views of people already inclined toward such beliefs.

“We have understated the way that social media facilitates demand meeting the supply of extreme viewpoints,” said Brendan Nyhan, one of the paper’s co-authors and a professor at Dartmouth College who studies misperceptions about politics and health care. “Even a small number of people with extreme views can cause grave harm in the world.”

People watch more than a billion hours of YouTube videos every day. There are perennial concerns that the Google-owned site may amplify extremist voices, silence legitimate expression, or both, similar to the worries that surround Facebook.

This is just one piece of research, and I mention some of the analysis’s limitations below. But what is intriguing is that the research challenges the binary notion that either YouTube’s algorithm risks turning any of us into monsters or that kooky things on the internet do little harm. Neither may be true.

(You can read the research paper here. A version of it was also previously published by the Anti-Defamation League.)

Digging into the details: about 0.6 percent of research participants accounted for about 80 percent of the total watch time on YouTube channels that were classified as “extremist,” such as those of the far-right figures David Duke and Mike Cernovich. (YouTube banned Duke’s channel in 2020.)

Most of those people found the videos not by accident but by following web links, clicking on videos from YouTube channels they subscribed to, or following YouTube’s recommendations. About one in four videos that YouTube recommended to people watching an extreme YouTube channel was another video like it.

Only 108 times during the research, about 0.02 percent of all video visits the researchers observed, did someone watching a relatively conventional YouTube channel follow a computerized suggestion to an outside-the-mainstream channel to which they were not already subscribed.

The analysis suggests that most of the audience for YouTube videos promoting fringe beliefs are people who want to watch them, and that YouTube then feeds them more of the same. The researchers found that viewership was far higher among volunteers who expressed high levels of racial or ethnic resentment, as measured by their responses to surveys.

“Our results make clear that YouTube continues to provide a platform for alternative and extreme content to be distributed to vulnerable audiences,” the researchers wrote.

Like all research, this analysis has caveats. The study was conducted in 2020, after YouTube had made significant changes to curtail recommending videos that misinform people in harmful ways. That makes it difficult to know whether the patterns the researchers found in YouTube’s recommendations would have looked different in prior years.

Independent experts also have not yet rigorously reviewed the data and analysis, and the research did not examine in detail the relationship between watching YouTubers such as Laura Loomer and Candace Owens, some of whom the researchers described as having “alternative” channels, and the viewership of extreme videos.

More study is needed, but these findings suggest two things. First, YouTube may deserve credit for the changes it made to reduce the ways the site pushed people toward views outside the mainstream that they were not intentionally seeking out.

Second, there needs to be more conversation about how much further YouTube should go to reduce the exposure of potentially extreme or dangerous ideas to people who are inclined to believe them. Even a small minority of YouTube’s audience that regularly watches extreme videos amounts to many millions of people.

Should YouTube make it harder, for example, for people to share links to fringe videos, something it has considered? Should the site make it more difficult for people who subscribe to extremist channels to automatically see those videos or be recommended similar ones? Or is the status quo fine?

This research reminds us to keep wrestling with the complicated ways that social media can both mirror and reinforce the ugliness in our world, and to resist easy explanations. There are none.


Tip of the week

Brian X. Chen, The New York Times’s consumer tech columnist, is here to break down what you need to know about online tracking.

Last week, listeners of the KQED Forum radio program asked me questions about internet privacy. Our conversation highlighted just how concerned many people are about having their digital activity monitored, and how confused they are about what they can do about it.

Here’s a rundown that I hope will help On Tech readers.

There are two broad types of digital tracking. “Third-party” tracking is the kind we often find creepy. If you visit a shoe website and it logs what you looked at, you may then see ads for those shoes elsewhere online. Repeated across many websites and apps, this lets marketers compile a record of your activity to target ads at you.

If this concerns you, you can try a web browser such as Firefox or Brave that automatically blocks this type of tracking. Google says its Chrome web browser will do the same in 2023. Last year, Apple gave iPhone owners the option to say no to this type of online surveillance in apps, and Android phone owners will have a similar option at some point.

If you want to go the extra mile, you can install a tracker blocker, such as uBlock Origin or 1Blocker.

The squeeze on third-party tracking has shifted attention to “first-party” data collection, in which a website or app monitors what you do while you use its product.

If you search for directions to a Chinese restaurant in a mapping app, the app may assume you like Chinese food and allow other Chinese restaurants to advertise to you. Many people find this less creepy, and potentially useful.

If you want to avoid first-party tracking, you have little choice other than not using the website or app at all. You can also use an app or website without logging in to reduce the amount of information collected, although that may limit what you can do there.

  • Barack Obama’s crusade against disinformation: The former president is embarking on a campaign to spread the message about the dangers of online falsehoods. My colleagues Steven Lee Myers and Cecilia Kang reported that he is “wading into a fierce but inconclusive debate over how best to restore trust online.”

  • Elon Musk’s funding is apparently secure: The chief executive of Tesla and SpaceX detailed the loans and other financing commitments for his roughly $46.5 billion offer to buy Twitter. Twitter’s board must decide whether to accept, and Musk has suggested that he may instead want to let Twitter’s shareholders decide for themselves.

  • Three ways to cut your tech costs: Brian Chen has tips on how to identify which online subscriptions you might want to trim, save money on your cellphone bill, and decide whether you really need a new phone (maybe you don’t).

Watch a penguin chick’s first swim.


We want to hear from you. Let us know what you think of this newsletter and what else you would like us to explore. You can reach us at ontech@nytimes.com.

If you don’t already get this newsletter in your inbox, please sign up here. You can also read past On Tech columns.
