Algorithmic radicalization

The algorithmic radicalization (or radicalization pipeline) hypothesis is the idea that recommendation algorithms on popular social media sites such as YouTube and Facebook drive users toward progressively more extreme content over time, leading them to become radicalized toward extremist political views.[1][2][3]

As of 2021, research is ongoing to determine whether this is a real, measurable phenomenon.[4][5]

References

  1. "The Websites Sustaining Britain's Far-Right Influencers". bellingcat. 2021-02-24. Retrieved 2021-03-10.
  2. Camargo, Chico Q. "YouTube's algorithms might radicalise people – but the real problem is we've no idea how they work". The Conversation. Retrieved 2021-03-10.
  3. E&T editorial staff (2020-05-27). "Facebook did not act on own evidence of algorithm-driven extremism". eandt.theiet.org. Retrieved 2021-03-10.
  4. "Study of YouTube comments finds evidence of radicalization effect". TechCrunch. Retrieved 2021-03-10.
  5. Ribeiro, Manoel Horta; Ottoni, Raphael; West, Robert; Almeida, Virgílio A. F.; Meira, Wagner (2019-12-04). "Auditing Radicalization Pathways on YouTube". arXiv:1908.08313 [cs.CY].
