Negative consequences of information gatekeeping through algorithmic technologies: An Annual Review of Information Science and Technology (ARIST) paper

Bibliographic Details
Published in: Journal of the Association for Information Science and Technology, Volume 76, Issue 1, pp. 262-288
Main Authors: Potnis, Devendra; Tahamtan, Iman; McDonald, Luke
Format: Journal Article
Language: English
Published: Hoboken, USA: John Wiley & Sons, Inc., 01.01.2025
ISSN: 2330-1635, 2330-1643
Description
Summary: Few studies investigate how information gatekeeping through the solutions and services enabled by algorithms, hereafter referred to as algorithmic technologies (AT), creates negative consequences for users. To fill this gap, this state-of-the-art review analyzes 229 relevant articles from diverse academic disciplines. We employed thematic analysis to identify, analyze, classify, and reveal the chain reactions among the negative consequences. We found that the gatekeeping of information (text, audio, video, and graphics) through AT such as artificial intelligence (e.g., chatbots, large language models, machine learning, robots), decision support systems (used by banks, grocery stores, police, etc.), hashtags, online gaming platforms, search technologies (e.g., voice assistants, ChatGPT), and Web 3.0 (e.g., Internet of Things, non-fungible tokens) creates or reinforces cognitive vulnerability, economic divide and financial vulnerability, information divide, physical vulnerability, psychological vulnerability, and social divide, both online and in the offline world. Theoretical implications include the hierarchical depiction of the chain reactions among the primary, secondary, and tertiary divides and vulnerabilities. To mitigate these negative consequences, we call for concerted efforts: top-down strategies for governments, organizations, and technology experts to attain greater transparency, accountability, ethical behavior, and moral practices, and bottom-up strategies for users to be more alert, discerning, critical, and proactive.
DOI: 10.1002/asi.24955