AI chatbot accountability in the age of algorithmic gatekeeping: Comparing generative search engine political information retrieval across five languages

Bibliographic Details
Published in: New Media & Society
Main Authors: Kuai, Joanne, Brantner, Cornelia, Karlsson, Michael, Van Couvering, Elizabeth, Romano, Salvatore
Format: Journal Article
Language: English
Published: 28.02.2025
ISSN: 1461-4448, 1461-7315
Description
Summary: This study investigates the performance of search engine chatbots powered by large language models in generative political information retrieval. Applying algorithmic accountability as a central theme, this research (a) assesses the alignment of artificial intelligence (AI) chatbot responses with timely political information, (b) investigates the factual correctness and transparency of chatbot-sourced synopses, (c) examines the adherence of chatbots to democratic norms and impartiality ideals, (d) analyzes the sourcing and attribution behaviors of the chatbots, and (e) explores the universality of chatbot gatekeeping across different languages. Using the 2024 Taiwan presidential election as a case study and prompting as a method, the study audits responses from Microsoft Copilot in five languages. The findings reveal significant discrepancies in information readiness, content accuracy, norm adherence, source usage, and attribution behavior across languages. These results underscore the need for contextual awareness when applying accountability assessments that look beyond transparency in AI-mediated communication, especially during politically sensitive events.
DOI: 10.1177/14614448251321162