Aligning artificial intelligence with human values: reflections from a phenomenological perspective

Detailed bibliography
Published in: AI & Society, Volume 37, Issue 4, pp. 1383–1395
Main authors: Han, Shengnan; Kelly, Eugene; Nikou, Shahrokh; Svee, Eric-Oluf
Medium: Journal Article
Language: English
Published: London: Springer London, 01.12.2022
ISSN: 0951-5666, 1435-5655
Online access: Get full text
Description
Summary: Artificial Intelligence (AI) must be directed at humane ends. The development of AI has produced great uncertainty about ensuring AI alignment with human values (AI value alignment) throughout AI operations, from design to use. To address this problem, we adopt the phenomenological theories of material values and technological mediation as a starting point. In this paper, we first discuss AI value alignment as treated in the relevant AI studies. Second, we briefly present the theories of material values and technological mediation and reflect on AI value alignment through their lenses. We conclude that a set of finite human values can be defined and adapted to the stable life tasks that AI systems will be called upon to accomplish. AI value alignment can also be fostered between designers and users through technological mediation. Upon that foundation, we propose a set of common principles for understanding AI value alignment through phenomenological theories. This paper contributes the unique knowledge of phenomenological theories to the discourse on AI alignment with human values.
DOI: 10.1007/s00146-021-01247-4