Naturality and non-transparency of technology in the age of intelligent voice assistants

Open Access
Article
Conference Proceedings
Authors: Jaroslaw Kowalski, Cezary Biele, Kazimierz Krzysztofek

Abstract: This paper describes the psycho-social consequences of using voice assistants, based on a new look at the concept of the interface as a tool that enables the use of another tool. Analysing the history of interfaces in terms of naturalness (ease of use) and transparency (visibility and understandability to the user) shows that they evolved towards unnaturalness for a long time, but have recently become more natural again. Voice assistants are one of the newest stages of this process. Their popularization has significant potential to strengthen human competence and abilities: barriers of technical skill will be weakened, while abilities and possibilities based on access to instant information will be strengthened, turning the user into a super-empowered individual. At the same time, this new type of interface will gain new, unprecedented features. The price paid for making the technology natural is its extreme non-transparency (its obscurity, complexity and convoluted nature). The amount of information an interface holds about its user can be expected to grow. This will create a far-reaching asymmetry: an interface trained on neural networks, with knowledge about millions of users at its disposal, will know how to make a particular user choose what is in the interest of the interface owner or undertake a desired kind of action. In the modern world knowledge translates into power, and this will be the first time in history that the apparatus standing behind an interface may gain an advantage over the user on such a scale. The non-transparency of this interface will cause users to understand their tool less and less, and in extreme cases the asymmetry in the relationship may go so far as to reverse the tool-user relation: the human being becomes the tool used to attain a particular aim, a means to an end, an extension of technology. A simple example is road traffic control in cities, where an application shows traffic density and navigates to a destination. The driver relies on information received from the system and has to trust that a given road is the most effective. It is enough for the application to pursue a different aim (e.g. directing traffic so that there are no accidents) for the user's aim (getting somewhere as fast as possible) and that of the application owner to diverge, and for the human being to become a tool helping the system achieve its aim. This situation thus serves as yet another example of Postman's "invisible technology", something that we do not see and which is supposed to be neutral by definition. However, as Martin Heidegger noted, many technologies have the greatest impact on us when they are perceived as neutral.

Keywords: Intelligent Voice Assistants, Internet of Things, Human-Technology Interaction, Smart Environments

DOI: 10.54941/ahfe100937

