Voice assistant applications have become part of the daily life of many people, providing the user with many important functions such as answering questions, searching for information quickly, playing music, or giving reminders of appointments.
Through friendly voices and pre-programmed answers to humorous questions or philosophical inquiries, the user feels that the voice assistant has become his digital companion.
Esther Gornemann, of the Vienna University of Economics and Business, explained that people tend to humanize things in order to explain processes they do not understand.
"When Cortana does not do what I ask of it, it might be because the app does not want to," she added. Participants in a recent study described the Alexa app with qualities such as "friend" or "magician," or even as "a new member of the family who sits with them at the breakfast table in the morning," a trend toward attributing human nature to voice assistant applications.
Esther Gornemann pointed out that there is a social drive to humanize things, a topic that has become more important with the Corona pandemic, and added, "The user tries to compensate for the lack of social relations with others. When a person suffers from loneliness and isolation, he tends more to create social ties between himself and things."
Professor Arvid Kappas, of Jacobs University Bremen, explained that, in general, there is no need to worry if a user notices that he talks to digital assistant applications a lot.
"We know that loneliness and isolation are among the worst experiences a person can go through. When one finds no opportunity to talk to others, he tends to humanize things," the German psychologist added.
Nevertheless, Professor Arvid Kappas emphasized the need to seek out social interaction by other means, such as phone calls with friends, relatives, and other real people.
Arvid Kappas is not surprised that children may view voice assistant applications as real creatures. "One should not be alarmed upon noticing that his children talk to their toys a lot and believe that the teddy bear has a soul," he added.
Children's ability to conduct complex interactions with inanimate objects is nothing new, and the new generation of voice assistant applications can understand language much better than before, yet it is still too early to hold in-depth conversations with them.
GPT-3 Technology
Esther Gornemann confirmed this view, but she believes the situation may change soon as artificial intelligence develops, adding, "With GPT-3 technology we have artificial intelligence that can formulate texts well and is characterized by creativity and a surprising diversity of uses. This language model is one of the basic components of voice assistant applications, with which we can create social contact. This will become problematic when people start to replace their social relationships and opt for voice assistant apps instead."
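GPT-3 itself is a proprietary model far beyond a short sketch, but the core idea of a statistical language model that "formulates texts" — predicting the next word from the words that came before it — can be illustrated with a toy bigram model. Everything below (corpus, function names) is illustrative, not part of GPT-3:

```python
import random
from collections import defaultdict

def train_bigrams(text):
    """Record, for each word, which words follow it in the corpus."""
    words = text.split()
    follows = defaultdict(list)
    for cur, nxt in zip(words, words[1:]):
        follows[cur].append(nxt)
    return follows

def generate(follows, start, length=8, seed=0):
    """Generate text by repeatedly sampling a plausible next word."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        options = follows.get(out[-1])
        if not options:  # dead end: no word ever followed this one
            break
        out.append(rng.choice(options))
    return " ".join(out)

corpus = "the assistant answers the user and the assistant reminds the user"
model = train_bigrams(corpus)
print(generate(model, "the"))
```

Models such as GPT-3 replace these raw word counts with a neural network trained on vast text corpora, which is what yields the fluency and versatility Gornemann describes.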
Professor Andreas Dengel, Director of the German Research Center for Artificial Intelligence, explained that voice assistant applications are only an additional means of communicating and speeding up various tasks, and therefore they will not be a useful means of comforting the soul.
Andreas Dengel added that the most digital assistant applications can do is feign empathy for the user, and even that only in a very limited way.
The German professor explained, "One needs negative conversations as well in order to feel empathy, and interpersonal communication is a very complex, multi-dimensional, multi-faceted matter that goes far beyond a conversation between the user and a voice assistant application."
Andreas Dengel warned that children should not play with voice assistant applications too much, despite all the wonderful functions they offer, because these applications may negatively affect their ability to communicate. The German professor explained, "Communication is not limited to language; it includes a variety of non-verbal forms, such as facial expressions and gestures, and one does not learn these things when dealing with voice assistant applications."
Despite these risks, Professor Arvid Kappas sees some advantages of voice assistant applications, especially for the elderly, as these digital tools can provide more freedom by serving as a personal companion for the elderly and helping with some matters, such as reminders to take medicines on time.
Arvid Kappas added that a natural language interface is better suited for the elderly who can no longer write well or look at screens, as the user can simply ask the voice assistant to call someone without having to search for or type numbers. Moreover, dealing with voice assistant applications is fun for most users.
Monitoring risks
Esther Gornemann added that voice assistant applications carry the risk of surveillance, explaining, "I see a problem in that we disclose more personal information when we create a social relationship with our voice assistant applications, because that happens completely involuntarily, and we may not be aware of it."
The companies producing voice assistant applications have developed patents for selecting keywords from voice inputs, which can be exploited for advertising purposes. In the long term, these companies will collect a great deal of information about users and thus learn which types of ads fit them.
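The idea behind such keyword-based targeting can be sketched in a few lines: spoken input is transcribed, then matched against lists of trigger words tied to interest categories. The categories, keywords, and function names below are hypothetical illustrations, not taken from any vendor's patented system:

```python
# Hypothetical interest categories mapped to trigger keywords;
# the systems described in real patents are far more elaborate.
AD_CATEGORIES = {
    "travel": {"flight", "hotel", "vacation"},
    "fitness": {"gym", "running", "yoga"},
    "pets": {"dog", "cat", "vet"},
}

def infer_interests(transcript):
    """Return ad categories whose keywords appear in a voice transcript."""
    words = set(transcript.lower().split())
    return sorted(cat for cat, keys in AD_CATEGORIES.items() if words & keys)

print(infer_interests("Remind me to book a flight and find a hotel"))
# matches the "travel" category via "flight" and "hotel"
```

Even this toy version shows why the practice raises concerns: a request as mundane as setting a reminder leaks an interest profile as a side effect.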
Esther Gornemann warned that ads can be individually tailored to suit specific situations, so that the user may not notice that his own behavior is being manipulated. She added, "The tech giants scrutinize the user down to the smallest detail, and there is no transparency on this issue yet. There is a risk that the user will act as the developers of voice assistant applications want, without noticing."