Project "Social Interaction with voice- and touch-controlled virtual assistants"
Project leader: Dr. Henrike Helmer
Team: Dr. Mathias Barthel, Dr. Henrike Helmer, Dr. Silke Reineke
Virtual assistants such as Amazon's Alexa (e.g., on Echo devices), Google Assistant (e.g., on Google Home devices), Apple's Siri, and built-in voice-controlled assistants in cars are increasingly used in everyday interactions. In our project, we analyze human interaction with virtual assistants in private settings. We focus on four intertwined aspects of these types of interactions:
- Action formation: How are certain actions – such as instructions or questions – designed in interaction with virtual assistants (compared to interaction with human co-participants)?
- Routines: From a (micro-)longitudinal perspective, are there certain linguistic formats that become routinized over time (i.e., do speakers increasingly use formats that have proven successful for the task at hand)?
- Action and intention ascriptions: When and how do users ascribe actions or intentions to the virtual assistants?
- Recipient design: The actions and routines described above can also reveal the concept that interactants hold of virtual assistants. Here we focus especially on potential changes in the recipient design of utterances over time that might reveal information about the partner model that interactants have of the virtual assistant.
We approach these questions within the framework of Interactional Linguistics and (multimodal) Conversation Analysis, combining them with corpus-linguistic and quantitative methods.