Apps with Attitude

This research project proposes a new paradigm of interactive technology interfaces, referred to as interventional agent interfaces (IAIs). IAIs differ from standard computer interfaces in that they adjust their responses and behaviour (interventions) according to a child’s use of an interactive technology. The project will investigate why adults are motivated to intervene in a child’s use of an interactive technology and to what extent the motive for, and context of, an intervention influence the child’s response. It will also assess how anthropomorphising the IAI affects children’s trust in the agent. The project will adopt an empirical approach and aims to produce a theoretical model to inform the design of future interactive technologies supporting IAIs.

The proposed IAIs differ from internet monitoring software and other rule-based tools and content filters. Such tools are deployed primarily for security and safety: they help protect children and other users by ensuring that unsuitable applications and malicious content are not accessed. IAIs have a wider motivational context, and this project will analyse how a child’s existing habits and behaviour can be modified in response to an intervention made by the IAI during normal use of the technology. The motivation for such interventions may include goals such as improving productivity, enhancing understanding or simply facilitating fun, all of which fall outside the remit of rule-based systems and content filters.
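To make the contrast concrete, the following is a minimal, purely illustrative sketch of the distinction between a fixed rule-based filter and a context-sensitive intervention decision. It is not a design proposed by the project; all names (BLOCKLIST, rule_based_filter, UsageContext, choose_intervention) and the example rules are hypothetical.

```python
# Purely illustrative sketch: the names, rules and thresholds below are
# hypothetical and are not part of the proposed research or any existing tool.

from dataclasses import dataclass
from typing import Optional

BLOCKLIST = {"example-unsuitable-site.test"}

def rule_based_filter(url: str) -> bool:
    """A conventional content filter: a fixed allow/deny decision."""
    return url not in BLOCKLIST

@dataclass
class UsageContext:
    activity: str          # e.g. "homework", "game"
    minutes_on_task: int   # time spent on the current activity
    device: str            # e.g. "tablet", "family PC"

def choose_intervention(ctx: UsageContext) -> Optional[str]:
    """An IAI-style decision: the response depends on the context and the
    motive (productivity, understanding, fun), not on blocking content."""
    if ctx.activity == "homework" and ctx.minutes_on_task > 40:
        return "suggest_break"            # motive: productivity / wellbeing
    if ctx.activity == "game" and ctx.minutes_on_task > 90:
        return "prompt_switch_activity"   # motive: balance of activities
    return None                           # no intervention warranted

# Example: the filter gives one fixed answer; the IAI sketch responds
# differently to the same child in different contexts.
print(rule_based_filter("example-unsuitable-site.test"))            # False
print(choose_intervention(UsageContext("homework", 45, "tablet")))  # suggest_break
print(choose_intervention(UsageContext("game", 30, "tablet")))      # None
```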

Initially, the research will focus on the factors that motivate parents and carers to intervene in their child’s use of an interactive technology, and it addresses a number of questions. In particular, is the context or perceived importance of the task being undertaken a motivational factor? For instance, is a parent more likely to intervene if a child is playing a game rather than completing a homework task? On the other side of the coin, to what extent is a child’s response to an intervention governed by factors other than parental authority? How relevant is context to a child? And if the intervention is initiated by an IAI rather than a human, will other factors such as the persuasiveness or obtrusiveness of the IAI influence the child’s behaviour? As children’s interaction with interactive technologies increasingly occurs across a distributed array of mobile devices rather than a small number of relatively static technologies, such as the family PC or games console, is it possible for adults to intervene effectively in their child’s use of technology? And to what extent would an IAI be trusted to initiate and carry out interventions normally made by a human?

In addition to analysing the motivational rationale for IAIs, this project will examine their nature, in particular the extent to which anthropomorphism, the attribution of human characteristics to nonhuman agents, affects the perception of IAIs by both children and adults. It will assess whether IAIs that support anthropomorphism are more likely to be trusted than IAIs that do not exhibit human characteristics or behaviours. Trust, in the context of this research project, reflects confidence in the competence of the IAI to make appropriate interventions. The project will analyse different levels of anthropomorphism to assess their effect on trust. It will examine whether children are more trusting of interventions made by IAIs that display childlike rather than adult characteristics, whether gender affects children’s trust in an IAI, and whether the attractiveness of an IAI influences children’s trust.

The goal of the project is to produce a model grounded in the empirical research outlined above. The model will aim to address some of the gaps in theoretical knowledge and provide designers with a basis on which to develop future IAIs. The model will be critiqued through heuristic evaluation and further user studies.