TagAlong: Informal Learning from a Remote Companion with Mobile Perspective Sharing
Author(s)
Greenwald, Scott W.; Khan, Mina; Vazquez, Christian D.; Maes, Pattie
Abstract
Questions often arise spontaneously in a curious mind, prompted by an observation about a new or unknown environment. When an expert is right there, prepared to engage in dialog, this curiosity can be harnessed and converted into highly effective, intrinsically motivated learning. This paper investigates how this kind of situated informal learning can be realized in real-world settings with wearable technologies and the support of a remote learning companion. In particular, we seek to understand how the use of different multimedia communication media impacts the quality of the interaction with a remote teacher, and how these remote interactions compare with face-to-face, co-present learning. A prototype system called TagAlong was developed with attention to features that facilitate dialog based on the visual environment. It was designed to work robustly in the wild, depending only on widely available components and infrastructure. A pilot study was performed to identify which characteristics are most important for successful interactions, as a basis for further system development and a future full-scale study. We conclude that it is critical for system design to be informed by (i) an analysis of the attentional burdens imposed by the system on both wearer and companion and (ii) a knowledge of the strengths and weaknesses of co-present learning.
Date issued
2015-10
Publisher
12th International Conference on Cognition and Exploratory Learning in Digital Age 2015 (CELDA)
Keywords
situated learning, remote learning, contextual memory, wearable technology, informal learning