From signals to knowledge: A conceptual model for multimodal learning analytics

References (65)

Publisher
Wiley
Copyright
© 2018 John Wiley & Sons Ltd
ISSN
0266-4909
eISSN
1365-2729
DOI
10.1111/jcal.12288

Abstract

Multimodality in learning analytics and learning science is under the spotlight. The landscape of sensors and wearable trackers that can be used for learning support is evolving rapidly, as are data collection and analysis methods. Multimodal data can now be collected and processed in real time at an unprecedented scale. With sensors, it is possible to capture observable events of the learning process, such as the learner's behaviour and the learning context. The learning process, however, also consists of latent attributes, such as the learner's cognitions or emotions. These attributes are unobservable to sensors and need to be elicited through human-driven interpretation. We conducted a literature survey of experiments using multimodal data to frame the young research field of multimodal learning analytics. The survey explored the multimodal data used in related studies (the input space) and the learning theories selected (the hypothesis space). The survey led to the formulation of the Multimodal Learning Analytics Model, whose main objectives are (O1) mapping the use of multimodal data to enhance feedback in a learning context, (O2) showing how to combine machine learning with multimodal data, and (O3) aligning the terminology used in the fields of machine learning and learning science.
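
To make objective O2 concrete, the following is a minimal illustrative sketch and not the model proposed in the paper: synthetic stand-ins for sensor streams (heart rate, gaze fixations, keystrokes) are fused by simple concatenation and used to train a classifier against human-annotated labels of a latent attribute such as engagement. All variable names, feature choices, and labels are hypothetical.

# Illustrative sketch only: observable sensor signals (the input space) are
# fused and mapped to a human-annotated latent attribute with a standard
# supervised learning pipeline. Data here are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_windows = 200  # hypothetical number of time windows in a learning session

# Observable signals captured by sensors and logs (hypothetical features).
heart_rate = rng.normal(75, 8, size=(n_windows, 1))    # wearable tracker
gaze_fixations = rng.poisson(12, size=(n_windows, 1))  # eye tracker
keystrokes = rng.poisson(30, size=(n_windows, 1))      # activity log
X = np.hstack([heart_rate, gaze_fixations, keystrokes])  # early fusion by concatenation

# Latent attribute (e.g., engagement) is unobservable to sensors; these
# labels stand in for annotations elicited by human interpretation.
y = rng.integers(0, 2, size=n_windows)  # 0 = disengaged, 1 = engaged (hypothetical)

# Standardize the fused features and learn a mapping from signals to the
# human-interpreted construct, evaluated with cross-validation.
model = make_pipeline(StandardScaler(), RandomForestClassifier(random_state=0))
scores = cross_val_score(model, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")

In practice the human-provided labels would come from coding schemes or self-reports rather than random placeholders, and the fusion step could be replaced by a more elaborate multimodal architecture; the point of the sketch is only the division of labour between sensed signals and human-interpreted latent attributes.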

Journal

Journal of Computer Assisted Learning, Wiley

Published: Jan 1, 2018

