Get 20M+ Full-Text Papers For Less Than $1.50/day. Subscribe now for You or Your Team.

Learn More →

Learning Analytics for Blended Learning: A Systematic Review of Theory, Methodology, and Ethical Considerations

Learning Analytics for Blended Learning: A Systematic Review of Theory, Methodology, and Ethical... Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, and Ethical Considerations https://doi.org/10.3991/ijai.v2i2.17887 ( ) Nina Bergdahl , Jalal Nouri, Thashmee Karunaratne, Muhammad Afzaal Stockholm University, Stockholm, Sweden ninabe@dsv.su.se Mohammad Saqr KTH Royal Institute of Technology, Stockholm, Sweden Abstract—Learning Analytics (LA) approaches in Blended Learning (BL) research is becoming an established field. In the light of previous critique toward LA for not being grounded in theory, the General Data Protection and a renewed focus on individuals’ integrity, this review aims to explore the use of theories, the methodological and analytic approaches in educational settings, along with sur- veying ethical and legal considerations. The review also maps and explores the outcomes and discusses the pitfalls and potentials currently seen in the field. Jour- nal articles and conference papers were identified through systematic search across relevant databases. 70 papers met the inclusion criteria: they applied LA within a BL setting, were peer-reviewed, full-papers, and were written in English. The results reveal that the use of theoretical and methodological approaches were disperse. We identified approaches of BL not included in categories of BL in existing BL literature and suggest these may be referred to as complex hybrid learning, and that ethical considerations and legal requirements often have been overlooked. We highlight critical issues that contribute to raise awareness and inform alignment for future research to ameliorate diffuse applications within the field of LA. Keywords—Literature review; learning analytics, blended learning, complex hybrid learning 1 Introduction 1.1 The emergence of learning analytics Given the wealth and complexity of learning, learning sciences have become an in- terdisciplinarity domain that includes cognitive, educational, social and computer sci- ence among others. Ten years ago, learning analytics (LA) emerged as a multidiscipli- nary research domain with the overarching premise to harness the power of data and analytics to advance our understanding of learning as well as help improve learning, 46 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … teaching and optimize the learning environments [26]. As such, LA is defined as “the measurement, collection, analysis and reporting of data about learners and their con- texts for purposes of understanding and optimizing learning and the environments in which it occurs” [83]. Interest in LA was catalysed by three main factors; firstly: the rewarding business intelligence and big-data success stories that has contributed to in- dustry growth and added competitive advantage to companies by allowing them to bet- ter understand customers and offer better recommendations (ibid.). Secondly, the avail- ability of immense volumes of digital traces and clickstream data recorded by learning management systems and other digital learning environments, such as student’s infor- mation systems, online library platforms and video streaming services. Third, the rev- olutionary developments in data science methods and improved computer hardware that became more powerful and accessible [15, 21, 83]. Inspired by the industry, LA has initially used digital trace data to create predictive models to for example, forecast drop- outs, identify students at-risk of failing, or offer visual dashboards. However, criticism had been levied at these models for failing to account for contextual factors, being athe- oretical and difficult to replicate [25, 100]. Research has since extended both data col- lection, analysis methods and approaches to theory in LA research. Recently, data col- lection methods have grown in volume and diversity to cover the full breadth of learn- ers’ activities e.g., classroom interactions, physiological indices, proximity data, eye tracking, self-reports in addition to the commonly used digital traces. Similarly, meth- ods have grown to include sequence and process mining, epistemic network analysis as well as advanced machine learning algorithms [16, 20, 62]. Today, when it is common that students’ educational reality is an integration of physical and virtual learning environments, often referred to as a Blended Learning (BL) [37-39], the data collection methods are often broadened to survey both the phys- ical and the virtual spaces. Applying such data collection and theoretically aligned re- fined models, researchers hope to capture learners’ behaviour where it occurs. The study of LA and BL is a growing field of inquiry [48], which is garnering a growing attention in and outside the LA community. The rationale for undertaking this review is that LA research has been criticised for not being grounded sufficiently in theory. For example, that perspectives of learning are lacking [57], and that BL research have re- mained vague and unclear, why there has been a call for research to further develop definitions of BL, models and conceptualisations [46]. Moreover, as LA is becoming an established research field of its own, it is interesting to explore the methodological practices applied across LA research and with the fast-paced development of big data analytics on individual trace data and the implementation of the General Data Protec- tion Regulations of the European Union (GDPR), valid concerns might be raised with regards to if and how ethical and legal considerations have been applied. Accordingly, a systematic review that identify theoretical underpinnings, methodological practices, considerations of ethical aspects and legal requirements and the contributions in LA research is needed to raise awareness and inform alignment for future research. iJAI ‒ Vol. 2, No. 2, 2020 47 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … 2 Background 2.1 Learning analytics in blended learning Blended Learning is a term coined in the 90’s, which related practices over the years gained substantial influence, and today is regarded as the “new normal” [46]. The con- cept of BL, its operationalisations and definitions are wide and still evolving [1]. Re- searchers have highlighted that BL is a broad term that may reflect different variations to what is blended, the extent and duration of the blend, models of blended learning, a systematic approach that include any part of a system that combines face-to-face (f2f) instruction with instructions meditated be digital technologies [37, 46] which com- monly include blended instruction, blended delivery, blended learning environments [37] spatiotemporal aspects, where learning can be self-paced and individual qualitative aspects [56] which reflect thoughtful integration [31]. In short, BL can be viewed as an umbrella term, that without specific descriptions will not inform the reader what aspects of teaching and learning that is approached. In addition, LA approaches has been critiqued for being atheoretical [32, 57]. In LA and BL research, as in other educational research, the main aim is to support students to succeed with their education. Such effort can be demonstrated by including theory that provides guidance for how to understand, operationalise, measure and interpret for example students’ engagement in learning. Engagement theories may emphasize different aspects, for example agency [5] or cognition [6] related to self-regulation (SRL). Student engagement is critical for learning, and from this perspective LA in BL is warranted as it combines the BL setting with theoretical insights of students use of digital technologies, their ability to self-regulate to re-engage in the face of difficulty, distraction, frustration, simultaneous social demands et cetera. However, critique has forwarded that LMS data may not be suitable to capture a nuanced understanding of student engagement, this as engagement is a multi-dimensional construct, and LMS data, still, at best can reflect a one-dimensional aspect of engagement, [42]. Moreover, researchers [42] could not find any significant correlation between student self-reports of engagement and the LMS trace data. Thus, if the approaches and applications of BL, theories from learning perspectives and how these are operationalised are lacking, and self-reports and traces data are not correlating, this decreases the value of the contribu- tions in the field of LA. Previous reviews have surveyed theoretical underpinnings in LA research, and concluded that the grounding in (educational) theory is evident but too often meagre or lacking [e.g. 57, 96]. For example [57] concluded that existing learning analytics dashboards research are rarely grounded in learning theory, cannot be suggested to support metacognition and thus, do not inform teachers of effective practices and strategies. LA reviews have also explored methods applied within LA research [e.g. 6] and identified that LA studies use diverse methods, for example, visual data analysis and clustering techniques, social network analysis (SNA) and educational data mining. Taken together, however, the existing reviews on LA research have not taken contributions into account, such approach is critical, as if applications of BL, theories from learning perspectives and how these are operationalised are insufficient or lacking, the contributions becomes unclear. 48 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Today, there are a substantial number of Learning Analytics reviews. LA reviews often specialise on particular areas like for instance: game learning analytics [9], visual learning analytics, [96], the role of self-regulated learning (SRL) [98], learning analyt- ics dashboards [57], or the uses of LA in relation to specific methods or approaches e.g.; open learner models [10] educational data mining [12] or apply a wider scope that explore national research ef-forts, policies, infrastructures and competence centres across several European countries [68]. While several reviews highlight similar findings (i.e. a lack of theoretical underpin- ning, unclear uses of methods) there is a risk of transferring and projecting findings across LA research, as findings which might not reflect the broader LA research, which in turn may lead to overgeneralisations. Although there are many published (systematic, scoping and area-specific) reviews on LA in online settings, in order to understand their aim, objective and contribution, it is beneficial to approach a less specific overview of LA research to survey commonalities, of theoretical underpinnings (including concep- tualisations of BL and learning perspectives) methodological approaches, ethical and legal requirements and contributions. However, in addition to theoretical and methodological aspects, an additional layer of complexity is added to LA research in a BL environment. LA is in itself a practice of gathering, analysing and sharing big amounts of personal data, which comes with an increased need for ethical considerations and adherences to legal requirements. The ethical, privacy and legal concerns of processing of personal data are on the frontier of data processing due to the presence of the GDPR [14]. LA is a subject developed on data-driven approaches to education innovation, and hence, in the spotlight of this con- cerns. Beyond ethics, the GDPR provides a legal framework in preserving the rights of the data subjects, that is: the students. Learning analytics operates on data about stu- dents and their learning environments, where personal data of the students is an integral part. Personal data of students refers to any data that directly or indirectly connected to an identifiable person, e.g., student names, personal identification numbers, email, pho- tographs, and other data that could lead to identifying an individual [29]. It is typical that learning and student management systems store, retrieve, and process such data, driven by different academic and learning purposes [15]. While the absence of ethical considerations [16], [17], privacy issues and GDPR [18] have been previously critiqued in regards to the adoption of LA we did not find any existing review that had explored these aspects of GDPR, ethics and privacy on LA research. Therefore, in this study, we have added a focus on how the reviewed studies consider ethical and legal aspects of using data. Informed by these previous concerns and critiques we raised the following questions: 1. How is blended learning defined in the reviewed learning analytics research? 2. For which learning focus perspectives are theories used in the reviewed learning an- alytics research? 3. What approaches of data collection, methods and analysis are evident in the reviewed learning analytics research? 4. How are ethical and legal aspects considered in the reviewed learning analytics re- search? 5. What are the contributions of the reviewed learning analytics research? iJAI ‒ Vol. 2, No. 2, 2020 49 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … 3 Method 3.1 Search strategy and selection procedure This study examines academic journals and conference papers applying Learning Analytics in Blended Learning from two databases (see Table 1). A systematic search was conducted using EBSCOhost via Stockholm University library (filtered by content providers: Scopus, ERIC, Academic Search Premier, Directory of Open Access Jour- nals, ScienceDirect) for academic journals, and ACM DL Digital Library, for confer- ence papers. As detailed below, the systematic search via EBSCOhost followed educa- tional journals by status [13], and the selection employed journal rankings provided by SCIMAGO Institutions Rankings. Table 1. Overview of search string results Database and journal search strings Hits EBSCOhost database search string “learning analytics” + “blended learning” 79 “learning analytics” + “blended learning” (incl. “within full text of articles”) 282 “learning analytics” + “blended environment” 0 “learning analytics” + “blended learning environment” 2 “teaching analytics” + “blended /”-learning”/ “-environment” 0 “educational data mining” + “blended learning” 8 “educational data mining” + “blended” 8 “educational data mining” + “blended environment” 0 ACM DL Digital Library “learning analytics” + “blended learning” + “LAK” (Learning Analytics & Knowledge confer- ence) “educational data mining” + “blended learning” + “LAK” (Learning Analytics & Knowledge conference) “learning analytics” + “blended learning” 43 “educational data mining” + “blended learning” 22 Journal search via EBSCOhost via Stockholm University Library and the Journal of Learning Analytics “blended learning” 7 “learning analytics” + “blended learning” 6 “educational data mining” + “blended learning” 3 Internet and Higher Education “learning analytics” + “blended learning” 14 “educational data mining” + “blended learning” 8 Educational Technology and Society “learning analytics” + “blended learning” 2 “educational data mining” + “blended learning” 0 Journal of Computer Assisted Learning “learning analytics” + “blended learning” 2 “educational data mining” + “blended learning” 0 British Journal of Educational Technology “learning analytics” + “blended learning” 16 50 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … “educational data mining” + “blended learning” 5 Computers in Human Behavior “learning analytics” + “blended learning” 26 “educational data mining” + “blended learning” 17 Computers and Education, Communications in Information Literacy, Learning and Instruction, International Review of Research in Open and Distance Learning, Edu- cational Evaluation and Policy Analysis International Journal of Mobile and Blended Learning “learning analytics” + “blended learning” 0 “educational data mining” + “blended learning” 0 The search combinations used in SCIMAGO: Social Sciences + E-learning + All regions / countries + Journals + 2017; Social Sciences + Education + All regions / coun- tries + Journals + 2017; Computer Science + Human-Computer Interaction + All re- gions / countries + Journals + 2017; and Computer Science + Human-Computer Inter- action + All regions / countries + Journals + 2017. Inclusion from each 4 search com- binations above was determined by relevance of the title and the choice was limited to the top-10 journals in each search combination. We identified papers from the following six journals Internet and Higher Education, Journal of Computer Assisted Learning, British Journal of Educational Technology, Computers in Human Behaviour, Educa- tional Technology and Society, and the Journal of Learning Analytics. To search com- binations used for the EBSCOhost database, we used combinations of keywords: “learning analytics” + “blended learning”; “learning analytics” + “blended environ- ment”; “teaching analytics” + “blended learning”; “teaching analytics” + “blended en- vironment”; “educational data mining” + “blended learning”; “educational data min- ing” and “blended environment”. We included peer-reviewed, academic journals, writ- ten in English. We also tried including a “search within full text of articles”; and screened titles and abstracts of the papers for inclusion, and remove duplicates. We decided to not utilise the function further, as it returned irrelevant articles where BL and LA were mentioned only in the reference section. We searched for articles pub- lished the between January 2013-July 2020. Overall, the keyword searches amounted to 304 hits (not including the search within full text of articles). After removing the duplicates, 193 journal articles and conference papers remained; 38 hits did not return full texts and 4 hits returned hits in other lan- guages (three in Danish and one in German) although the search criteria were aiming at English texts only. After that, we sifted through the remaining papers and excluded 32 papers that were not directly relevant to LA and BL, and 49 that lacked one of the two focuses (either LA or BL). During close-reading, an additional three papers were excluded, as they did not meet the inclusion criteria. Thus, which we proceeded to code and later analyse 70 papers. 3.2 Data coding and analysis Following a coding scheme all articles were read through by two authors, who sorted the content in: article data (country, publication year, title), educational context, (blended learning interpretation and level), research aims/questions, theoretical iJAI ‒ Vol. 2, No. 2, 2020 51 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … underpinnings and definitions of BL, data sources, data collection methods, ethical con- siderations, analytical methods and results and contributions. All authors then con- ducted a deeper analysis of one section of the reviewed articles each (1. theoretical underpinnings, 2. data collection, methods and analysis, 3. ethical and legal considera- tions and 4. contributions). In depth discussions were held between the authors to dis- cuss approaches and align findings. 4 Results The result section details the findings as follows: 4.1 Theoretical underpinnings, 4.2 Data collection, methods and analysis, 4.3 Ethical and legal considerations and 4.4 Con- tributions of the reviewed articles. 4.1 Theoretical underpinnings To discern the positioning of the articles in terms of their relation to BL, we analysed the articles with regards to how blended learning was used throughout the articles, in particular how frequent the authors refer to blended learning, their definition, descrip- tion and use of theory. Currently, BL literature [e.g. 37, 39] have identified three com- mon ways in which explorations of blended learning delivery may vary: blended in- struction, blended distribution, as identified in [14, 15] or blended pedagogies [54]. However, going through these descriptions, we also found that studies could displayed a combination of blended instruction and blended distribution; i.e. when a section of the course is provided fully f2f, followed by the remainder offered fully online [79] or reversed: a course is delivered fully online and then fully f2f [49]. We also identified that the BL was used in ways beyond these categories. We identified a combination of blended learning approaches in which some, but not all, students use the BL component. For example, we identified studies that offer i) optional adoption of the blended com- ponent to the students, [e.g. 43, 80, 87], or when ii) there was a synchronous teaching of f2f students and online students in the same classroom [9] and iii) in cases of reversed distribution; a channel directed exclusively from the student to peers and/or teachers, in which the teaching (distribution, delivery and pedagogy) has remined traditional, for example as an e-portfolio accessible in a social network [35] or in flipped classrooms, where students responded to distributed (asynchrounous) media and instruction in their own time and place. [40]. 52 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Table 2. Blended learning definitions Blended learning definitions “Technology to support face-to-face teaching [2] and to enhance student participation” (Liao & Lu, 2008). [7] [32] “Blended learning system as one which combines face-to-face instruction with computer-medi- [58][63] ated instruction with the aim of complementing each other” (Graham, 2006; 2009; 2013) [75][102] “The range of possibilities presented by combining Internet and digital media with established [23] classroom forms that require the physical co-presence of teacher and students” (Friesen, 2012) “B-learning is the form of learning environment where the traditional classroom teaching and face-to-face communication between teacher and students are blended with the computer-medi- [30] ated interaction “(Bubaš & Kermek, 2004) “Blended learning is a combination of traditional face to face learning and online learning. It has the advantages of the both, providing students with unique flexible learning experience and be- [36] coming one of the fastest growing trends in educational field” (Thorne, 2003) “The thoughtful integration of classroom face-to-face learning experiences with online learning [41] [76] experiences” (Garrison & Kanuka, 2004) “Taking the best from self-paced, instructor-led, distance and classroom delivery to achieve flex- ible, cost-effective training that can reach the widest audience geographically and in terms of [44] learning styles and levels” (Marsh & Drexler, 2001) “The integration of thoughtfully selected and complementary face-to-face and online approaches [60] and technologies’’ (Garrison & Vaughan, 2008) “Blended learning is learning that happens in an instructional context which is characterized by a deliberate combination of online and classroom- based interventions to instigate and support [94] learning. Learning happening in purely online or purely classroom-based instructional settings is excluded” (Boelens,Van Laer, De Wever & Elen, 2015). Table 2 shows an overview of the used definitions of blended learning. While 29% of the articles offered a clear definition, most articles relied on inferences or contextual descriptions. 18% of the articles neither inferred nor described BL. The articles that offered a definition most commonly cited Graham [37-39] Analysis from a learning focus perspective: Revealed five themes reflecting the perspective of the research: (i) the flipped classroom, (ii) collaborative learning, (iii) conversational aspects of learning, (iv) engagement and self-regulation operationalised using system trace data and (v) learner profiles and procrastination. Studies that include theories are presented in a condensed and summarised form (the others are not). 1. The flipped classroom: While most studies exploring the flipped classroom, ap- proached student engagement and learning, a few were focusing on the actual learn- ing situation [19–21]. These studies applied a more over-arching, abstract level of theory to inform their study, and also discussed their findings in the light of theory. However, while SRL were, by far, the most commonly used theory to explore flipped classroom design, most studies did not seek to explore the blended learning environ- ment. 2. Collaborative learning: Social Network Analysis was used to visualise online in- teractions, and identify productive behaviours and correlation with performance [35, 41, 43, 81]. These used constructivist and situated learning theories and theories of self-regulation. iJAI ‒ Vol. 2, No. 2, 2020 53 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … 3. Conversational aspects of learning: Studies exploring conversations aspects of learning, most commonly approached feedback operationalised as online reports, re- ferring to feedback and assessment theories [e.g. 72, 90], or deep learning theories [40, 51]. Another type of input to learning was explored by [89], grounded their study in the Dispositional Learning Analytics (DLA) infrastructure, used previous publications on assistant conversational agents, and theories on cognitive load in mi- croblogging. Using the foundation of the Community of Inquiry framework, which prioritises teacher presence, and active participation, [79] used trace data to operationalise active participation, as the number of: messages sent, documents uploaded, chat sessions attended, as well as data collected to analyse teacher presence. 4. Engagement and self-regulation operationalised using system trace data: Out of all the theories applied, engagement in general and self- regulated learning (SRL) in particular, were the most commonly used. To add to these research approaches, as- pects of culture and gender were introduced and explored [86]. While SRL often was operationalised as observable indicators in system logs, motivation was approached by measures of self-efficacy, intrinsic value, test anxiety, cognitive strategy and self- regulation using a questionnaire. SRL was often operationalised as trace data, and combined with other engagement and learning theories. [36, 52, 58, 80]. Numerous studies explored relations between trace data, performance and SRL using self- re- ports [30]; some in combination with other theories, for example theories of motiva- tion [20] socio-cultural perspectives [31], Self-Determination Theory and the Con- trol-Value Theory of achievement emotions [86-88]. 5. Learner profiles: We identified that with studies exploring learner profiles, it was common to inform this approach with other theories, for example, course satisfaction and social constructivist theory [18], deep and shallow learning [32] active learning and engagement [35] and procrastination [1, 54, 66]. In the reviewed studies, student learning strategies was often operationalised as trace data on student interaction with online learning resources [33]. Amongst these, procrastination was found to be com- mon. Several studies operationalised SRL as procrastination [54, 66]. For example, using LMS data to survey time spent studying and time spent refraining from access- ing available data [54]. Procrastination was also explored without relation to SRL, or how long the student waited before accessing LMS materials [1]. Other research- ers used questionnaires to survey procrastination and risk taking using the Expec- tancy-Value Theory, motivation, using the Academic Motivational Scale and help seeking, and epistemic emotions to inform a to approach how different learning strat- egies relate to preferences of feedback [66]. In sum: While most reviewed studies approaching a flipped classroom, used theories with a focus on students and their engagement and learning, a few were focusing on the actual learning situation [69, 75, 76, 94] or combined flipped classroom theories of Computer Assisted Language-Learning [34]. The latter studies applied a more over- arching, abstract level of theory to inform their study, and also discussed their findings in the light of theory. Some studies argued that there is a need to develop a specific SRL-LA theory [63]. However, while SRL were, by far, the most commonly used 54 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … theory, most studies did not seek to explore the blended learning environment, but seemed to relate their data collection to operationalisations related to a learning per- spective, with or without underpinning theories of learning. 4.2 Data collection, methods and analysis All studies included in the review used a digital platform for collecting data. As can be expected, LMS was the most used platform for data collection (this was true in 56 studies, 89%). Among them 14 studies (25%) used more than one platform for data collection (for a full overview of studies and data sources, see Appendix A). A single study used custom LMS, two studies used video streaming software, and one study used wiki. Digital traces were the most collected data types in (90.5%), followed by self- reported surveys 27 (42.9%). Self-reported surveys were used to collect data about stu- dents’ depositions such as engagement, motivation and learning styles. Relational and social network data from computer supported mediated interactions were collected in eight studies (12.7%). Interviews were collected in five studies, video or observation in three, multimodal data were collected in two studies, and, transcripts of classroom in- teractions were reported in one study. Most of the data collected in the reviewed studies were digital data (see Table 3). Data were collected from the classroom in only six studies, where two other studies reported on multimodal data, and four studies used video recording and observation of classroom setting. [45] used multimodal data through a system called SPACLE to rec- ord classroom interactions among students and teachers. The interactions recorded in- cluded on-task, off-task, talking to class, outside or inactivity data. The system allowed for spatial data about positions of the users in the class, and their activity levels. [85] used classroom observations to report on the teachers and students’ classroom behav- iour, although the methods do not clearly describe in detail what was observed and how it was reported. [81] collected f2f data to measure teaching presence according to the community of inquiry framework. Transcripts of audio recordings of the lessons facil- itated the thematic content analysis. Real-time classroom observations were also done. Performance data such as grades or continuous assessment were collected from most studies (88.9%). While LMS data may be informative, it does not capture the f2f learn- ing environment, the process of learning, or the student-teacher or the student-student dynamics. The stark contrast between results collected from digital resources and class- room represents an obvious gap. Most data were gathered using digital traces, disposi- tional self-reports, relational data and interviews that are disconnected from the class- room where a significant amount of learning happens. iJAI ‒ Vol. 2, No. 2, 2020 55 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Table 3. Types of data collected and their percentage across all studies Data type Y % No % Trace 57 90.5 6 9.5 Survey 27 42.9 36 57.1 SNA 8 12.7 55 87.3 Interviews 5 7.9 58 92.1 Observation/video 3 4.8 60 95.2 Multimodal 2 3.2 61 96.8 Discourse 2 3.2 61 96.8 The analysis methods in most of the studies (98.4%) employed were the traditional descriptive statistics, frequentists, and group comparisons, that included correlations, comparison of means, and chi-square (see Table 4). Visualisation was used in a signif- icant number of studies (77.8%), in the context of explaining results, but not necessarily as a research objective. Thus, few studies used visualisation as their research objective. However, we also found evidence of development of systems that gather information from different data sources to provide visual analytics to enhance feedback offered to students [102]. However, such application of visualisation was rare. Table 4. Overview of analysis methods Methods Count % No % Statistics 62 98.4 1 1.6 Visualisation 49 77.8 14 22.2 Regression 32 50.8 31 49.2 Machine learning or AI 29 46.0 34 54.0 Clustering 21 33.3 42 66.7 SNA 10 15.9 53 84.1 Sequence or process mining 10 15.9 53 84.1 Qualitative 9 14.3 54 85.7 Data mining or text analytics 5 7.9 58 92.1 Regression analysis were used to predict performance, or forecast learning outcomes in 29 studies (46%). Results show that prediction of performance is the main research objective for learning analytics in blended learning. In 88.9% of all the studies included performance, prediction or optimisation as the main objective. In 33.3% studies meth- ods for unsupervised classification of students by means of clustering studies were used to categorise students according to certain criteria such as learning strategies, baseline disposition, learning process sequences or self-regulation. Sequence mining appears to be gaining in the learning analytics field with 15.9% of the studies exploring the con- cept, and, in most of the times it was coupled with clustering and visualisation. Yet, all the studies in this category have not researched the impact of these visualisation on teachers or learners. Studies that used SNA in the analysis are 15.9%, and, similar to process mining research, all of the articles have not used visualisation techniques for the sake of helping students or teacher to optimise learning. Qualitative research was performed in nine studies through the analysis of interviews or transcripts. Data mining and pattern recognition was performed in five studies. 56 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … 4.3 Ethical and legal considerations Irrespective of the necessity for considering ethical obligations in the use of student data, the papers did not provide documentations of such responsible use of personal data. Almost all the literature examined in this study, that is 99% of the articles, pri- marily focuses on LMS data. However, the ethical and legal aspects are very much under-represented in the discussions with only eight articles provided a clear evidence that they do not count on personal data or the data are de-identified. 22 of the 70 reviewed articles (31%) mentioned anonymising students. Nevertheless, it is important to recall here that hiding the student names from the data set is not enough to guarantee that individuals cannot be identified [40]. For exam- ple, if a student who enrolled in a course in a specific year, with specific major and so on could possibly have a significant probability of resulting in a perfect attribute set for identifying a specific student. Such events might raise red flags for ethical concerns of how legible is it to consider that anonymisation of data is sufficient (ibid.). Although 40% of the articles indicate that they, at some point, considered ethical aspects when collecting data, which are those ethical aspects were, how did these aspects mattered in the data collection, processing and outcomes, were not been mentioned in any of the reviewed studies. An important observation here is that at least 24 papers among the reviewed studies explicitly focus on the collection, analysis and managing of individ- ual’s personal data. Although a more profound discussion to explicate the instrumenta- tion of the legal and ethical procedure of retrieving and processing the sensitive pieces of the data is anticipated, a considerable gap in this focus in the articles is inevitable. Thirteen articles reported studies from Europe, but only six articles are mentioning that they have considered legal aspects and informed the students before the data collection, or the data has been anonymised. As nearly all of the studies were conducted prior to the GDPR rules in the EU [29], new and rigorous practices need to be applied in future LA approaches. 4.4 Contributions The contributions of reviewed studies could be classified into three themes, such as i) understanding and predicting performance, ii) understand student’s behaviours and profiles, and iii) understanding and improving the learning environments (for an over- view, see Table 5). iJAI ‒ Vol. 2, No. 2, 2020 57 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Table 5. Overview of contributions of reviewed studies Theme Aim/objectives Contribution Random forest, Linear and logistic regression, ensemble model- ling predicts academic performance with satisfying accuracy Forecast model could predict at-risk students Predict academic Visualisations are helpful for teachers to detect anomalous situa- 11 performance tions Two to six-week data is enough for future academic performance prediction Portability of pre- Portability of predictive models is low across courses dictive models LMS variables vary among general and course-specific models Data variables related to LMS engagement, self-regulated learn- Association of Understand ing, and collaborative learning are corrected with students’ aca- variables with per- /predict demic performance 9 formance predic- performance Tracking data is not significant predictors of academic success tion for some courses (e.g., graphic design) Seven factors found that affect students’ academic performance, Identification of consisting of four online and three traditional factors affecting 3 Four factors each found for both collaborative and self-regulated learning outcomes learning that affects students’ outcomes Social network metrics can be used as predictors Influence of social Number of interactions do not significantly correlate with student network with per- performance 3 formance SNA based upon questionnaires provide useful indicators for a more fine-grained analysis Two learning profiles identified based on student’s participation in online activities Identification of Identified learning behaviours before and after midterm exams behaviours /learn- 6 Different self-regulated learning behaviours identified based on ing patterns resources utilisation and procrastinator nature Found five learning trajectories with the varied resource use Four student clusters were observed based on their performance measures Based on interactions with the video annotation tool four profiles emerged Clustering/profil- Six profiles emerged from nine trace variables and student’s in- Understand ing-based student formation system data 8 student’s learning Based on students’ viewing behaviours, they were clustered into behaviours behaviours three groups and profiles Based on usage of LMS three profiles discovered Based on learner control, scaffolding, and interaction three self- regulated profiles appeared Self-assessment exercises, regularly resource access, and active online behaviour are significantly correlated with learning out- comes Relation between Use of videos annotations, metacognitive skills, and motivational profiles/learning strategies are weakly associated with learning achievements 15 behaviours on Procrastination behaviour, low level of participation, and worked learning outcomes examples could affect students learning outcome Found that students tend to change their learning behaviour throughout the course 58 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Resources access, LMS access time, and active learning days have a positive influence on learning results Fully worked-out solutions and engagements have adverse ef- Learning re- fects on students’ achievements sources and Personalised feedback has a small to medium positive effect on activities in 15 the learning outcome relation to Visualisations feedback allows students to make a better diagno- learning outcome sis of their performance Understanding Learning analytics-based interventions can improve student aca- and improving demic achievement the learning environments Video viewing patterns, resource utilisation, and order of activi- ties provide feedback to enhance classroom teaching and re- sources Improvements in Visualisation-based learning analytics allow teachers to identify course design, which learning design elements should be revised and improved 8 content, and Differences in instructional approaches during f2f and blended instructions courses are very likely due to the different class formats An understanding develops teachers’ interventions through learn- ing activity redesign can cultivate better learning attitudes 1. Understanding and predicting performance: To predict students’ academic per- formance, random forest, linear and logistic regression, and ensemble modelling based predictive models provided satisfying results (over 70% accuracy) [2, 51, 77, 80, 81, 102]. Similarly, a forecast learning outcome model (FLOM) was developed using interactive data to predict at-risk students [67]. However, FLOM achieved lower accuracy than other predictive models. On the contrary, student’s data visual- isations found helpful for teachers to detect anomalous situations [97]. Regarding appropriate time for prediction, studies discovered that two to six weeks data is suf- ficient to obtain accurate prediction [51, 53]. However, the portability of predictive models across courses remains low [23, 32]. Since prediction is entirely dependent on the supplied data, studies identified that LMS variables (e.g., access time), en- gagement indicators, self-regulated learning (e.g., self-efficacy and test anxiety), and collaborative learning (e.g., social stability, and time spent on task) variables have reliable predictive power due to their positive correlation with students achievements [2, 30, 71, 80, 90, 91, 102]. Nevertheless, for some courses (e.g., graphic design) tracking data became useless because different patterns exist in the effect of individ- ual data variables [32]. Reviewed studies also disclosed that social network metrics (e.g., degree, authority and PageRank) could be employed to predict student perfor- mance [15, 43, 81]. However, using these metrics, the representativity of the predic- tive models would be limited [81]. In factors identification, four online (e.g., activi- ties, video clicks, videos backwards and practice score per week) and three tradi- tional (e.g., participation in after-school tutoring, homework and quiz scores) factors were identified that affect students’ performance [53]. While, attendance, time spent in class, sitting position, and groups are essential for collaborative learning and self- efficacy, positive strategy, less anxious and less usage of negative strategy found important for self-regulated learning [2, 72]. 2. Understand student’s behaviours and profiles: To identify students learning pat- terns and behaviours, studies utilised student’s participation, resources access, and iJAI ‒ Vol. 2, No. 2, 2020 59 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … other LMS data. For instance, based on students participations two learning behav- iours emerged: sensing where students are more likely to participate in information access, interactive and networked learning activities, reflective where students are more predisposed to materials development activities [53]. Similarly, a study identi- fied behaviours before and after midterm exams, for example, out-degree centrality, LMS visit, and time spent before midterm exams and discussion views and visit in- terval regularity after the midterm exam [51]. In the self-regulated learning context, based on resources access three patterns emerged: self-regulator, external sources users, non-self-regulatory and based on LMS data four behaviours emerged: contin- uously active, probers, procrastinators and inactive [12, 20]. Likewise, based on re- source use five different learning trajectories discovered: overall below-average ac- tivity, average resource use, higher use of resources, most active students, least ac- tive students [56]. Similarly, studies clustered and profiled students based on their learning behaviours, for example, four clusters (achievers, regular, half-hearted, un- derachievers) discovered using students’ performance measures [57]. Likewise, us- ing video annotation tool interactions, four profiles were created, such as minimal- ists, task-oriented, disenchanted, and intensive [60]. Correspondingly, students viewing behaviour were adopted to cluster consistent, slide intensive and less inten- sive students [27]. On the other hand, utilising e-tutorial and information systems data, six profiles emerged, which were the difference in overall activity level and the use of worked-out solutions [62]. In the same vein, based on LMS usages, three clusters were generated such as low, acceptable, and good and students have differ- ent patterns of learning behaviour in these clusters [59]. In self-regulated learning context based on authenticity, personalisation, learner control, scaffolding, and in- teraction, three profiles were identified such as self-regulating, external regulating and lack of regulation [21]. On the other hand, a considerable number of studies contributed in terms of identifying the association and effects of different learning behaviour on students’ achievements. For instance, self-assessment exercises, regu- larly resources access, active online behaviour, and time management are signifi- cantly correlated with student learning outcome [5, 18, 23, 34, 49, 52, 63, 79]. While, the use of videos annotations, metacognitive skills, and motivational strategies are weakly associated with learning achievements [54, 55, 73]. On the other side, pro- crastination behaviour, low level of participation, and dependency on worked exam- ples could affect students learning outcome [54, 60, 89]. Furthermore, few studies discovered that students have a tendency to change their learning behaviour through- out the course and comparison can be conducted between successful and non-suc- cessful students based on their learning patterns [34, 49]. 3. Understanding and improving the learning environments: Reviewed studies dis- covered that course material access without lapses, LMS access time, active learning days and teachers’ monitoring influence learning results [1, 7, 8, 44, 45, 65]. Whereas, worked-out solutions and engagements create adverse effects on students’ achievements [41, 66, 85]. In the context of feedback provision, personalised feed- back have a small to medium positive effect on the learning outcome [71]. In terms of intervention, learning analytics-based interventions improved student academic achievement, with a 10.6% higher score than blended learning without intervention 60 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … [36]. Concerning improvement in courses, video viewing patterns, resource utilisa- tion, course item frequencies and order of activities provide enough feedback to en- hance classroom teaching and course resources [19, 23, 34]. Similarly, visualisation- based learning analytics allow teachers to identify which learning design elements should be revised and improved [59]. 5 Discussion RQ1 and RQ2: We raised the questions how blended learning is defined and how learning theories and perspectives are used in the reviewed learning analytics research? In line with [46] we conclude that BL seems to have become somewhat of a meta- concept. Thus, as have been detailed in the results, blended learning is often not an adoption of one pure type of blended learning, but a combination of blended learning approaches. When BL approaches are combined, there is a greater complexity than in a "simple blend", why we propose that this can be referred to as complex hybrid learn- ing. For example, we see combinations of blended instruction and blended distribution: optional adoption [e.g. 66, 80], synchronous teaching of f2f students and online stu- dents [6] and iii) reversed distribution; [35, 40]. Result show that SRL, by far, was the most common used theory. To operationalise SRL, engagement and other perspectives of learning, LMS trace data was used to collate the number of messages sent, documents uploaded, chat sessions attended, as well as data collected to analyse teacher presence (e.g. 1, 54, 79]. In line with [42], we conclude, that operationalisations relying on LMS data might risk to be superficial and oversimplified interpretations of the underpinning theory. However, some of the reviewed studies also explored relations between trace data, performance and SRL using self- reports [70, 87, 91]. However, results revealed that certain perspectives of learning were more common to explore than others: for ex- ample, the flipped classroom, collaborative learning and conversational aspects of learning. We thus call for innovative perspectives of learning, for example, complex or multi-modal data gathering, longitudal studies and mixed methods approaches. Conclu- sively, the theoretical underpinnings of a research study (including what is meant by BL and the learning perspectives taken), are needed to increase clarity, quality and va- lidity of the objectives and contributions of that study to enable comparison and trans- parency a richer description of the actual blended learning environment approached. RQ 3: We also raised the question, what approaches of data collection, methods and analysis that are evident in the reviewed learning analytics research. One would expect that learning analytics in a blended learning scenario would account for the fact that the context of BL integrates both modalities (physical and digital). However, most of the reviewed studies have used digital traces, dispositional self-reports, relational data and interviews that does not fully cover the gamut of possible data sources of the classroom where a significant amount of learning happens. While predictive models have -in many cases- been able to infer future performance, they have failed short of explaining learn- ing, or offer a guide on how to intervene in the classroom. Of course, collecting data in a blended scenario is not easy, and therefore more research is needed that collects con- textually relevant data, and more importantly, on how to unobtrusively collect data in iJAI ‒ Vol. 2, No. 2, 2020 61 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … the classroom. We also recognise the benefits of visualisation as an intermediate step: albeit we found that visual analytics were rare [102]. Believing that eye-balling the data, the accessibility of instant overview, might support the teacher, we propose that more research on the impact of visual insights offered to stakeholders is needed. We found that blended learning studies did not use classroom data to investigate the complexity of both the online and f2f learning setting. RQ4: We then explored how ethical and legal aspects are considered in the reviewed learning analytics research. In line with previous critique [e.g. 74; 84] we found that although ethical, privacy and legislator requirements exits, the current practises do not always consider these. While results reveal that almost all (99%) of the reviewed studies were conducted prior to the GDPR rules in the EU. 13 articles reported studies from Europe, but only six articles mentioned legal aspects, having informed the students prior to data collection, or considered anonymising the data. This raises critical concerns, as aside the GDPR legislation, ethical considerations need still be adhered to. This may also reflect general slow governmental responses to regulate consequences of the digi- talisation. RQ5: Lastly, we surveyed the contributions of the reviewed articles. The results re- vealed that reviewed articles made several contributions on predicting academic per- formance, identifying learning behaviours, and improving learning environments. With regards to predicting academic performance, machine learning-based predictive models was proven to be effective but with low portability across courses, whereas visualisa- tion-based methods required teacher assistance [2, 77]. Moreover, data variables effec- tiveness on performance prediction is based on course structure; however, social net- work metrics and variables related to LMS engagement, self-regulated learning, collab- orative learning are found significantly correlated with academic performance [86, 90, 102]. In terms of identifying learning behaviours, results show that by utilising student’s participation in online activities and resources access impactful learning behaviours could be identified, and these behaviours are beneficial to cluster or profile students based on adopted behaviours [56]. In the regards of improving learning environments, results show that learning resources that provide student assistance to complete their assignments create positive effects on learning outcome [1, 7]. 6 Conclusion and Future Research As BL currently seems to be a more general concept, detailed descriptions of the actual learning situation, delivery, blend or hybrid solution is needed alongside clear underpinning theories to position the research, or as proposed an indication of whether one is approaching a "simple blend" or complex hybrid learning. We argue that in the current wake of the transforming distance learning, we see hybrid solutions, that raises awareness of a complexity of multiple blended solutions in parallel, that if not de- scribed, could mean just about any kind of learning, delivery or setting. We found that data used in many learning analytics studies were used as a proxy for what happens in the classroom. However, when studies do not include manifestations in the real class- room, they fall short of explaining learning, or offer a guide on how to intervene in the 62 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … classroom. More research is needed that accounts for the context of BL and more im- portantly, on how to unobtrusively collect relevant data that enables the support of learning where it occurs. In the light of our findings of ethical and legal considerations, we strongly argue that while there are no established traditions in LA research in terms of legal requirements; new and rigorous practices need to be developed and applied in current and future LA approaches. Ethical consequences might be devastating and the field urgently needs to acknowledge this lack of consideration. 7 Acknowledgement This research is financed by the Swedish Research Council (VR). 8 References [1] Agnihotri, L., Essa, A., & Baker, R. (2017). Impact of student choice of content adoption delay on course outcomes. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 16–20). ACM. https://doi.org/10.1145/3027385. [2] Akhtar, S., Warburton, S., & Xu, W. (2017). The use of an online learning and teaching system for monitoring computer aided design student participation and predicting student success. International Journal of Technology and Design Education, 27(2), 251–270. https://doi.org/10.1007/s10798-015-9346-8 [3] Aldowah, H., Al-Samarraie, H., & Fauzy, W. M. (2019). Educational data mining and learn- ing analytics for 21st century higher education: A review and synthesis. Telematics and In- formatics, 37(April 2018), 13–49. https://doi.org/10.1016/j.tele.2019.01.007 [4] Alonso-Fernández, C., Calvo-Morata, A., Freire, M., Martínez-Ortiz, I., & Fernández- Manjón, B. (2019). Applications of data science to game learning analytics data: A system- atic literature review. Computers and Education, 141(June), 103612. https://doi.org/10.1016 /j.compedu.2019.103612 [5] Andergassen, M., Mödritscher, F., & Neumann, G. (2014). Practice and Repetition during Exam Preparation in Blended Learning Courses: Correlations with Learning Results. Journal of Learning Analytics, 1(1), 48–74. https://doi.org/10.18608/jla.2014.11.4 [6] Avella, J. T., Kebritchi, M., Nunn, S. G., & Kanai, T. (2016). Learning Analytics in Distance Education: A Systematic Literature Review. Online Learning, 20(2 (October)), 13–29. https://doi.org/10.24059/olj.v20i2.790 [7] Ayub, M., Toba, H., Yong, S., & Wijanto, M. C. (2017). Modelling students’ activities in programming subjects through educational data mining. Global Journal of Engineering Ed- ucation, 19(3), 249–255. https://doi.org/10.1109/icodse.2017.8285881 [8] Baik, E. J., & Reynolds, R. B. (2013). Contribution of wiki trace and wiki resource use variables towards quality of game design in a guided discovery-based program of game de- sign learning. Proceedings of the ASIST Annual Meeting, 50(1). https://doi.org/10. 1002/meet.14505001165 [9] Barata, G., Gama, S., Jorge, J., & Gonçalves, D. (2017). Studying student differentiation in gamified education: A long-term study. Computers in Human Behavior, 71, 550–585. https://doi.org/10.1016/j.chb.2016.08.049 iJAI ‒ Vol. 2, No. 2, 2020 63 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … [10] Bodily, R., Kay, J., Aleven, V., Jivet, I., Davis, D., Xhakaj, F., & Verbert, K. (2018). Open Learner Models and Learning Analytics Dashboards: A Systematic Review in Proceedings of the 8th international conference on learning analytics and knowledge (pp. 41-50). https://doi.org/10.1145/3170358.3170409 [11] Boelens, R., Van Laer, S., De Wever, B., & Elen, J. (2015). Blended learning in adult edu- cation: towards a definition of blended learning. Adult Learners Online! Blended and Online Learning in Adult Education and Training. https://doi.org/10.21125/iceri.2018.1219 [12] Bos, N., & Saskia, B. G. (2016). Student differences in regulation strategies and their use of learning resources: Implications for educational design. In ACM International Conference Proceeding Series (Vol. 25-29-April, pp. 344–353). New York, New York, USA: Associa- tion for Computing Machinery. doi:10.1145/2883851.2883890 [13] Bray, N. J., & Major, C. H. (2012). Status of Journals in the Field of Higher Education. The Journal of Higher Education, 82(4), 479–503. https://doi.org/10.1080/00221546. 2011.11777213 [14] Cerezo, R., Esteban, M., Sánchez-Santillán, M., & Núñez, J. C. (2017). Procrastinating be- havior in computer-based learning environments to predict performance: A case study in Moodle. Frontiers in Psychology, 8, 1–11. https://doi.org/10.3389/fpsyg.2017. [15] Chen, H., Chiang, R. H., & Storey, V. C. (2012). Business intelligence and analytics: From big data to big impact. MIS quarterly, 1165-1188. https://doi.org/10.2307/41703503 [16] Chen, B., Knight, S., & Wise, A. (2018). Critical issues in designing and implementing tem- poral analytics. Journal of Learning Analytics, 5 (1) (2018), p. 9. https://doi.org/10. 18608/jla.2018.51.1 [17] Chen, J., & Foung, D. (2020). A Motivational Story in Hong Kong: Generating Goals for Language Learners and Blended Learning Designers from a Mixed-Method Learning Ana- lytics Approach in English for Academic Purposes. In Technology and the Psychology of Second Language Learners and Users (pp. 491-516). Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-030-34212-8_19 [18] Cheng, G., & Chau, J. (2016). Exploring the relationships between learning styles, online participation, learning achievement and course satisfaction: An empirical study of a blended learning course. British Journal of Educational Technology, 47(2), 257–278. https://doi.org/10.1111/bjet.12243 [19] Chetlur, M., Tamhane, A., Reddy, V. K., Sengupta, B., Jain, M., Sukjunnimit, P., & Wagh, R. (2014). EduPaL: Enabling Blended Learning in Resource Constrained Environments. In Proceedings of the Fifth ACM Symposium on Computing for Development (pp. 73–82). ACM. https://doi.org/10.1145/2674377.2674388 [20] Cicchinelli, A., Veas, E., Pardo, A., Pammer-Schindler, V., Fessl, A., Barreiros, C., & Lind- städt, S. (2018). Finding traces of self-regulated learning in activity streams. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge (pp. 191–200). ACM. https://doi.org/10.1145/3170358.3170381 [21] Clow, D. (2012a). The learning analytics cycle: closing the loop effectively. In Proceedings of the 2nd international conference on learning analytics and knowledge (pp. 134-138). doi.org/10.1145/2330601.2330636 [22] Clow, D. W., Nanus, L., Verdin, K. L., & Schmidt, J. (2012b). Evaluation of SNODAS snow depth and snow water equivalent estimates for the Colorado Rocky Mountains, USA. Hy- drological Processes, 26(17), 2583-2591. https://doi.org/10.1002/hyp.9385 [23] Conijn, R., Van den Beemt, A., & Cuijpers, P. (2018). Predicting student performance in a blended MOOC. Journal of Computer Assisted Learning, 34(5), 615–628. https://doi.org/10.1111/jcal.12270 64 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … [24] Conijn, Rianne, Snijders, C., Kleingeld, A., & Matzat, U. (2017). Predicting student perfor- mance from LMS data: A comparison of 17 blended courses using moodle LMS. IEEE Transactions on Learning Technologies, 10(1), 17–29. https://doi.org/10.1109/ tlt.2016.2616312 [25] Dawson, S., Mirriahi, N., & Gasevic, D. (2015). Importance of theory in learning analytics in formal and workplace settings. Journal of Learning Analytics, 2(2), 1-4. https://doi.org/10.18608/jla.2015.22.1 [26] De Houwer, J., Barnes-Holmes, D., & Moors, A. (2013). What is learning? On the nature and merits of a functional definition of learning. Psychonomic bulletin & review, 20(4), 631- 642. https://doi.org/10.3758/s13423-013-0386-3 [27] De-Marcos, L., Garciá-López, E., Garciá-Cabot, A., Medina-Merodio, J. A., Domínguez, A., Martínez-Herraíz, J. J., & Diez-Folledo, T. (2016). Social network analysis of a gamified e-learning course: Small-world phenomenon and network metrics as predictors of academic performance. Computers in Human Behavior, 60, 312–321. https://doi.org/10.1016/j. chb.2016.02.052 [28] Dimić, G., Predić, B., Rančić, D., Petrović, V., Maček, N., & Spalević, P. (2018). Associa- tion analysis of moodle e-tests in blended learning educational environment. Computer Ap- plications in Engineering Education, 26(3), 417–430. https://doi.org/10.1002/cae. [29] EU 2016/679. (2016). Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC. Brussels: European Commission. https://doi.org/10.5593/sgemsocial2019v/1.1/s02.022 [30] Gamulin, J., Gamulin, O., & Kermek, D. (2016). Using Fourier coefficients in time series analysis for student performance prediction in blended learning environments. Expert Sys- tems, 33(2), 189–200. https://doi.org/10.1111/exsy.12142 [31] Garrison, D. R., & Kanuka, H. (2004). Blended learning: Uncovering its transformative po- tential in higher education. Internet and Higher Education, 7(2), 95–105. https://doi.org/10.1016/j.iheduc.2004.02.001 [32] Gašević, D., Dawson, S., Rogers, T., & Gasevic, D. (2016). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic suc- cess. Internet and Higher Education, 28, 68–84. https://doi.org/10.1016/j.iheduc. 2015.10.002 [33] Gašević, D., Jovanović, J., Pardo, A., & Dawson, S. (2017). Detecting Learning Strategies with Analytics: Links with Self-Reported Measures and Academic Performance. Journal of Learning Analytics, 4(2), 113–128. https://doi.org/10.18608/jla.2017.42.10 [34] Gelan, A., Fastré, G., Verjans, M., Martin, N., Janssenswillen, G., Creemers, M., … Thomas, M. (2018). Affordances and limitations of learning analytics for computer-assisted language learning: a case study of the VITAL project. Computer Assisted Language Learning, 31(3), 294–319. https://doi.org/10.1080/09588221.2017.1418382 [35] Gewerc-Barujel, A., Montero-Mesa, L., & Lama-Penín, M. (2013). Collaboration and Social Networking in Higher Education. Comunicar, 21(42), 55–63. https://doi.org/10. 3916/c42-2014-05 [36] Gong, L., Liu, Y., & Zhao, W. (2018). Using Learning Analytics to Promote Student En- gagement and Achievement in Blended Learning: An Empirical Study. In Proceedings of the 2nd International Conference on E-Education, E-Business and E-Technology. https://doi.org/10.1145/3241748.3241760 iJAI ‒ Vol. 2, No. 2, 2020 65 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … [37] Graham, C. R. (2006). Blended learning systems: Definition, current trends, and future di- rections. In B. Miller (Ed.), The handbook of blended learning (1st ed., Vol. LVII, pp. 3– 21). San Francisco, CA: Pfeiffer. [38] Graham, C.R. (2009) Blended Learning Models. In: Encyclopedia of Information Science and Technology. Hershey, PA: Idea Group Inc., 375-383. [39] Graham, C. R. (2013). Emerging practice and research in blended learning. In M. G. Moore (Ed.), Handbook of Distance Education (3rd ed., pp. 333–350). New York, New York, USA: Routledge. [40] Harrak, F., Bouchet, F., Luengo, V., & Gillois, P. (2018). Profiling students from their ques- tions in a blended learning environment. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge (pp. 102–110). ACM. https://doi.org/10.1145/ 3170358.3170389 [41] Hecking, T., Ziebarth, S., & Hoppe, H. U. (2014). Analysis of Dynamic Resource Access Patterns in Online Courses. Journal of Learning Analytics, 1(3), 34–60. https://doi.org/10.18608/jla.2014.13.4 [42] Henrie, C. R., Bodily, R., Larsen, R., & Graham, C. R. (2018). Exploring the potential of LMS log data as a proxy measure of student engagement. Journal of Computing in Higher Education, 30(2), 344–362. https://doi.org/10.1007/s12528-017-9161-1 [43] Hernández-Nanclares, N., García-Muñiz, A. S., & Rienties, B. (2017). Making the most of “external” group members in blended and online environments. Interactive Learning Envi- ronments, 25(4), 467–481. https://doi.org/10.1080/10494820.2016.1140656 [44] Hill, T., Chidambaram, L., & Summers, J. D. (2017). Playing ‘catch up’ with blended learn- ing: performance impacts of augmenting classroom instruction with online learning. Behav- iour and Information Technology, 36(1), 54–62. https://doi.org/10.1080/0144929x. 2016.1189964 [45] Holstein, K., McLaren, B. M., & Aleven, V. (2017). SPACLE: investigating learning across virtual and physical spaces using spatial replays. Proceedings of the Seventh International Learning Analytics & Knowledge Conference on - LAK ’17, 358–367. https://doi.org/10.1145/3027385.3027450 [46] Hrastinski, S. (2019). What Do We Mean by Blended Learning? TechTrends, 63(5), 564– 569. https://doi.org/10.1007/s11528-019-00375-5 [47] Hui, Y. K., Mai, B., Qian, S., & Kwok, L. F. (2018). Cultivating better learning attitudes: a preliminary longitudinal study. Open Learning, 33(2), 155–170. https://doi.org/10. 1080/02680513.2018.1454830 [48] Johnson, L., Becker, S. A., Cummins, M., Estrada, V., Freeman, A., & Hall, C. (2016). NMC Horizon Report; 2016 Higher Education Edition. Retr. Feb. 13, 2019, from www.sconul.ac.uk/sites/default/files/documents/2016-nmc-horizon-report-he-EN-1.pdf [49] Jovanović, J., Gašević, D., Dawson, S., Pardo, A., & Mirriahi, N. (2017). Learning analytics to unveil learning strategies in a flipped classroom. Internet and Higher Education, 33, 74– 85. https://doi.org/10.1016/j.iheduc.2017.02.001 [50] Khalil, M., & Ebner, M. (2016). De-identification in learning analytics. Journal of Learning Analytics. https://doi.org/10.18608/jla.2016.31.8 [51] Kim, D., Park, Y., Yoon, M., & Jo, I. H. (2016). Toward evidence-based learning analytics: Using proxy variables to improve asynchronous online discussion environments. Internet and Higher Education, 30, 30–43. https://doi.org/10.1016/j.iheduc.2016.03.002 [52] Li, L. Y., & Tsai, C. C. (2017). Accessing online learning material: Quantitative behavior patterns and their effects on motivation and learning performance. Computers and Educa- tion, 114(300), 286–297. https://doi.org/10.1016/j.compedu.2017.07.007 66 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … [53] Lu, O. H. T., Huang, A. Y. Q., Lin, A. J. Q., Ogata, H., & Yang, S. J. H. (2018). Applying Learning Analytics for the Early Prediction of Students’ Academic Performance in Blended Learning. Educational Technology & Society (Vol. 21). [54] Lukarov, V., Verbert, K., & Schroeder, U. (2019). Scaling up learning analytics in blended learning scenarios (Doctoral dissertation, Universitätsbibliothek der RWTH Aachen). [55] Manzanares, M. C. S., Sánchez, R. M., García Osorio, C. I., & Díez-Pastor, J. F. (2017). How do B-learning and learning patterns influence learning outcomes? Frontiers in Psychol- ogy, 8(MAY), 1–13. https://doi.org/10.3389/fpsyg.2017.00745 [56] Marsh, J., & Drexler, P. (2001). How to design effective blended learning. Sunnyvale: Bran- don-Hall. Sunnyvale: California, USA. [57] Matcha, W., Ahmad Uzir, N., Gasevic, D., & Pardo, A. (2019). A Systematic Review of Empirical Studies on Learning Analytics Dashboards: A Self-Regulated Learning Perspec- tive. IEEE Transactions on Learning Technologies, 1382(c), 1–1. https://doi.org/10. 1109/tlt.2019.2916802 [58] McKenzie, W. A., Perini, E., Rohlf, V., Toukhsati, S., Conduit, R., & Sanson, G. (2013). A blended learning lecture delivery model for large and diverse undergraduate cohorts. Com- puters and Education, 64, 116–126. https://doi.org/10.1016/j.compedu.2013.01.009 [59] Melero, J., Hernández-Leo, D., Sun, J., Santos, P., & Blat, J. (2015). How was the activity? A visualization support for a case of location-based learning design. British Journal of Edu- cational Technology, 46(2), 317–329. https://doi.org/10.1111/bjet.12238 [60] Mirriahi, N., Liaqat, D., Dawson, S., & Gašević, D. (2016). Uncovering student learning profiles with a video annotation tool: reflective learning with and without instructional norms. Educational Technology Research and Development, 64(6), 1083–1106. https://doi.org/10.1007/s11423-016-9449-2 [61] Misiejuk, K., & Wasson, B. (2017). State of the Field report on Learning Analytics. SLATE Report 2017-2. [62] Molenaar, I., & Järvelä, S. (2014). Sequential and temporal characteristics of self and so- cially regulated learning. Metacognition and Learning, 9(2), 75–85. https://doi.org/10. 1007/s11409-014-9114-2 [63] Montgomery, A. P., Mousavi, A., Carbonaro, M., Hayward, D. V., & Dunn, W. (2019). Using learning analytics to explore self-regulated learning in flipped blended learning music teacher education. British Journal of Educational Technology, 50(1), 114–127. https://doi.org/10.1111/bjet.12590 [64] Musabirov, I., & Bakhitova, A. (2017). Trajectories of student interaction with learning re- sources in blended learning. In Proceedings of the 17th Koli Calling International Confer- ence on Computing Education Research (pp. 191–192). ACM. https://doi.org/10. 1145/3141880.3141907 [65] Mödritscher, F., Andergassen, M., & Neumann, G. (2013). Dependencies between E-Learn- ing Usage Patterns and Learning Results. In Proceedings of the 13th International Confer- ence on Knowledge Management and Knowledge Technologies (pp. 1–8). ACM. https://doi.org/10.1145/2494188.2494206 [66] Nguyen, Q., Tempelaar, D. T., Rienties, B., & Bas, G. (2016). What learning analytics-based prediction models tell us about feedback preferences of students. Q. Rev. Distance Educa- tion, 17(3), 13–33. [67] Nguyen, V. A., Nguyen, Q. B., & Nguyen, V. T. (2018). A Model to Forecast Learning Outcomes for Students in Blended Learning Courses Based on Learning Analytics, 35–41. https://doi.org/10.1145/3268808.3268827 [68] Nouri, J., Ebner, M., Ifenthaler, D., Saqr, M., Malmberg, J., Khalil, M., … Berthelsen, U. D. (2019). Efforts in Europe for Data-Driven Improvement of Education – A Review of iJAI ‒ Vol. 2, No. 2, 2020 67 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Learning Analytics Research in Seven Countries. International Journal of Learning Analyt- ics and Artificial Intelligence for Education (IJAI), 1(1), 8. https://doi.org/10.3991/ijai.v1i1. [69] Nouri, J., Saqr, M & Fors, U. (2019). Predicting performance of students in a flipped class- room using machine learning: towards automated data-driven formative feedback. Journal of Systemics, Cybernetics and Informatics. 17(2). [70] Pardo, A., Han, F., & Ellis, R. A. (2016). Exploring the relation between self-regulation, online activities, and academic performance, 422–429. doi:10.1145/2883851.2883883 [71] Pardo, A., Han, F., & Ellis, R. A. (2017). Combining University student self-regulated learn- ing indicators and engagement with online learning events to Predict Academic Perfor- mance. IEEE Transactions on Learning Technologies, 10(1), 82–92. https://doi.org/10. 1109/tlt.2016.2639508 [72] Pardo, A., Jovanovic, J., Dawson, S., Gašević, D., & Mirriahi, N. (2019). Using learning analytics to scale the provision of personalised feedback. British Journal of Educational Technology, 50(1), 128–138. https://doi.org/10.1111/bjet.12592 [73] Pardo, A., Mirriahi, N., Dawson, S., Zhao, Y., Zhao, A., & Gašević, D. (2015). Identifying learning strategies associated with active use of video annotation software, 255–259. https://doi.org/10.1145/2723576.2723611 [74] Pardo, A., & Siemens, G. (2014). Ethical and privacy principles for learning analytics. Brit- ish Journal of Educational Technology, 45(3), 438–450. https://doi.org/10.1111/bjet. [75] Park, Y., Yu, J. H., & Jo, I. H. (2016). Clustering blended learning courses by online behav- ior data case study in a Korean higher education institute. Internet and Higher Education, 29, 1–11. https://doi.org/10.1016/j.iheduc.2015.11.001 [76] Paskevicius, M., & Bortolin, K. (2016). Blending our practice: using online and face-to-face methods to sustain community among faculty in an extended length professional develop- ment program. Innovations in Education and Teaching International, 53(6), 605–615. https://doi.org/10.1080/14703297.2015.1095646 [77] Predić, B., Dimić, G., Rančić, D., Štrbac, P., Maček, N., & Spalević, P. (2018). Improving final grade prediction accuracy in blended learning environment using voting ensembles. Computer Applications in Engineering Education, 26(6), 2294–2306. https://doi.org/10. 1002/cae.22042 [78] Reeve, J. (2012). A Self-determination Theory Perspective on Student Engagement. In S. Christenson, A. Reschly, & C. Wylie (Eds.), Handbook of Research on Student Engagement (pp. 149–172). Boston, MA: Springer US. https://doi.org/10.1007/978-1-4614-2018-7_7 [79] Rubio, F., Thomas, J. M., & Li, Q. (2018). The role of teaching presence and student partic- ipation in Spanish blended courses. Computer Assisted Language Learning, 31(3), 226–250. https://doi.org/10.1080/09588221.2017.1372481 [80] Saqr, M., Fors, U., & Tedre, M. (2017). How learning analytics can early predict under- achieving students in a blended medical education course. Medical Teacher, 39(7), 757–767. https://doi.org/10.1080/0142159x.2017.1309376 [81] Saqr, M., Fors, U., & Tedre, M. (2018). How the study of online collaborative learning can guide teachers and predict students’ performance in a medical course. BMC Medical Edu- cation, 18(1), 1–14. https://doi.org/10.1186/s12909-018-1126-1 [82] Scholes, V. (2016). The ethics of using learning analytics to categorize students on risk. Educational Technology Research and Development, 64(5), 939–955. https://doi.org/10. 1007/s11423-016-9458-1 [83] Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behav- ioral Scientist, 57(10), 1380-1400. https://doi.org/10.1177/0002764213498851 68 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … [84] Slade, S., & Prinsloo, P. (2013). Learning Analytics. American Behavioral Scientist, 57(10), 1510–1529. https://doi.org/10.1177/0002764213479366 [85] Snodgrass Rangel, V., Bell, E. R., Monroy, C., & Whitaker, J. R. (2015). Toward a New Approach to the Evaluation of a Digital Curriculum Using Learning Analytics. Journal of Research on Technology in Education, 47(2), 89–104. https://doi.org/10.1080/ 15391523.2015.999639 [86] Tempelaar, D. (2017). How Dispositional Learning Analytics Helps Understanding the Worked-Example Principle. International Association for Development of the Information Society, (Celda), 14. [87] Tempelaar, D., Rienties, B., Mittelmeier, J., & Nguyen, Q. (2018a). Student profiling in a dispositional learning analytics application using formative assessment. Computers in Hu- man Behavior, 78, 408–420. https://doi.org/10.1016/j.chb.2017.08.010 [88] Tempelaar, D., Rienties, B., & Nguyen, Q. (2018b). A multi-modal study into students’ tim- ing and learning regulation: time is ticking. Interactive Technology and Smart Education, 15(4), 298–313. https://doi.org/10.1108/itse-02-2018-0015 [89] Tempelaar, D., Rienties, B., & Nguyen, Q. (2018c). Investigating learning strategies in a dispositional learning analytics context, 201–205. https://doi.org/10.1145/3170358. [90] Tempelaar, D., Rienties, B., & Giesbers, B. (2015). In search for the most informative data for feedback generation: Learning analytics in a data-rich context. Computers in Human Behavior, 47, 157–167. https://doi.org/10.1016/j.chb.2014.05.038 [91] Tempelaar, D., Rienties, B., Nguyen, Q., Macan, T., & Hoffmacan, T. (2017). Towards Ac- tionable Learning Analytics Using Dispositions, 79(1), 381–391. doi:10.1037/0021- 9010.79.3.381 [92] Tempelaar, D. (2020). Supporting the less-adaptive student: the role of learning analytics, formative assessment and blended learning. Assessment & Evaluation in Higher Education, 45(4), 579-593. https://doi.org/10.1080/02602938.2019.1677855 [93] van Goidsenhoven, S., Bogdanova, D., Deeva, G., Broucke, S. V., De Weerdt, J., & Snoeck, M. (2020, March). Predicting student success in a blended learning environment. In Pro- ceedings of the Tenth International Conference on Learning Analytics & Knowledge (pp. 17-25). https://doi.org/10.1145/3375462.3375494 [94] Van Laer, S., & Elen, J. (2018). Adults’ Self-Regulatory Behaviour Profiles in Blended Learning Environments and Their Implications for Design. Technology, Knowledge and Learning, 1–31. https://doi.org/10.1007/s10758-017-9351-y [95] Van Leeuwen, A. (2019). Teachers’ perceptions of the usability of learning analytics reports in a flipped university course: when and how does information become actionable knowledge? Educational Technology Research and Development, 67(5), 1043-1064. https://doi.org/10.1007/s11423-018-09639-y [96] Vieira, C., Parsons, P., & Byrd, V. (2018). Visual learning analytics of educational data: A systematic literature review and research agenda. Computers and Education, 122(March), 119–135. https://doi.org/10.1016/j.compedu.2018.03.018 [97] Villamañe, M., Álvarez, A., Larrañaga, M., Hernández-Rivas, O., & Caballero, J. (2018). Using Visualizations to Improve Assessment in Blended Learning Environments, 165–169. https://doi.org/10.1145/3284179.3284209 [98] Winne, P. H., & Hadwin, A. F. (1998). Studying as self-regulated learning. In Metacognition in educational theory and practice. [99] Winne, P. H., & Perry, N. E. (2000). Measuring self-regulated learning. In Handbook of self- regulation (pp. 531-566). Academic Press. https://doi.org/10.1016/b978-012109890- 2/50045-7 iJAI ‒ Vol. 2, No. 2, 2020 69 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … [100] Wise AF, Shaffer DW. 2015. Why theory matters more than ever in the age of big data. JLA. 2:5–13. [101] Whitelock-Wainwright, A., Tsai, Y. S., Lyons, K., Kaliff, S., Bryant, M., Ryan, K., & Gašević, D. (2020, March). Disciplinary differences in blended learning design: a network analytic study. In Proceedings of the Tenth International Conference on Learning Analytics & Knowledge (pp. 579-588). https://doi.org/10.1145/3375462.3375488 [102] Zacharis, N. Z. (2015). A multivariate approach to predicting student outcomes in web-ena- bled blended learning courses. Internet and Higher Education, 27, 44–53. https://doi.org/10.1016/j.iheduc.2015.05.002 9 Authors Nina Bergdahl is a researcher in didactics and digitalisation at the Department of Adult and Upper Secondary Education, Malmoe, Sweden and affiliated with the De- partment of Computer and Systems Sciences (DSV) Stockholm university, Sweden. Email: ninabe@dsv.su.se Jalal Nouri is an associate professor at the Department of Computer and Systems Sciences (DSV) Stockholm university, Sweden. Thashmee Karunaratne is an associate professor at the Department of Computer and Systems Sciences (DSV) Stockholm university, Sweden Muhammad Afzaal is a PhD student at the Department of Computer and Systems Sciences (DSV) Stockholm university, Sweden. Mohammad Saqr is a postdoc at KTH Royal Institute of Technology, School of Electrical Engineering and Computer Science, Stockholm, Sweden. Article submitted 2020-08-19. Resubmitted 2020-09-16. Final acceptance 2020-09-16. Final version pub- lished as submitted by the authors. 70 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … 10 Appendix A Obser- Dis- Inter- SN Sur- Class- Perfor- S Study LMS Trace MML vation/ cours views A vey room mance video e Agnihotri, L., Essa, A., & Baker, R. (2017). Impact of student choice of con- tent adoption delay on course outcomes. In Proceedings of the Seventh Interna- 1 Y Y N N N N N N N Y tional Learning Analytics & Knowledge Conference (pp. 16–20). ACM. https://doi.org/10.1145/3027385.302743 Akhtar, S., Warburton, S., & Xu, W. (2017). The use of an online learning and teaching system for monitoring computer aided design student participa- 2 tion and predicting student success. In- Y Y Y N Y N Y N Y Y ternational Journal of Technology and Design Education, 27(2), 251–270. https://doi.org/10.1007/s10798-015- 9346-8 Andergassen, M., & Mödritscher, F. (2014). Practice and repetition during 3 exam preparation in blended learning M Y N N N N N D N Y courses: Correlations with learning re- sults. Of Learning Analytics, 1, 48–74. Ayub, M., Toba, H., Yong, S., & Wi- janto, M. C. (2017). Modelling students’ 4 activities in programming subjects Y Y N N N N N N N Y through educational data mining. Global Journal of Engineering Education, 19(3) Baik, E. J., & Reynolds, R. B. (2013). Contribution of wiki trace and wiki re- source use variables towards quality of game design in a guided discovery- 5 based program of game design learning. Wiki Y N N N N N Y N Y Proceedings of the ASIST Annual Meet- ing, 50(1). https://doi.org/10.1002/meet.145050011 Barata, G., Gama, S., Jorge, J., & Gon- çalves, D. (2017). Studying student dif- ferentiation in gamified education: A 6 long-term study. Computers in Human Y Y N N N N N D N Y Behavior, 71, 550–585. https://doi.org/10.1016/j.chb.2016.08.04 Bos, N. (n.d.). Student differences in regulation strategies and their use of learning resources: implications for edu- 7 Y Y N N N N N D Y Y cational design. Learning Analytics and Knowledge (LAK’16), 27-29 April 2016, 344–353. Cheng, G., & Chau, J. (2016). Exploring the relationships between learning styles, online participation, learning achievement and course satisfaction: An 8 Y Y N N N N N D N Y empirical study of a blended learning course. British Journal of Educational Technology, 47(2), 257–278. https://doi.org/10.1111/bjet.12243 iJAI ‒ Vol. 2, No. 2, 2020 71 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Chen, J., & Foung, D. (2020). A Moti- vational Story in Hong Kong: Generat- ing Goals for Language Learners and Blended Learning Designers from a Mixed-Method Learning Analytics Ap- 9 Y Y N N N N N D N Y proach in English for Academic Pur- poses. In Technology and the Psychol- ogy of Second Language Learners and Users (pp. 491-516). Palgrave Macmil- lan, Cham. Chetlur, M., Tamhane, A., Reddy, V. K., Sengupta, B., Jain, M., Sukjunnimit, P., & Wagh, R. (2014). EduPaL: Ena- bling Blended Learning in Resource Constrained Environments. In Proceed- 10 M Y N N N N N D N Y ings of the Fifth ACM Symposium on Computing for Development (pp. 73– 82). ACM. https://doi.org/10.1145/2674377.267438 Cicchinelli, A., Veas, E., Pardo, A., Pammer-Schindler, V., Fessl, A., Bar- reiros, C., & Lindstädt, S. (2018). Find- ing traces of self-regulated learning in Cus- activity streams. In Proceedings of the 11 tom Y N N N N N Y N Y 8th International Conference on Learn- LMS ing Analytics and Knowledge (pp. 191– 200). ACM. https://doi.org/10.1145/3170358.317038 Conijn, R., Snijders, C., Kleingeld, A., & Matzat, U. (2017). Predicting student performance from LMS data. IEEE 12 Transactions on Learning Technologies, Y Y N N N N N N N Y 10(1), 17–29. https://doi.org/10.1109/TLT.2016.26163 Conijn, R., Van den Beemt, A., & Cuijpers, P. (2018). Predicting student performance in a blended MOOC. Jour- 13 M Y N N N N N N N Y nal of Computer Assisted Learning, 34(5), 615–628. https://doi.org/10.1111/jcal.12270 Dimić, G., Predić, B., Rančić, D., Pe- trović, V., Maček, N., & Spalević, P. (2018). Association analysis of moodle 14 e-tests in blended learning educational Y Y N N N N N N N Y environment. Computer Applications in Engineering Education, 26(3), 417–430. https://doi.org/10.1002/cae.21894 De-Marcos, L., Garciá-López, E., Garciá-Cabot, A., Medina-Merodio, J. A., Domínguez, A., Martínez-Herraíz, J. J., & Diez-Folledo, T. (2016). Social network analysis of a gamified e-learn- 15 ing course: Small-world phenomenon M N N N N N Y N N Y and network metrics as predictors of ac- ademic performance. Computers in Hu- man Behavior, 60, 312–321. https://doi.org/10.1016/j.chb.2016.02.05 Dobashi, K. (2016). Development and 16 Trial of Excel Macros for Time Series Y Y N N N N N N N N Cross Section Monitoring of Student 72 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Engagement: Analyzing Students’ Page Views of Course Materials. Procedia Computer Science, 96, 1086–1095. https://doi.org/10.1016/j.procs.2016.08. Gamulin, J., Gamulin, O., & Kermek, D. (2016). Using Fourier coefficients in time series analysis for student perfor- 17 mance prediction in blended learning Y Y N N N N N N N Y environments. Expert Systems, 33(2), 189–200. https://doi.org/10.1111/exsy.12142 Gašević, D., Dawson, S., Rogers, T., & Gasevic, D. (2016). Learning analytics should not promote one size fits all: The effects of instructional conditions in pre- 18 Y Y N N N N N N N Y dicting academic success. Internet and Higher Education, 28, 68–84. https://doi.org/10.1016/j.iheduc.2015.10 Gašević, D., Jovanović, J., Pardo, A., & Dawson, S. (2017). Detecting Learning Strategies with Analytics: Links with 19 Self-Reported Measures and Academic Y Y N N N N N D N Y Performance. Journal of Learning Ana- lytics, 4(2), 113–128. https://doi.org/10.18608/jla.2017.42.10 Gelan, A., Fastré, G., Verjans, M., Mar- tin, N., Janssenswillen, G., Creemers, M., … Thomas, M. (2018). Affordances and limitations of learning analytics for computer-assisted language learning: a 20 Y Y N N N N N N N Y case study of the VITAL project. Com- puter Assisted Language Learning, 31(3), 294–319. https://doi.org/10.1080/09588221.2017. Gewerc-Barujel, A., Montero-Mesa, L., & Lama-Penín, M. (2013). Collabora- 21 tion and Social Networking in Higher Y Y N N N N N N N N Education. Comunicar, 21(42), 55–63. https://doi.org/10.3916/c42-2014-05 Gong, L., Liu, Y., & Zhao, W. (2018). Using Learning Analytics to Promote Student Engagement and Achievement in Blended Learning: An Empirical 22 Study. In Proceedings of the 2nd Inter- Y N N N N N N D N Y national Conference on E-Education, E- Business and E-Technology. https://doi.org/10.1145/3241748.324176 Harrak, F., Bouchet, F., Luengo, V., & Gillois, P. (2018). Profiling students from their questions in a blended learn- ing environment. In Proceedings of the 23 8th International Conference on Learn- Y N N N N N N N Y N ing Analytics and Knowledge (pp. 102– 110). ACM. https://doi.org/10.1145/3170358.317038 Hecking, T., Ziebarth, S., & Hoppe, H. U. (2014). Analysis of Dynamic Re- 24 Y Y N N N N Y N N N source Access Patterns in Online Courses. Journal of Learning Analytics, iJAI ‒ Vol. 2, No. 2, 2020 73 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … 1(3), 34–60. https://doi.org/10.18608/jla.2014.13.4 Hernández-Nanclares, N., García-Mu- ñiz, A. S., & Rienties, B. (2017). Mak- ing the most of “external” group mem- bers in blended and online environ- 25 Y N N N N N Y N N N ments. Interactive Learning Environ- ments, 25(4), 467–481. https://doi.org/10.1080/10494820.2016. Hill, T., Chidambaram, L., & Summers, J. D. (2017). Playing ‘catch up’ with blended learning: performance impacts 26 Y Y N N N N N N N Y of augmenting...: Discovery Service for De La Salle Univ. Behaviour & Infor- mation Technology, 36(1), 54–62. Holstein, K., McLaren, B. M., & Ale- ven, V. (2017). SPACLE: investigating learning across virtual and physical spaces using spatial replays. Proceed- 27 ings of the Seventh International Learn- Y Y Y Y N N N D Y Y ing Analytics & Knowledge Conference on - LAK ’17, 358–367. https://doi.org/10.1145/3027385.302745 Hui, Y. K., Mai, B., Qian, S., & Kwok, L. F. (2018). Cultivating better learning attitudes: a preliminary longitudinal 28 Y Y N N Y N Y N N Y study. Open Learning, 33(2), 155–170. https://doi.org/10.1080/02680513.2018. Jovanović, J., Gašević, D., Dawson, S., Pardo, A., & Mirriahi, N. (2017). Learn- ing analytics to unveil learning strate- 29 gies in a flipped classroom. Internet and Y Y N N N N N N N Y Higher Education, 33, 74–85. https://doi.org/10.1016/j.iheduc.2017.02 Kim, D., Park, Y., Yoon, M., & Jo, I. H. (2016). Toward evidence-based learning analytics: Using proxy variables to im- prove asynchronous online discussion 30 Y Y N N N Y N N Y environments. Internet and Higher Edu- cation, 30, 30–43. https://doi.org/10.1016/j.iheduc.2016.03 Kovanović, V., Gašević, D., Dawson, S., Joksimović, S., Baker, R. S., & Hatala, M. (2015). Does time-on-task estimation matter? Implications for the 31 Y Y N N N N N N N Y validity of learning analytics findings. Journal of Learning Analytics, 2(3), 81– https://doi.org/10.18608/jla.2015.23.6 Li, L. Y., & Tsai, C. C. (2017). Access- ing online learning material: Quantita- tive behavior patterns and their effects on motivation and learning performance. 32 Y Y N N N N N D N Y Computers and Education, 114(300), 286–297. https://doi.org/10.1016/j.compedu.2017. 07.007 Lu, O. H. T., Huang, A. Y. Q., Huang, J. 33 Y Y N N N N N N N Y C. H., Lin, A. J. Q., Ogata, H., & Yang, 74 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … S. J. H. (2018). Applying learning ana- lytics for the early prediction of stu- dents’ academic performance in blended learning. Educational Technology and Society, 21(2), 220–232. Lukarov, V., Verbert, K., & Schroeder, U. (2019). Scaling up learning analytics 34 in blended learning scenarios (Doctoral Y Y N N N N N N N Y dissertation, Universitätsbibliothek der RWTH Aachen). Manzanares, M. C. S., Sánchez, R. M., García Osorio, C. I., & Díez-Pastor, J. F. (2017). How do B-learning and learning patterns influence learning outcomes? 35 Y Y N N N N N D N Y Frontiers in Psychology, 8(MAY), 1– https://doi.org/10.3389/fpsyg.2017.0074 McKenzie, W. A., Perini, E., Rohlf, V., Toukhsati, S., Conduit, R., & Sanson, G. (2013). A blended learning lecture de- livery model for large and diverse un- 36 M Y N N N N N D N Y dergraduate cohorts. Computers and Ed- ucation, 64, 116–126. https://doi.org/10.1016/j.compedu.2013. 01.009 Melero, J., Hernández-Leo, D., Sun, J., Santos, P., & Blat, J. (2015). How was the activity? A visualization support for Gam 37 a case of location-based learning design. Y N N Y N N Y N Y British Journal of Educational Technol- ogy, 46(2), 317–329. https://doi.org/10.1111/bjet.12238 Mirriahi, N., Liaqat, D., Dawson, S., & Gašević, D. (2016). Uncovering student learning profiles with a video annotation Vide tool: reflective learning with and with- 38 out instructional norms. Educational Y N N N N N N N Y soft- Technology Research and Development, ware 64(6), 1083–1106. https://doi.org/10.1007/s11423-016- 9449-2 Mödritscher, F., Andergassen, M., & Neumann, G. (2013). Dependencies be- tween E-Learning Usage Patterns and Learning Results. In Proceedings of the 13th International Conference on 39 M N N N N N N N N Y Knowledge Management and Knowledge Technologies (pp. 1–8). ACM. https://doi.org/10.1145/2494188.249420 Montgomery, A. P., Mousavi, A., Car- bonaro, M., Hayward, D. V., & Dunn, W. (2019). Using learning analytics to explore self-regulated learning in 40 M Y N N N N N N N Y flipped blended learning music teacher education. British Journal of Educa- tional Technology, 50(1), 114–127. https://doi.org/10.1111/bjet.12590 Musabirov, I., & Bakhitova, A. (2017). Trajectories of student interaction with 41 Y Y N N N N N N N Y learning resources in blended learning. In Proceedings of the 17th Koli Calling iJAI ‒ Vol. 2, No. 2, 2020 75 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … International Conference on Computing Education Research (pp. 191–192). ACM. https://doi.org/10.1145/3141880.314190 Nespereira, C. G., Dai, K., Redondo, R. P. D., & Vilas, A. F. (2014). Is the LMS access frequency a sign of students’ suc- cess in face-to-face higher education? In Proceedings of the Second International 42 Conference on Technological Ecosys- Y Y N N N N N N N Y tems for Enhancing Multiculturality - TEEM ’14 (s. 283–290). Salamanca, Spain: ACM Press. https://doi.org/10.1145/2669711.266991 Nguyen, Q., Rienties, B., Tempelaar, D. T., & Giesbers, B. (2016). What learn- ing analytics-based prediction models 43 Y Y N N N N N D N Y tells us about feedback preferences of students. Quarterly Review of Distance Education, 17(3), 13–33. Nguyen, V. A., Nguyen, Q. B., & Nguyen, V. T. (2018). A Model to Fore- cast Learning Outcomes for Students in Blended Learning Courses Based On Learning Analytics. In Proceedings of 44 the 2nd International Conference on E- Y Y N N N N N N N Y Society, E-Education and E-Technology - ICSET 2018 (s. 35–41). Taipei, Tai- wan: ACM Press. https://doi.org/10.1145/3268808.326882 Nouri, J., Saqr, M & Fors, U. (2019). Predicting performance of students in a flipped classroom using machine learn- 45 Y Y N N N N N D N Y ing: towards automated data-driven formative feedback. Journal of System- ics, Cybernetics and Informatics. 17(2). Pardo, A., Han, F., & Ellis, R. A. (2016). Exploring the relation between self-regulation, online activities, and ac- ademic performance: a case study. In Proceedings of the Sixth International 46 Y Y N N N N N D N Y Conference on Learning Analytics & Knowledge - LAK ’16 (s. 422–429). Ed- inburgh, United Kingdom: ACM Press. https://doi.org/10.1145/2883851.288388 Pardo, A., Han, F., & Ellis, R. A. (2017). Combining University Student Self-Regulated Learning Indicators and 47 Engagement with Online Learning Y N N N N N N D N Y Events to Predict Academic Perfor- mance. IEEE Transactions on Learning Technologies, 10(1), 82–92. Pardo, A., Jovanovic, J., Dawson, S., Gašević, D., & Mirriahi, N. (2019). Us- ing learning analytics to scale the provi- 48 Y Y N N N N N D N Y sion of personalised feedback. British Journal of Educational Technology, 50(1), 128–138. Pardo, A., Mirriahi, N., Dawson, S., Vide 49 Y N N N N N D N Y Zhao, Y., Zhao, A., & Gašević, D. o 76 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … (2015). Identifying learning strategies soft- associated with active use of video an- ware notation software. In Proceedings of the Fifth International Conference on Learning Analytics And Knowledge - LAK ’15 (s. 255–259). Poughkeepsie, New York: ACM Press. https://doi.org/10.1145/2723576.272361 Park, Y., Yu, J. H., & Jo, I.-H. (2016). Clustering blended learning courses by online behavior data case study in a Ko- 50 rean higher education institute. Internet Y Y N N N N N N N N and Higher Education, 29, 1–11. https://doi.org/10.1016/j.iheduc.2015.11 Paskevicius, M., & Bortolin, K. (2016). Blending our practice: using online and face-to-face methods to sustain commu- 51 nity among faculty in an extended M Y N N N Y Y D N N length professional development pro- gram. Innovations in Education & Teaching International, 53(6), 605–615. Predić, B. ( 1 ), Rančić, D. ( 1 ), Dimić, G. ( 2 ), Štrbac, P. ( 2 ), Maček, N. ( 2 ), & Spalević, P. ( 3 ). (2018). Improving final grade prediction accuracy in 52 Y Y N N N N N N N Y blended learning environment using vot- ing ensembles. Computer Applications in Engineering Education, 26(6), 2294– 2306. https://doi.org/10.1002/cae.22042 Snodgrass Rangel, V., Bell, E. R., Mon- roy, C., & Whitaker, J. R. (2015). To- ward a New Approach to the Evaluation of a Digital Curriculum Using Learning 53 Y Y N Y Y N N D Y Y Analytics. Journal of Research on Tech- nology in Education, 47(2), 89–104. https://doi.org/10.1080/15391523.2015. Cerezo, R., Esteban, M., Sánchez-San- tillán, M., & Núñez, J. C. (2017). Pro- crastinating behavior in computer-based 54 learning environments to predict perfor- Y Y N N N N N N N Y mance: A case study in Moodle. Fron- tiers in Psychology, 8, 1–11. doi:10.3389/fpsyg.2017.01403 Rubio, F., Thomas, J. M., & Li, Q. (2018). The role of teaching presence and student participation in Spanish 55 blended courses. Computer Assisted Y Y N Y N Y N N Y Y Language Learning, 31(3), 226–250. https://doi.org/10.1080/09588221.2017. Saqr, M., Fors, U., & Tedre, M. (2018). How the study of online collaborative learning can guide teachers and predict 56 students’ performance in a medical Y Y N N N N Y N N Y course. BMC Medical Education, 18(1). https://doi.org/10.1186/s12909-018- 1126-1 Saqr, Mohammed, Fors, U., & Tedre, 57 M. (2017). How learning analytics can Y Y N N N N N N N Y early predict under-achieving students in iJAI ‒ Vol. 2, No. 2, 2020 77 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … a blended medical education course. Medical Teacher, 39(7), 757–767. Tempelaar, D. ( 1 ), Rienties, B. ( 2 ), & Nguyen, Q. ( 2 ). (2018). A multi-modal study into students’ timing and learning 58 regulation: time is ticking. Interactive Y Y N N N N N D N Y Technology and Smart Education, 15(4), 298–313. https://doi.org/10.1108/ITSE- 02-2018-0015 Tempelaar, D. T., Rienties, B., & Gies- bers, B. (2015). In search for the most informative data for feedback genera- 59 M Y N N N N N N N Y tion: Learning analytics in a data-rich context. Computers in Human Behavior, 47, 157–167. Tempelaar, D. T., Rienties, B., & Nguyen, Q. (2017). Towards Actionable 60 Learning Analytics Using Dispositions. M Y N N N N N D N Y IEEE Transactions on Learning Tech- nologies, 10(1), 6–16. Tempelaar, Dirk. (2017). How Disposi- tional Learning Analytics Helps Under- 61 standing the Worked-Example Principle. Y Y N N N N N D N Y International Association for Develop- ment of the Information Society, 14. Tempelaar, Dirk, Rienties, B., Mittel- meier, J., & Nguyen, Q. (2018). Student profiling in a dispositional learning ana- 62 M Y N N N N N D N Y lytics application using formative as- sessment. Computers in Human Behav- ior, 78, 408–420. Tempelaar, Dirk, Rienties, B., & Ngu- yen, Q. (2018). Investigating learning strategies in a dispositional learning ana- lytics context: the case of worked exam- ples. In Proceedings of the 8th Interna- 63 tional Conference on Learning Analytics M Y N N N N N D N Y and Knowledge - LAK ’18 (s. 201–205). Sydney, New South Wales, Australia: ACM Press. https://doi.org/10.1145/3170358.317038 Tempelaar, D. (2020). Supporting the less-adaptive student: the role of learn- ing analytics, formative assessment and 64 M Y N N N N N N N Y blended learning. Assessment & Evalua- tion in Higher Education, 45(4), 579- Van Goidsenhoven, S., Bogdanova, D., Deeva, G., Broucke, S. V., De Weerdt, J., & Snoeck, M. (2020, March). Pre- dicting student success in a blended 65 Y Y N N N N Y N N Y learning environment. In Proceedings of the Tenth International Conference on Learning Analytics & Knowledge (pp. 17-25). van Laer, S., & Elen, J. (2018). Adults’ Self-Regulatory Behaviour Profiles in Blended Learning Environments and 66 Their Implications for Design. Technol- M Y N N N N N D N Y ogy, Knowledge and Learning, 1–31. https://doi.org/10.1007/s10758-017- 9351-y 78 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Van Leeuwen, A. (2019). Teachers’ per- ceptions of the usability of learning ana- lytics reports in a flipped university 67 course: when and how does information M N N N N N N N N Y become actionable knowledge? Educa- tional Technology Research and Devel- opment, 67(5), 1043-1064. Villamañe, M., Álvarez, A., Larrañaga, M., Hernández-Rivas, O., & Caballero, J. (2018). Using Visualizations to Im- prove Assessment in Blended Learning Environments. In Proceedings of the 68 Sixth International Conference on Tech- M Y N N Y N N N N Y nological Ecosystems for Enhancing Multiculturality - TEEM’18 (s. 165– 169). Salamanca, Spain: ACM Press. https://doi.org/10.1145/3284179.328420 Whitelock-Wainwright, A., Tsai, Y. S., Lyons, K., Kaliff, S., Bryant, M., Ryan, K., & Gašević, D. (2020, March). Disci- plinary differences in blended learning 69 M Y N N N N N D N Y design: a network analytic study. In Pro- ceedings of the Tenth International Con- ference on Learning Analytics & Knowledge (pp. 579-588) Zacharis, N. Z. (2015). A multivariate approach to predicting student outcomes in web-enabled blended learning 70 courses. Internet and Higher Education, Y Y N N N N N N N Y 27, 44–53. https://doi.org/10.1016/j.iheduc.2015.05 iJAI ‒ Vol. 2, No. 2, 2020 79 http://www.deepdyve.com/assets/images/DeepDyve-Logo-lg.png International Journal of Learning Analytics and Artificial Intelligence for Education (iJAI) Unpaywall

Learning Analytics for Blended Learning: A Systematic Review of Theory, Methodology, and Ethical Considerations

International Journal of Learning Analytics and Artificial Intelligence for Education (iJAI)Oct 29, 2020

Loading next page...
 
/lp/unpaywall/learning-analytics-for-blended-learning-a-systematic-review-of-theory-0NNO51Rsif

References

References for this paper are not available at this time. We will be adding them shortly, thank you for your patience.

Publisher
Unpaywall
ISSN
2706-7564
DOI
10.3991/ijai.v2i2.17887
Publisher site
See Article on Publisher Site

Abstract

Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, and Ethical Considerations https://doi.org/10.3991/ijai.v2i2.17887 ( ) Nina Bergdahl , Jalal Nouri, Thashmee Karunaratne, Muhammad Afzaal Stockholm University, Stockholm, Sweden ninabe@dsv.su.se Mohammad Saqr KTH Royal Institute of Technology, Stockholm, Sweden Abstract—Learning Analytics (LA) approaches in Blended Learning (BL) research is becoming an established field. In the light of previous critique toward LA for not being grounded in theory, the General Data Protection and a renewed focus on individuals’ integrity, this review aims to explore the use of theories, the methodological and analytic approaches in educational settings, along with sur- veying ethical and legal considerations. The review also maps and explores the outcomes and discusses the pitfalls and potentials currently seen in the field. Jour- nal articles and conference papers were identified through systematic search across relevant databases. 70 papers met the inclusion criteria: they applied LA within a BL setting, were peer-reviewed, full-papers, and were written in English. The results reveal that the use of theoretical and methodological approaches were disperse. We identified approaches of BL not included in categories of BL in existing BL literature and suggest these may be referred to as complex hybrid learning, and that ethical considerations and legal requirements often have been overlooked. We highlight critical issues that contribute to raise awareness and inform alignment for future research to ameliorate diffuse applications within the field of LA. Keywords—Literature review; learning analytics, blended learning, complex hybrid learning 1 Introduction 1.1 The emergence of learning analytics Given the wealth and complexity of learning, learning sciences have become an in- terdisciplinarity domain that includes cognitive, educational, social and computer sci- ence among others. Ten years ago, learning analytics (LA) emerged as a multidiscipli- nary research domain with the overarching premise to harness the power of data and analytics to advance our understanding of learning as well as help improve learning, 46 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … teaching and optimize the learning environments [26]. As such, LA is defined as “the measurement, collection, analysis and reporting of data about learners and their con- texts for purposes of understanding and optimizing learning and the environments in which it occurs” [83]. Interest in LA was catalysed by three main factors; firstly: the rewarding business intelligence and big-data success stories that has contributed to in- dustry growth and added competitive advantage to companies by allowing them to bet- ter understand customers and offer better recommendations (ibid.). Secondly, the avail- ability of immense volumes of digital traces and clickstream data recorded by learning management systems and other digital learning environments, such as student’s infor- mation systems, online library platforms and video streaming services. Third, the rev- olutionary developments in data science methods and improved computer hardware that became more powerful and accessible [15, 21, 83]. Inspired by the industry, LA has initially used digital trace data to create predictive models to for example, forecast drop- outs, identify students at-risk of failing, or offer visual dashboards. However, criticism had been levied at these models for failing to account for contextual factors, being athe- oretical and difficult to replicate [25, 100]. Research has since extended both data col- lection, analysis methods and approaches to theory in LA research. Recently, data col- lection methods have grown in volume and diversity to cover the full breadth of learn- ers’ activities e.g., classroom interactions, physiological indices, proximity data, eye tracking, self-reports in addition to the commonly used digital traces. Similarly, meth- ods have grown to include sequence and process mining, epistemic network analysis as well as advanced machine learning algorithms [16, 20, 62]. Today, when it is common that students’ educational reality is an integration of physical and virtual learning environments, often referred to as a Blended Learning (BL) [37-39], the data collection methods are often broadened to survey both the phys- ical and the virtual spaces. Applying such data collection and theoretically aligned re- fined models, researchers hope to capture learners’ behaviour where it occurs. The study of LA and BL is a growing field of inquiry [48], which is garnering a growing attention in and outside the LA community. The rationale for undertaking this review is that LA research has been criticised for not being grounded sufficiently in theory. For example, that perspectives of learning are lacking [57], and that BL research have re- mained vague and unclear, why there has been a call for research to further develop definitions of BL, models and conceptualisations [46]. Moreover, as LA is becoming an established research field of its own, it is interesting to explore the methodological practices applied across LA research and with the fast-paced development of big data analytics on individual trace data and the implementation of the General Data Protec- tion Regulations of the European Union (GDPR), valid concerns might be raised with regards to if and how ethical and legal considerations have been applied. Accordingly, a systematic review that identify theoretical underpinnings, methodological practices, considerations of ethical aspects and legal requirements and the contributions in LA research is needed to raise awareness and inform alignment for future research. iJAI ‒ Vol. 2, No. 2, 2020 47 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … 2 Background 2.1 Learning analytics in blended learning Blended Learning is a term coined in the 90’s, which related practices over the years gained substantial influence, and today is regarded as the “new normal” [46]. The con- cept of BL, its operationalisations and definitions are wide and still evolving [1]. Re- searchers have highlighted that BL is a broad term that may reflect different variations to what is blended, the extent and duration of the blend, models of blended learning, a systematic approach that include any part of a system that combines face-to-face (f2f) instruction with instructions meditated be digital technologies [37, 46] which com- monly include blended instruction, blended delivery, blended learning environments [37] spatiotemporal aspects, where learning can be self-paced and individual qualitative aspects [56] which reflect thoughtful integration [31]. In short, BL can be viewed as an umbrella term, that without specific descriptions will not inform the reader what aspects of teaching and learning that is approached. In addition, LA approaches has been critiqued for being atheoretical [32, 57]. In LA and BL research, as in other educational research, the main aim is to support students to succeed with their education. Such effort can be demonstrated by including theory that provides guidance for how to understand, operationalise, measure and interpret for example students’ engagement in learning. Engagement theories may emphasize different aspects, for example agency [5] or cognition [6] related to self-regulation (SRL). Student engagement is critical for learning, and from this perspective LA in BL is warranted as it combines the BL setting with theoretical insights of students use of digital technologies, their ability to self-regulate to re-engage in the face of difficulty, distraction, frustration, simultaneous social demands et cetera. However, critique has forwarded that LMS data may not be suitable to capture a nuanced understanding of student engagement, this as engagement is a multi-dimensional construct, and LMS data, still, at best can reflect a one-dimensional aspect of engagement, [42]. Moreover, researchers [42] could not find any significant correlation between student self-reports of engagement and the LMS trace data. Thus, if the approaches and applications of BL, theories from learning perspectives and how these are operationalised are lacking, and self-reports and traces data are not correlating, this decreases the value of the contribu- tions in the field of LA. Previous reviews have surveyed theoretical underpinnings in LA research, and concluded that the grounding in (educational) theory is evident but too often meagre or lacking [e.g. 57, 96]. For example [57] concluded that existing learning analytics dashboards research are rarely grounded in learning theory, cannot be suggested to support metacognition and thus, do not inform teachers of effective practices and strategies. LA reviews have also explored methods applied within LA research [e.g. 6] and identified that LA studies use diverse methods, for example, visual data analysis and clustering techniques, social network analysis (SNA) and educational data mining. Taken together, however, the existing reviews on LA research have not taken contributions into account, such approach is critical, as if applications of BL, theories from learning perspectives and how these are operationalised are insufficient or lacking, the contributions becomes unclear. 48 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Today, there are a substantial number of Learning Analytics reviews. LA reviews often specialise on particular areas like for instance: game learning analytics [9], visual learning analytics, [96], the role of self-regulated learning (SRL) [98], learning analyt- ics dashboards [57], or the uses of LA in relation to specific methods or approaches e.g.; open learner models [10] educational data mining [12] or apply a wider scope that explore national research ef-forts, policies, infrastructures and competence centres across several European countries [68]. While several reviews highlight similar findings (i.e. a lack of theoretical underpin- ning, unclear uses of methods) there is a risk of transferring and projecting findings across LA research, as findings which might not reflect the broader LA research, which in turn may lead to overgeneralisations. Although there are many published (systematic, scoping and area-specific) reviews on LA in online settings, in order to understand their aim, objective and contribution, it is beneficial to approach a less specific overview of LA research to survey commonalities, of theoretical underpinnings (including concep- tualisations of BL and learning perspectives) methodological approaches, ethical and legal requirements and contributions. However, in addition to theoretical and methodological aspects, an additional layer of complexity is added to LA research in a BL environment. LA is in itself a practice of gathering, analysing and sharing big amounts of personal data, which comes with an increased need for ethical considerations and adherences to legal requirements. The ethical, privacy and legal concerns of processing of personal data are on the frontier of data processing due to the presence of the GDPR [14]. LA is a subject developed on data-driven approaches to education innovation, and hence, in the spotlight of this con- cerns. Beyond ethics, the GDPR provides a legal framework in preserving the rights of the data subjects, that is: the students. Learning analytics operates on data about stu- dents and their learning environments, where personal data of the students is an integral part. Personal data of students refers to any data that directly or indirectly connected to an identifiable person, e.g., student names, personal identification numbers, email, pho- tographs, and other data that could lead to identifying an individual [29]. It is typical that learning and student management systems store, retrieve, and process such data, driven by different academic and learning purposes [15]. While the absence of ethical considerations [16], [17], privacy issues and GDPR [18] have been previously critiqued in regards to the adoption of LA we did not find any existing review that had explored these aspects of GDPR, ethics and privacy on LA research. Therefore, in this study, we have added a focus on how the reviewed studies consider ethical and legal aspects of using data. Informed by these previous concerns and critiques we raised the following questions: 1. How is blended learning defined in the reviewed learning analytics research? 2. For which learning focus perspectives are theories used in the reviewed learning an- alytics research? 3. What approaches of data collection, methods and analysis are evident in the reviewed learning analytics research? 4. How are ethical and legal aspects considered in the reviewed learning analytics re- search? 5. What are the contributions of the reviewed learning analytics research? iJAI ‒ Vol. 2, No. 2, 2020 49 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … 3 Method 3.1 Search strategy and selection procedure This study examines academic journals and conference papers applying Learning Analytics in Blended Learning from two databases (see Table 1). A systematic search was conducted using EBSCOhost via Stockholm University library (filtered by content providers: Scopus, ERIC, Academic Search Premier, Directory of Open Access Jour- nals, ScienceDirect) for academic journals, and ACM DL Digital Library, for confer- ence papers. As detailed below, the systematic search via EBSCOhost followed educa- tional journals by status [13], and the selection employed journal rankings provided by SCIMAGO Institutions Rankings. Table 1. Overview of search string results Database and journal search strings Hits EBSCOhost database search string “learning analytics” + “blended learning” 79 “learning analytics” + “blended learning” (incl. “within full text of articles”) 282 “learning analytics” + “blended environment” 0 “learning analytics” + “blended learning environment” 2 “teaching analytics” + “blended /”-learning”/ “-environment” 0 “educational data mining” + “blended learning” 8 “educational data mining” + “blended” 8 “educational data mining” + “blended environment” 0 ACM DL Digital Library “learning analytics” + “blended learning” + “LAK” (Learning Analytics & Knowledge confer- ence) “educational data mining” + “blended learning” + “LAK” (Learning Analytics & Knowledge conference) “learning analytics” + “blended learning” 43 “educational data mining” + “blended learning” 22 Journal search via EBSCOhost via Stockholm University Library and the Journal of Learning Analytics “blended learning” 7 “learning analytics” + “blended learning” 6 “educational data mining” + “blended learning” 3 Internet and Higher Education “learning analytics” + “blended learning” 14 “educational data mining” + “blended learning” 8 Educational Technology and Society “learning analytics” + “blended learning” 2 “educational data mining” + “blended learning” 0 Journal of Computer Assisted Learning “learning analytics” + “blended learning” 2 “educational data mining” + “blended learning” 0 British Journal of Educational Technology “learning analytics” + “blended learning” 16 50 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … “educational data mining” + “blended learning” 5 Computers in Human Behavior “learning analytics” + “blended learning” 26 “educational data mining” + “blended learning” 17 Computers and Education, Communications in Information Literacy, Learning and Instruction, International Review of Research in Open and Distance Learning, Edu- cational Evaluation and Policy Analysis International Journal of Mobile and Blended Learning “learning analytics” + “blended learning” 0 “educational data mining” + “blended learning” 0 The search combinations used in SCIMAGO: Social Sciences + E-learning + All regions / countries + Journals + 2017; Social Sciences + Education + All regions / coun- tries + Journals + 2017; Computer Science + Human-Computer Interaction + All re- gions / countries + Journals + 2017; and Computer Science + Human-Computer Inter- action + All regions / countries + Journals + 2017. Inclusion from each 4 search com- binations above was determined by relevance of the title and the choice was limited to the top-10 journals in each search combination. We identified papers from the following six journals Internet and Higher Education, Journal of Computer Assisted Learning, British Journal of Educational Technology, Computers in Human Behaviour, Educa- tional Technology and Society, and the Journal of Learning Analytics. To search com- binations used for the EBSCOhost database, we used combinations of keywords: “learning analytics” + “blended learning”; “learning analytics” + “blended environ- ment”; “teaching analytics” + “blended learning”; “teaching analytics” + “blended en- vironment”; “educational data mining” + “blended learning”; “educational data min- ing” and “blended environment”. We included peer-reviewed, academic journals, writ- ten in English. We also tried including a “search within full text of articles”; and screened titles and abstracts of the papers for inclusion, and remove duplicates. We decided to not utilise the function further, as it returned irrelevant articles where BL and LA were mentioned only in the reference section. We searched for articles pub- lished the between January 2013-July 2020. Overall, the keyword searches amounted to 304 hits (not including the search within full text of articles). After removing the duplicates, 193 journal articles and conference papers remained; 38 hits did not return full texts and 4 hits returned hits in other lan- guages (three in Danish and one in German) although the search criteria were aiming at English texts only. After that, we sifted through the remaining papers and excluded 32 papers that were not directly relevant to LA and BL, and 49 that lacked one of the two focuses (either LA or BL). During close-reading, an additional three papers were excluded, as they did not meet the inclusion criteria. Thus, which we proceeded to code and later analyse 70 papers. 3.2 Data coding and analysis Following a coding scheme all articles were read through by two authors, who sorted the content in: article data (country, publication year, title), educational context, (blended learning interpretation and level), research aims/questions, theoretical iJAI ‒ Vol. 2, No. 2, 2020 51 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … underpinnings and definitions of BL, data sources, data collection methods, ethical con- siderations, analytical methods and results and contributions. All authors then con- ducted a deeper analysis of one section of the reviewed articles each (1. theoretical underpinnings, 2. data collection, methods and analysis, 3. ethical and legal considera- tions and 4. contributions). In depth discussions were held between the authors to dis- cuss approaches and align findings. 4 Results The result section details the findings as follows: 4.1 Theoretical underpinnings, 4.2 Data collection, methods and analysis, 4.3 Ethical and legal considerations and 4.4 Con- tributions of the reviewed articles. 4.1 Theoretical underpinnings To discern the positioning of the articles in terms of their relation to BL, we analysed the articles with regards to how blended learning was used throughout the articles, in particular how frequent the authors refer to blended learning, their definition, descrip- tion and use of theory. Currently, BL literature [e.g. 37, 39] have identified three com- mon ways in which explorations of blended learning delivery may vary: blended in- struction, blended distribution, as identified in [14, 15] or blended pedagogies [54]. However, going through these descriptions, we also found that studies could displayed a combination of blended instruction and blended distribution; i.e. when a section of the course is provided fully f2f, followed by the remainder offered fully online [79] or reversed: a course is delivered fully online and then fully f2f [49]. We also identified that the BL was used in ways beyond these categories. We identified a combination of blended learning approaches in which some, but not all, students use the BL component. For example, we identified studies that offer i) optional adoption of the blended com- ponent to the students, [e.g. 43, 80, 87], or when ii) there was a synchronous teaching of f2f students and online students in the same classroom [9] and iii) in cases of reversed distribution; a channel directed exclusively from the student to peers and/or teachers, in which the teaching (distribution, delivery and pedagogy) has remined traditional, for example as an e-portfolio accessible in a social network [35] or in flipped classrooms, where students responded to distributed (asynchrounous) media and instruction in their own time and place. [40]. 52 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Table 2. Blended learning definitions Blended learning definitions “Technology to support face-to-face teaching [2] and to enhance student participation” (Liao & Lu, 2008). [7] [32] “Blended learning system as one which combines face-to-face instruction with computer-medi- [58][63] ated instruction with the aim of complementing each other” (Graham, 2006; 2009; 2013) [75][102] “The range of possibilities presented by combining Internet and digital media with established [23] classroom forms that require the physical co-presence of teacher and students” (Friesen, 2012) “B-learning is the form of learning environment where the traditional classroom teaching and face-to-face communication between teacher and students are blended with the computer-medi- [30] ated interaction “(Bubaš & Kermek, 2004) “Blended learning is a combination of traditional face to face learning and online learning. It has the advantages of the both, providing students with unique flexible learning experience and be- [36] coming one of the fastest growing trends in educational field” (Thorne, 2003) “The thoughtful integration of classroom face-to-face learning experiences with online learning [41] [76] experiences” (Garrison & Kanuka, 2004) “Taking the best from self-paced, instructor-led, distance and classroom delivery to achieve flex- ible, cost-effective training that can reach the widest audience geographically and in terms of [44] learning styles and levels” (Marsh & Drexler, 2001) “The integration of thoughtfully selected and complementary face-to-face and online approaches [60] and technologies’’ (Garrison & Vaughan, 2008) “Blended learning is learning that happens in an instructional context which is characterized by a deliberate combination of online and classroom- based interventions to instigate and support [94] learning. Learning happening in purely online or purely classroom-based instructional settings is excluded” (Boelens,Van Laer, De Wever & Elen, 2015). Table 2 shows an overview of the used definitions of blended learning. While 29% of the articles offered a clear definition, most articles relied on inferences or contextual descriptions. 18% of the articles neither inferred nor described BL. The articles that offered a definition most commonly cited Graham [37-39] Analysis from a learning focus perspective: Revealed five themes reflecting the perspective of the research: (i) the flipped classroom, (ii) collaborative learning, (iii) conversational aspects of learning, (iv) engagement and self-regulation operationalised using system trace data and (v) learner profiles and procrastination. Studies that include theories are presented in a condensed and summarised form (the others are not). 1. The flipped classroom: While most studies exploring the flipped classroom, ap- proached student engagement and learning, a few were focusing on the actual learn- ing situation [19–21]. These studies applied a more over-arching, abstract level of theory to inform their study, and also discussed their findings in the light of theory. However, while SRL were, by far, the most commonly used theory to explore flipped classroom design, most studies did not seek to explore the blended learning environ- ment. 2. Collaborative learning: Social Network Analysis was used to visualise online in- teractions, and identify productive behaviours and correlation with performance [35, 41, 43, 81]. These used constructivist and situated learning theories and theories of self-regulation. iJAI ‒ Vol. 2, No. 2, 2020 53 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … 3. Conversational aspects of learning: Studies exploring conversations aspects of learning, most commonly approached feedback operationalised as online reports, re- ferring to feedback and assessment theories [e.g. 72, 90], or deep learning theories [40, 51]. Another type of input to learning was explored by [89], grounded their study in the Dispositional Learning Analytics (DLA) infrastructure, used previous publications on assistant conversational agents, and theories on cognitive load in mi- croblogging. Using the foundation of the Community of Inquiry framework, which prioritises teacher presence, and active participation, [79] used trace data to operationalise active participation, as the number of: messages sent, documents uploaded, chat sessions attended, as well as data collected to analyse teacher presence. 4. Engagement and self-regulation operationalised using system trace data: Out of all the theories applied, engagement in general and self- regulated learning (SRL) in particular, were the most commonly used. To add to these research approaches, as- pects of culture and gender were introduced and explored [86]. While SRL often was operationalised as observable indicators in system logs, motivation was approached by measures of self-efficacy, intrinsic value, test anxiety, cognitive strategy and self- regulation using a questionnaire. SRL was often operationalised as trace data, and combined with other engagement and learning theories. [36, 52, 58, 80]. Numerous studies explored relations between trace data, performance and SRL using self- re- ports [30]; some in combination with other theories, for example theories of motiva- tion [20] socio-cultural perspectives [31], Self-Determination Theory and the Con- trol-Value Theory of achievement emotions [86-88]. 5. Learner profiles: We identified that with studies exploring learner profiles, it was common to inform this approach with other theories, for example, course satisfaction and social constructivist theory [18], deep and shallow learning [32] active learning and engagement [35] and procrastination [1, 54, 66]. In the reviewed studies, student learning strategies was often operationalised as trace data on student interaction with online learning resources [33]. Amongst these, procrastination was found to be com- mon. Several studies operationalised SRL as procrastination [54, 66]. For example, using LMS data to survey time spent studying and time spent refraining from access- ing available data [54]. Procrastination was also explored without relation to SRL, or how long the student waited before accessing LMS materials [1]. Other research- ers used questionnaires to survey procrastination and risk taking using the Expec- tancy-Value Theory, motivation, using the Academic Motivational Scale and help seeking, and epistemic emotions to inform a to approach how different learning strat- egies relate to preferences of feedback [66]. In sum: While most reviewed studies approaching a flipped classroom, used theories with a focus on students and their engagement and learning, a few were focusing on the actual learning situation [69, 75, 76, 94] or combined flipped classroom theories of Computer Assisted Language-Learning [34]. The latter studies applied a more over- arching, abstract level of theory to inform their study, and also discussed their findings in the light of theory. Some studies argued that there is a need to develop a specific SRL-LA theory [63]. However, while SRL were, by far, the most commonly used 54 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … theory, most studies did not seek to explore the blended learning environment, but seemed to relate their data collection to operationalisations related to a learning per- spective, with or without underpinning theories of learning. 4.2 Data collection, methods and analysis All studies included in the review used a digital platform for collecting data. As can be expected, LMS was the most used platform for data collection (this was true in 56 studies, 89%). Among them 14 studies (25%) used more than one platform for data collection (for a full overview of studies and data sources, see Appendix A). A single study used custom LMS, two studies used video streaming software, and one study used wiki. Digital traces were the most collected data types in (90.5%), followed by self- reported surveys 27 (42.9%). Self-reported surveys were used to collect data about stu- dents’ depositions such as engagement, motivation and learning styles. Relational and social network data from computer supported mediated interactions were collected in eight studies (12.7%). Interviews were collected in five studies, video or observation in three, multimodal data were collected in two studies, and, transcripts of classroom in- teractions were reported in one study. Most of the data collected in the reviewed studies were digital data (see Table 3). Data were collected from the classroom in only six studies, where two other studies reported on multimodal data, and four studies used video recording and observation of classroom setting. [45] used multimodal data through a system called SPACLE to rec- ord classroom interactions among students and teachers. The interactions recorded in- cluded on-task, off-task, talking to class, outside or inactivity data. The system allowed for spatial data about positions of the users in the class, and their activity levels. [85] used classroom observations to report on the teachers and students’ classroom behav- iour, although the methods do not clearly describe in detail what was observed and how it was reported. [81] collected f2f data to measure teaching presence according to the community of inquiry framework. Transcripts of audio recordings of the lessons facil- itated the thematic content analysis. Real-time classroom observations were also done. Performance data such as grades or continuous assessment were collected from most studies (88.9%). While LMS data may be informative, it does not capture the f2f learn- ing environment, the process of learning, or the student-teacher or the student-student dynamics. The stark contrast between results collected from digital resources and class- room represents an obvious gap. Most data were gathered using digital traces, disposi- tional self-reports, relational data and interviews that are disconnected from the class- room where a significant amount of learning happens. iJAI ‒ Vol. 2, No. 2, 2020 55 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Table 3. Types of data collected and their percentage across all studies Data type Y % No % Trace 57 90.5 6 9.5 Survey 27 42.9 36 57.1 SNA 8 12.7 55 87.3 Interviews 5 7.9 58 92.1 Observation/video 3 4.8 60 95.2 Multimodal 2 3.2 61 96.8 Discourse 2 3.2 61 96.8 The analysis methods in most of the studies (98.4%) employed were the traditional descriptive statistics, frequentists, and group comparisons, that included correlations, comparison of means, and chi-square (see Table 4). Visualisation was used in a signif- icant number of studies (77.8%), in the context of explaining results, but not necessarily as a research objective. Thus, few studies used visualisation as their research objective. However, we also found evidence of development of systems that gather information from different data sources to provide visual analytics to enhance feedback offered to students [102]. However, such application of visualisation was rare. Table 4. Overview of analysis methods Methods Count % No % Statistics 62 98.4 1 1.6 Visualisation 49 77.8 14 22.2 Regression 32 50.8 31 49.2 Machine learning or AI 29 46.0 34 54.0 Clustering 21 33.3 42 66.7 SNA 10 15.9 53 84.1 Sequence or process mining 10 15.9 53 84.1 Qualitative 9 14.3 54 85.7 Data mining or text analytics 5 7.9 58 92.1 Regression analysis were used to predict performance, or forecast learning outcomes in 29 studies (46%). Results show that prediction of performance is the main research objective for learning analytics in blended learning. In 88.9% of all the studies included performance, prediction or optimisation as the main objective. In 33.3% studies meth- ods for unsupervised classification of students by means of clustering studies were used to categorise students according to certain criteria such as learning strategies, baseline disposition, learning process sequences or self-regulation. Sequence mining appears to be gaining in the learning analytics field with 15.9% of the studies exploring the con- cept, and, in most of the times it was coupled with clustering and visualisation. Yet, all the studies in this category have not researched the impact of these visualisation on teachers or learners. Studies that used SNA in the analysis are 15.9%, and, similar to process mining research, all of the articles have not used visualisation techniques for the sake of helping students or teacher to optimise learning. Qualitative research was performed in nine studies through the analysis of interviews or transcripts. Data mining and pattern recognition was performed in five studies. 56 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … 4.3 Ethical and legal considerations Irrespective of the necessity for considering ethical obligations in the use of student data, the papers did not provide documentations of such responsible use of personal data. Almost all the literature examined in this study, that is 99% of the articles, pri- marily focuses on LMS data. However, the ethical and legal aspects are very much under-represented in the discussions with only eight articles provided a clear evidence that they do not count on personal data or the data are de-identified. 22 of the 70 reviewed articles (31%) mentioned anonymising students. Nevertheless, it is important to recall here that hiding the student names from the data set is not enough to guarantee that individuals cannot be identified [40]. For exam- ple, if a student who enrolled in a course in a specific year, with specific major and so on could possibly have a significant probability of resulting in a perfect attribute set for identifying a specific student. Such events might raise red flags for ethical concerns of how legible is it to consider that anonymisation of data is sufficient (ibid.). Although 40% of the articles indicate that they, at some point, considered ethical aspects when collecting data, which are those ethical aspects were, how did these aspects mattered in the data collection, processing and outcomes, were not been mentioned in any of the reviewed studies. An important observation here is that at least 24 papers among the reviewed studies explicitly focus on the collection, analysis and managing of individ- ual’s personal data. Although a more profound discussion to explicate the instrumenta- tion of the legal and ethical procedure of retrieving and processing the sensitive pieces of the data is anticipated, a considerable gap in this focus in the articles is inevitable. Thirteen articles reported studies from Europe, but only six articles are mentioning that they have considered legal aspects and informed the students before the data collection, or the data has been anonymised. As nearly all of the studies were conducted prior to the GDPR rules in the EU [29], new and rigorous practices need to be applied in future LA approaches. 4.4 Contributions The contributions of reviewed studies could be classified into three themes, such as i) understanding and predicting performance, ii) understand student’s behaviours and profiles, and iii) understanding and improving the learning environments (for an over- view, see Table 5). iJAI ‒ Vol. 2, No. 2, 2020 57 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Table 5. Overview of contributions of reviewed studies Theme Aim/objectives Contribution Random forest, Linear and logistic regression, ensemble model- ling predicts academic performance with satisfying accuracy Forecast model could predict at-risk students Predict academic Visualisations are helpful for teachers to detect anomalous situa- 11 performance tions Two to six-week data is enough for future academic performance prediction Portability of pre- Portability of predictive models is low across courses dictive models LMS variables vary among general and course-specific models Data variables related to LMS engagement, self-regulated learn- Association of Understand ing, and collaborative learning are corrected with students’ aca- variables with per- /predict demic performance 9 formance predic- performance Tracking data is not significant predictors of academic success tion for some courses (e.g., graphic design) Seven factors found that affect students’ academic performance, Identification of consisting of four online and three traditional factors affecting 3 Four factors each found for both collaborative and self-regulated learning outcomes learning that affects students’ outcomes Social network metrics can be used as predictors Influence of social Number of interactions do not significantly correlate with student network with per- performance 3 formance SNA based upon questionnaires provide useful indicators for a more fine-grained analysis Two learning profiles identified based on student’s participation in online activities Identification of Identified learning behaviours before and after midterm exams behaviours /learn- 6 Different self-regulated learning behaviours identified based on ing patterns resources utilisation and procrastinator nature Found five learning trajectories with the varied resource use Four student clusters were observed based on their performance measures Based on interactions with the video annotation tool four profiles emerged Clustering/profil- Six profiles emerged from nine trace variables and student’s in- Understand ing-based student formation system data 8 student’s learning Based on students’ viewing behaviours, they were clustered into behaviours behaviours three groups and profiles Based on usage of LMS three profiles discovered Based on learner control, scaffolding, and interaction three self- regulated profiles appeared Self-assessment exercises, regularly resource access, and active online behaviour are significantly correlated with learning out- comes Relation between Use of videos annotations, metacognitive skills, and motivational profiles/learning strategies are weakly associated with learning achievements 15 behaviours on Procrastination behaviour, low level of participation, and worked learning outcomes examples could affect students learning outcome Found that students tend to change their learning behaviour throughout the course 58 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Resources access, LMS access time, and active learning days have a positive influence on learning results Fully worked-out solutions and engagements have adverse ef- Learning re- fects on students’ achievements sources and Personalised feedback has a small to medium positive effect on activities in 15 the learning outcome relation to Visualisations feedback allows students to make a better diagno- learning outcome sis of their performance Understanding Learning analytics-based interventions can improve student aca- and improving demic achievement the learning environments Video viewing patterns, resource utilisation, and order of activi- ties provide feedback to enhance classroom teaching and re- sources Improvements in Visualisation-based learning analytics allow teachers to identify course design, which learning design elements should be revised and improved 8 content, and Differences in instructional approaches during f2f and blended instructions courses are very likely due to the different class formats An understanding develops teachers’ interventions through learn- ing activity redesign can cultivate better learning attitudes 1. Understanding and predicting performance: To predict students’ academic per- formance, random forest, linear and logistic regression, and ensemble modelling based predictive models provided satisfying results (over 70% accuracy) [2, 51, 77, 80, 81, 102]. Similarly, a forecast learning outcome model (FLOM) was developed using interactive data to predict at-risk students [67]. However, FLOM achieved lower accuracy than other predictive models. On the contrary, student’s data visual- isations found helpful for teachers to detect anomalous situations [97]. Regarding appropriate time for prediction, studies discovered that two to six weeks data is suf- ficient to obtain accurate prediction [51, 53]. However, the portability of predictive models across courses remains low [23, 32]. Since prediction is entirely dependent on the supplied data, studies identified that LMS variables (e.g., access time), en- gagement indicators, self-regulated learning (e.g., self-efficacy and test anxiety), and collaborative learning (e.g., social stability, and time spent on task) variables have reliable predictive power due to their positive correlation with students achievements [2, 30, 71, 80, 90, 91, 102]. Nevertheless, for some courses (e.g., graphic design) tracking data became useless because different patterns exist in the effect of individ- ual data variables [32]. Reviewed studies also disclosed that social network metrics (e.g., degree, authority and PageRank) could be employed to predict student perfor- mance [15, 43, 81]. However, using these metrics, the representativity of the predic- tive models would be limited [81]. In factors identification, four online (e.g., activi- ties, video clicks, videos backwards and practice score per week) and three tradi- tional (e.g., participation in after-school tutoring, homework and quiz scores) factors were identified that affect students’ performance [53]. While, attendance, time spent in class, sitting position, and groups are essential for collaborative learning and self- efficacy, positive strategy, less anxious and less usage of negative strategy found important for self-regulated learning [2, 72]. 2. Understand student’s behaviours and profiles: To identify students learning pat- terns and behaviours, studies utilised student’s participation, resources access, and iJAI ‒ Vol. 2, No. 2, 2020 59 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … other LMS data. For instance, based on students participations two learning behav- iours emerged: sensing where students are more likely to participate in information access, interactive and networked learning activities, reflective where students are more predisposed to materials development activities [53]. Similarly, a study identi- fied behaviours before and after midterm exams, for example, out-degree centrality, LMS visit, and time spent before midterm exams and discussion views and visit in- terval regularity after the midterm exam [51]. In the self-regulated learning context, based on resources access three patterns emerged: self-regulator, external sources users, non-self-regulatory and based on LMS data four behaviours emerged: contin- uously active, probers, procrastinators and inactive [12, 20]. Likewise, based on re- source use five different learning trajectories discovered: overall below-average ac- tivity, average resource use, higher use of resources, most active students, least ac- tive students [56]. Similarly, studies clustered and profiled students based on their learning behaviours, for example, four clusters (achievers, regular, half-hearted, un- derachievers) discovered using students’ performance measures [57]. Likewise, us- ing video annotation tool interactions, four profiles were created, such as minimal- ists, task-oriented, disenchanted, and intensive [60]. Correspondingly, students viewing behaviour were adopted to cluster consistent, slide intensive and less inten- sive students [27]. On the other hand, utilising e-tutorial and information systems data, six profiles emerged, which were the difference in overall activity level and the use of worked-out solutions [62]. In the same vein, based on LMS usages, three clusters were generated such as low, acceptable, and good and students have differ- ent patterns of learning behaviour in these clusters [59]. In self-regulated learning context based on authenticity, personalisation, learner control, scaffolding, and in- teraction, three profiles were identified such as self-regulating, external regulating and lack of regulation [21]. On the other hand, a considerable number of studies contributed in terms of identifying the association and effects of different learning behaviour on students’ achievements. For instance, self-assessment exercises, regu- larly resources access, active online behaviour, and time management are signifi- cantly correlated with student learning outcome [5, 18, 23, 34, 49, 52, 63, 79]. While, the use of videos annotations, metacognitive skills, and motivational strategies are weakly associated with learning achievements [54, 55, 73]. On the other side, pro- crastination behaviour, low level of participation, and dependency on worked exam- ples could affect students learning outcome [54, 60, 89]. Furthermore, few studies discovered that students have a tendency to change their learning behaviour through- out the course and comparison can be conducted between successful and non-suc- cessful students based on their learning patterns [34, 49]. 3. Understanding and improving the learning environments: Reviewed studies dis- covered that course material access without lapses, LMS access time, active learning days and teachers’ monitoring influence learning results [1, 7, 8, 44, 45, 65]. Whereas, worked-out solutions and engagements create adverse effects on students’ achievements [41, 66, 85]. In the context of feedback provision, personalised feed- back have a small to medium positive effect on the learning outcome [71]. In terms of intervention, learning analytics-based interventions improved student academic achievement, with a 10.6% higher score than blended learning without intervention 60 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … [36]. Concerning improvement in courses, video viewing patterns, resource utilisa- tion, course item frequencies and order of activities provide enough feedback to en- hance classroom teaching and course resources [19, 23, 34]. Similarly, visualisation- based learning analytics allow teachers to identify which learning design elements should be revised and improved [59]. 5 Discussion RQ1 and RQ2: We raised the questions how blended learning is defined and how learning theories and perspectives are used in the reviewed learning analytics research? In line with [46] we conclude that BL seems to have become somewhat of a meta- concept. Thus, as have been detailed in the results, blended learning is often not an adoption of one pure type of blended learning, but a combination of blended learning approaches. When BL approaches are combined, there is a greater complexity than in a "simple blend", why we propose that this can be referred to as complex hybrid learn- ing. For example, we see combinations of blended instruction and blended distribution: optional adoption [e.g. 66, 80], synchronous teaching of f2f students and online stu- dents [6] and iii) reversed distribution; [35, 40]. Result show that SRL, by far, was the most common used theory. To operationalise SRL, engagement and other perspectives of learning, LMS trace data was used to collate the number of messages sent, documents uploaded, chat sessions attended, as well as data collected to analyse teacher presence (e.g. 1, 54, 79]. In line with [42], we conclude, that operationalisations relying on LMS data might risk to be superficial and oversimplified interpretations of the underpinning theory. However, some of the reviewed studies also explored relations between trace data, performance and SRL using self- reports [70, 87, 91]. However, results revealed that certain perspectives of learning were more common to explore than others: for ex- ample, the flipped classroom, collaborative learning and conversational aspects of learning. We thus call for innovative perspectives of learning, for example, complex or multi-modal data gathering, longitudal studies and mixed methods approaches. Conclu- sively, the theoretical underpinnings of a research study (including what is meant by BL and the learning perspectives taken), are needed to increase clarity, quality and va- lidity of the objectives and contributions of that study to enable comparison and trans- parency a richer description of the actual blended learning environment approached. RQ 3: We also raised the question, what approaches of data collection, methods and analysis that are evident in the reviewed learning analytics research. One would expect that learning analytics in a blended learning scenario would account for the fact that the context of BL integrates both modalities (physical and digital). However, most of the reviewed studies have used digital traces, dispositional self-reports, relational data and interviews that does not fully cover the gamut of possible data sources of the classroom where a significant amount of learning happens. While predictive models have -in many cases- been able to infer future performance, they have failed short of explaining learn- ing, or offer a guide on how to intervene in the classroom. Of course, collecting data in a blended scenario is not easy, and therefore more research is needed that collects con- textually relevant data, and more importantly, on how to unobtrusively collect data in iJAI ‒ Vol. 2, No. 2, 2020 61 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … the classroom. We also recognise the benefits of visualisation as an intermediate step: albeit we found that visual analytics were rare [102]. Believing that eye-balling the data, the accessibility of instant overview, might support the teacher, we propose that more research on the impact of visual insights offered to stakeholders is needed. We found that blended learning studies did not use classroom data to investigate the complexity of both the online and f2f learning setting. RQ4: We then explored how ethical and legal aspects are considered in the reviewed learning analytics research. In line with previous critique [e.g. 74; 84] we found that although ethical, privacy and legislator requirements exits, the current practises do not always consider these. While results reveal that almost all (99%) of the reviewed studies were conducted prior to the GDPR rules in the EU. 13 articles reported studies from Europe, but only six articles mentioned legal aspects, having informed the students prior to data collection, or considered anonymising the data. This raises critical concerns, as aside the GDPR legislation, ethical considerations need still be adhered to. This may also reflect general slow governmental responses to regulate consequences of the digi- talisation. RQ5: Lastly, we surveyed the contributions of the reviewed articles. The results re- vealed that reviewed articles made several contributions on predicting academic per- formance, identifying learning behaviours, and improving learning environments. With regards to predicting academic performance, machine learning-based predictive models was proven to be effective but with low portability across courses, whereas visualisa- tion-based methods required teacher assistance [2, 77]. Moreover, data variables effec- tiveness on performance prediction is based on course structure; however, social net- work metrics and variables related to LMS engagement, self-regulated learning, collab- orative learning are found significantly correlated with academic performance [86, 90, 102]. In terms of identifying learning behaviours, results show that by utilising student’s participation in online activities and resources access impactful learning behaviours could be identified, and these behaviours are beneficial to cluster or profile students based on adopted behaviours [56]. In the regards of improving learning environments, results show that learning resources that provide student assistance to complete their assignments create positive effects on learning outcome [1, 7]. 6 Conclusion and Future Research As BL currently seems to be a more general concept, detailed descriptions of the actual learning situation, delivery, blend or hybrid solution is needed alongside clear underpinning theories to position the research, or as proposed an indication of whether one is approaching a "simple blend" or complex hybrid learning. We argue that in the current wake of the transforming distance learning, we see hybrid solutions, that raises awareness of a complexity of multiple blended solutions in parallel, that if not de- scribed, could mean just about any kind of learning, delivery or setting. We found that data used in many learning analytics studies were used as a proxy for what happens in the classroom. However, when studies do not include manifestations in the real class- room, they fall short of explaining learning, or offer a guide on how to intervene in the 62 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … classroom. More research is needed that accounts for the context of BL and more im- portantly, on how to unobtrusively collect relevant data that enables the support of learning where it occurs. In the light of our findings of ethical and legal considerations, we strongly argue that while there are no established traditions in LA research in terms of legal requirements; new and rigorous practices need to be developed and applied in current and future LA approaches. Ethical consequences might be devastating and the field urgently needs to acknowledge this lack of consideration. 7 Acknowledgement This research is financed by the Swedish Research Council (VR). 8 References [1] Agnihotri, L., Essa, A., & Baker, R. (2017). Impact of student choice of content adoption delay on course outcomes. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 16–20). ACM. https://doi.org/10.1145/3027385. [2] Akhtar, S., Warburton, S., & Xu, W. (2017). The use of an online learning and teaching system for monitoring computer aided design student participation and predicting student success. International Journal of Technology and Design Education, 27(2), 251–270. https://doi.org/10.1007/s10798-015-9346-8 [3] Aldowah, H., Al-Samarraie, H., & Fauzy, W. M. (2019). Educational data mining and learn- ing analytics for 21st century higher education: A review and synthesis. Telematics and In- formatics, 37(April 2018), 13–49. https://doi.org/10.1016/j.tele.2019.01.007 [4] Alonso-Fernández, C., Calvo-Morata, A., Freire, M., Martínez-Ortiz, I., & Fernández- Manjón, B. (2019). Applications of data science to game learning analytics data: A system- atic literature review. Computers and Education, 141(June), 103612. https://doi.org/10.1016 /j.compedu.2019.103612 [5] Andergassen, M., Mödritscher, F., & Neumann, G. (2014). Practice and Repetition during Exam Preparation in Blended Learning Courses: Correlations with Learning Results. Journal of Learning Analytics, 1(1), 48–74. https://doi.org/10.18608/jla.2014.11.4 [6] Avella, J. T., Kebritchi, M., Nunn, S. G., & Kanai, T. (2016). Learning Analytics in Distance Education: A Systematic Literature Review. Online Learning, 20(2 (October)), 13–29. https://doi.org/10.24059/olj.v20i2.790 [7] Ayub, M., Toba, H., Yong, S., & Wijanto, M. C. (2017). Modelling students’ activities in programming subjects through educational data mining. Global Journal of Engineering Ed- ucation, 19(3), 249–255. https://doi.org/10.1109/icodse.2017.8285881 [8] Baik, E. J., & Reynolds, R. B. (2013). Contribution of wiki trace and wiki resource use variables towards quality of game design in a guided discovery-based program of game de- sign learning. Proceedings of the ASIST Annual Meeting, 50(1). https://doi.org/10. 1002/meet.14505001165 [9] Barata, G., Gama, S., Jorge, J., & Gonçalves, D. (2017). Studying student differentiation in gamified education: A long-term study. Computers in Human Behavior, 71, 550–585. https://doi.org/10.1016/j.chb.2016.08.049 iJAI ‒ Vol. 2, No. 2, 2020 63 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … [10] Bodily, R., Kay, J., Aleven, V., Jivet, I., Davis, D., Xhakaj, F., & Verbert, K. (2018). Open Learner Models and Learning Analytics Dashboards: A Systematic Review in Proceedings of the 8th international conference on learning analytics and knowledge (pp. 41-50). https://doi.org/10.1145/3170358.3170409 [11] Boelens, R., Van Laer, S., De Wever, B., & Elen, J. (2015). Blended learning in adult edu- cation: towards a definition of blended learning. Adult Learners Online! Blended and Online Learning in Adult Education and Training. https://doi.org/10.21125/iceri.2018.1219 [12] Bos, N., & Saskia, B. G. (2016). Student differences in regulation strategies and their use of learning resources: Implications for educational design. In ACM International Conference Proceeding Series (Vol. 25-29-April, pp. 344–353). New York, New York, USA: Associa- tion for Computing Machinery. doi:10.1145/2883851.2883890 [13] Bray, N. J., & Major, C. H. (2012). Status of Journals in the Field of Higher Education. The Journal of Higher Education, 82(4), 479–503. https://doi.org/10.1080/00221546. 2011.11777213 [14] Cerezo, R., Esteban, M., Sánchez-Santillán, M., & Núñez, J. C. (2017). Procrastinating be- havior in computer-based learning environments to predict performance: A case study in Moodle. Frontiers in Psychology, 8, 1–11. https://doi.org/10.3389/fpsyg.2017. [15] Chen, H., Chiang, R. H., & Storey, V. C. (2012). Business intelligence and analytics: From big data to big impact. MIS quarterly, 1165-1188. https://doi.org/10.2307/41703503 [16] Chen, B., Knight, S., & Wise, A. (2018). Critical issues in designing and implementing tem- poral analytics. Journal of Learning Analytics, 5 (1) (2018), p. 9. https://doi.org/10. 18608/jla.2018.51.1 [17] Chen, J., & Foung, D. (2020). A Motivational Story in Hong Kong: Generating Goals for Language Learners and Blended Learning Designers from a Mixed-Method Learning Ana- lytics Approach in English for Academic Purposes. In Technology and the Psychology of Second Language Learners and Users (pp. 491-516). Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-030-34212-8_19 [18] Cheng, G., & Chau, J. (2016). Exploring the relationships between learning styles, online participation, learning achievement and course satisfaction: An empirical study of a blended learning course. British Journal of Educational Technology, 47(2), 257–278. https://doi.org/10.1111/bjet.12243 [19] Chetlur, M., Tamhane, A., Reddy, V. K., Sengupta, B., Jain, M., Sukjunnimit, P., & Wagh, R. (2014). EduPaL: Enabling Blended Learning in Resource Constrained Environments. In Proceedings of the Fifth ACM Symposium on Computing for Development (pp. 73–82). ACM. https://doi.org/10.1145/2674377.2674388 [20] Cicchinelli, A., Veas, E., Pardo, A., Pammer-Schindler, V., Fessl, A., Barreiros, C., & Lind- städt, S. (2018). Finding traces of self-regulated learning in activity streams. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge (pp. 191–200). ACM. https://doi.org/10.1145/3170358.3170381 [21] Clow, D. (2012a). The learning analytics cycle: closing the loop effectively. In Proceedings of the 2nd international conference on learning analytics and knowledge (pp. 134-138). doi.org/10.1145/2330601.2330636 [22] Clow, D. W., Nanus, L., Verdin, K. L., & Schmidt, J. (2012b). Evaluation of SNODAS snow depth and snow water equivalent estimates for the Colorado Rocky Mountains, USA. Hy- drological Processes, 26(17), 2583-2591. https://doi.org/10.1002/hyp.9385 [23] Conijn, R., Van den Beemt, A., & Cuijpers, P. (2018). Predicting student performance in a blended MOOC. Journal of Computer Assisted Learning, 34(5), 615–628. https://doi.org/10.1111/jcal.12270 64 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … [24] Conijn, Rianne, Snijders, C., Kleingeld, A., & Matzat, U. (2017). Predicting student perfor- mance from LMS data: A comparison of 17 blended courses using moodle LMS. IEEE Transactions on Learning Technologies, 10(1), 17–29. https://doi.org/10.1109/ tlt.2016.2616312 [25] Dawson, S., Mirriahi, N., & Gasevic, D. (2015). Importance of theory in learning analytics in formal and workplace settings. Journal of Learning Analytics, 2(2), 1-4. https://doi.org/10.18608/jla.2015.22.1 [26] De Houwer, J., Barnes-Holmes, D., & Moors, A. (2013). What is learning? On the nature and merits of a functional definition of learning. Psychonomic bulletin & review, 20(4), 631- 642. https://doi.org/10.3758/s13423-013-0386-3 [27] De-Marcos, L., Garciá-López, E., Garciá-Cabot, A., Medina-Merodio, J. A., Domínguez, A., Martínez-Herraíz, J. J., & Diez-Folledo, T. (2016). Social network analysis of a gamified e-learning course: Small-world phenomenon and network metrics as predictors of academic performance. Computers in Human Behavior, 60, 312–321. https://doi.org/10.1016/j. chb.2016.02.052 [28] Dimić, G., Predić, B., Rančić, D., Petrović, V., Maček, N., & Spalević, P. (2018). Associa- tion analysis of moodle e-tests in blended learning educational environment. Computer Ap- plications in Engineering Education, 26(3), 417–430. https://doi.org/10.1002/cae. [29] EU 2016/679. (2016). Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC. Brussels: European Commission. https://doi.org/10.5593/sgemsocial2019v/1.1/s02.022 [30] Gamulin, J., Gamulin, O., & Kermek, D. (2016). Using Fourier coefficients in time series analysis for student performance prediction in blended learning environments. Expert Sys- tems, 33(2), 189–200. https://doi.org/10.1111/exsy.12142 [31] Garrison, D. R., & Kanuka, H. (2004). Blended learning: Uncovering its transformative po- tential in higher education. Internet and Higher Education, 7(2), 95–105. https://doi.org/10.1016/j.iheduc.2004.02.001 [32] Gašević, D., Dawson, S., Rogers, T., & Gasevic, D. (2016). Learning analytics should not promote one size fits all: The effects of instructional conditions in predicting academic suc- cess. Internet and Higher Education, 28, 68–84. https://doi.org/10.1016/j.iheduc. 2015.10.002 [33] Gašević, D., Jovanović, J., Pardo, A., & Dawson, S. (2017). Detecting Learning Strategies with Analytics: Links with Self-Reported Measures and Academic Performance. Journal of Learning Analytics, 4(2), 113–128. https://doi.org/10.18608/jla.2017.42.10 [34] Gelan, A., Fastré, G., Verjans, M., Martin, N., Janssenswillen, G., Creemers, M., … Thomas, M. (2018). Affordances and limitations of learning analytics for computer-assisted language learning: a case study of the VITAL project. Computer Assisted Language Learning, 31(3), 294–319. https://doi.org/10.1080/09588221.2017.1418382 [35] Gewerc-Barujel, A., Montero-Mesa, L., & Lama-Penín, M. (2013). Collaboration and Social Networking in Higher Education. Comunicar, 21(42), 55–63. https://doi.org/10. 3916/c42-2014-05 [36] Gong, L., Liu, Y., & Zhao, W. (2018). Using Learning Analytics to Promote Student En- gagement and Achievement in Blended Learning: An Empirical Study. In Proceedings of the 2nd International Conference on E-Education, E-Business and E-Technology. https://doi.org/10.1145/3241748.3241760 iJAI ‒ Vol. 2, No. 2, 2020 65 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … [37] Graham, C. R. (2006). Blended learning systems: Definition, current trends, and future di- rections. In B. Miller (Ed.), The handbook of blended learning (1st ed., Vol. LVII, pp. 3– 21). San Francisco, CA: Pfeiffer. [38] Graham, C.R. (2009) Blended Learning Models. In: Encyclopedia of Information Science and Technology. Hershey, PA: Idea Group Inc., 375-383. [39] Graham, C. R. (2013). Emerging practice and research in blended learning. In M. G. Moore (Ed.), Handbook of Distance Education (3rd ed., pp. 333–350). New York, New York, USA: Routledge. [40] Harrak, F., Bouchet, F., Luengo, V., & Gillois, P. (2018). Profiling students from their ques- tions in a blended learning environment. In Proceedings of the 8th International Conference on Learning Analytics and Knowledge (pp. 102–110). ACM. https://doi.org/10.1145/ 3170358.3170389 [41] Hecking, T., Ziebarth, S., & Hoppe, H. U. (2014). Analysis of Dynamic Resource Access Patterns in Online Courses. Journal of Learning Analytics, 1(3), 34–60. https://doi.org/10.18608/jla.2014.13.4 [42] Henrie, C. R., Bodily, R., Larsen, R., & Graham, C. R. (2018). Exploring the potential of LMS log data as a proxy measure of student engagement. Journal of Computing in Higher Education, 30(2), 344–362. https://doi.org/10.1007/s12528-017-9161-1 [43] Hernández-Nanclares, N., García-Muñiz, A. S., & Rienties, B. (2017). Making the most of “external” group members in blended and online environments. Interactive Learning Envi- ronments, 25(4), 467–481. https://doi.org/10.1080/10494820.2016.1140656 [44] Hill, T., Chidambaram, L., & Summers, J. D. (2017). Playing ‘catch up’ with blended learn- ing: performance impacts of augmenting classroom instruction with online learning. Behav- iour and Information Technology, 36(1), 54–62. https://doi.org/10.1080/0144929x. 2016.1189964 [45] Holstein, K., McLaren, B. M., & Aleven, V. (2017). SPACLE: investigating learning across virtual and physical spaces using spatial replays. Proceedings of the Seventh International Learning Analytics & Knowledge Conference on - LAK ’17, 358–367. https://doi.org/10.1145/3027385.3027450 [46] Hrastinski, S. (2019). What Do We Mean by Blended Learning? TechTrends, 63(5), 564– 569. https://doi.org/10.1007/s11528-019-00375-5 [47] Hui, Y. K., Mai, B., Qian, S., & Kwok, L. F. (2018). Cultivating better learning attitudes: a preliminary longitudinal study. Open Learning, 33(2), 155–170. https://doi.org/10. 1080/02680513.2018.1454830 [48] Johnson, L., Becker, S. A., Cummins, M., Estrada, V., Freeman, A., & Hall, C. (2016). NMC Horizon Report; 2016 Higher Education Edition. Retr. Feb. 13, 2019, from www.sconul.ac.uk/sites/default/files/documents/2016-nmc-horizon-report-he-EN-1.pdf [49] Jovanović, J., Gašević, D., Dawson, S., Pardo, A., & Mirriahi, N. (2017). Learning analytics to unveil learning strategies in a flipped classroom. Internet and Higher Education, 33, 74– 85. https://doi.org/10.1016/j.iheduc.2017.02.001 [50] Khalil, M., & Ebner, M. (2016). De-identification in learning analytics. Journal of Learning Analytics. https://doi.org/10.18608/jla.2016.31.8 [51] Kim, D., Park, Y., Yoon, M., & Jo, I. H. (2016). Toward evidence-based learning analytics: Using proxy variables to improve asynchronous online discussion environments. Internet and Higher Education, 30, 30–43. https://doi.org/10.1016/j.iheduc.2016.03.002 [52] Li, L. Y., & Tsai, C. C. (2017). Accessing online learning material: Quantitative behavior patterns and their effects on motivation and learning performance. Computers and Educa- tion, 114(300), 286–297. https://doi.org/10.1016/j.compedu.2017.07.007 66 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … [53] Lu, O. H. T., Huang, A. Y. Q., Lin, A. J. Q., Ogata, H., & Yang, S. J. H. (2018). Applying Learning Analytics for the Early Prediction of Students’ Academic Performance in Blended Learning. Educational Technology & Society (Vol. 21). [54] Lukarov, V., Verbert, K., & Schroeder, U. (2019). Scaling up learning analytics in blended learning scenarios (Doctoral dissertation, Universitätsbibliothek der RWTH Aachen). [55] Manzanares, M. C. S., Sánchez, R. M., García Osorio, C. I., & Díez-Pastor, J. F. (2017). How do B-learning and learning patterns influence learning outcomes? Frontiers in Psychol- ogy, 8(MAY), 1–13. https://doi.org/10.3389/fpsyg.2017.00745 [56] Marsh, J., & Drexler, P. (2001). How to design effective blended learning. Sunnyvale: Bran- don-Hall. Sunnyvale: California, USA. [57] Matcha, W., Ahmad Uzir, N., Gasevic, D., & Pardo, A. (2019). A Systematic Review of Empirical Studies on Learning Analytics Dashboards: A Self-Regulated Learning Perspec- tive. IEEE Transactions on Learning Technologies, 1382(c), 1–1. https://doi.org/10. 1109/tlt.2019.2916802 [58] McKenzie, W. A., Perini, E., Rohlf, V., Toukhsati, S., Conduit, R., & Sanson, G. (2013). A blended learning lecture delivery model for large and diverse undergraduate cohorts. Com- puters and Education, 64, 116–126. https://doi.org/10.1016/j.compedu.2013.01.009 [59] Melero, J., Hernández-Leo, D., Sun, J., Santos, P., & Blat, J. (2015). How was the activity? A visualization support for a case of location-based learning design. British Journal of Edu- cational Technology, 46(2), 317–329. https://doi.org/10.1111/bjet.12238 [60] Mirriahi, N., Liaqat, D., Dawson, S., & Gašević, D. (2016). Uncovering student learning profiles with a video annotation tool: reflective learning with and without instructional norms. Educational Technology Research and Development, 64(6), 1083–1106. https://doi.org/10.1007/s11423-016-9449-2 [61] Misiejuk, K., & Wasson, B. (2017). State of the Field report on Learning Analytics. SLATE Report 2017-2. [62] Molenaar, I., & Järvelä, S. (2014). Sequential and temporal characteristics of self and so- cially regulated learning. Metacognition and Learning, 9(2), 75–85. https://doi.org/10. 1007/s11409-014-9114-2 [63] Montgomery, A. P., Mousavi, A., Carbonaro, M., Hayward, D. V., & Dunn, W. (2019). Using learning analytics to explore self-regulated learning in flipped blended learning music teacher education. British Journal of Educational Technology, 50(1), 114–127. https://doi.org/10.1111/bjet.12590 [64] Musabirov, I., & Bakhitova, A. (2017). Trajectories of student interaction with learning re- sources in blended learning. In Proceedings of the 17th Koli Calling International Confer- ence on Computing Education Research (pp. 191–192). ACM. https://doi.org/10. 1145/3141880.3141907 [65] Mödritscher, F., Andergassen, M., & Neumann, G. (2013). Dependencies between E-Learn- ing Usage Patterns and Learning Results. In Proceedings of the 13th International Confer- ence on Knowledge Management and Knowledge Technologies (pp. 1–8). ACM. https://doi.org/10.1145/2494188.2494206 [66] Nguyen, Q., Tempelaar, D. T., Rienties, B., & Bas, G. (2016). What learning analytics-based prediction models tell us about feedback preferences of students. Q. Rev. Distance Educa- tion, 17(3), 13–33. [67] Nguyen, V. A., Nguyen, Q. B., & Nguyen, V. T. (2018). A Model to Forecast Learning Outcomes for Students in Blended Learning Courses Based on Learning Analytics, 35–41. https://doi.org/10.1145/3268808.3268827 [68] Nouri, J., Ebner, M., Ifenthaler, D., Saqr, M., Malmberg, J., Khalil, M., … Berthelsen, U. D. (2019). Efforts in Europe for Data-Driven Improvement of Education – A Review of iJAI ‒ Vol. 2, No. 2, 2020 67 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Learning Analytics Research in Seven Countries. International Journal of Learning Analyt- ics and Artificial Intelligence for Education (IJAI), 1(1), 8. https://doi.org/10.3991/ijai.v1i1. [69] Nouri, J., Saqr, M & Fors, U. (2019). Predicting performance of students in a flipped class- room using machine learning: towards automated data-driven formative feedback. Journal of Systemics, Cybernetics and Informatics. 17(2). [70] Pardo, A., Han, F., & Ellis, R. A. (2016). Exploring the relation between self-regulation, online activities, and academic performance, 422–429. doi:10.1145/2883851.2883883 [71] Pardo, A., Han, F., & Ellis, R. A. (2017). Combining University student self-regulated learn- ing indicators and engagement with online learning events to Predict Academic Perfor- mance. IEEE Transactions on Learning Technologies, 10(1), 82–92. https://doi.org/10. 1109/tlt.2016.2639508 [72] Pardo, A., Jovanovic, J., Dawson, S., Gašević, D., & Mirriahi, N. (2019). Using learning analytics to scale the provision of personalised feedback. British Journal of Educational Technology, 50(1), 128–138. https://doi.org/10.1111/bjet.12592 [73] Pardo, A., Mirriahi, N., Dawson, S., Zhao, Y., Zhao, A., & Gašević, D. (2015). Identifying learning strategies associated with active use of video annotation software, 255–259. https://doi.org/10.1145/2723576.2723611 [74] Pardo, A., & Siemens, G. (2014). Ethical and privacy principles for learning analytics. Brit- ish Journal of Educational Technology, 45(3), 438–450. https://doi.org/10.1111/bjet. [75] Park, Y., Yu, J. H., & Jo, I. H. (2016). Clustering blended learning courses by online behav- ior data case study in a Korean higher education institute. Internet and Higher Education, 29, 1–11. https://doi.org/10.1016/j.iheduc.2015.11.001 [76] Paskevicius, M., & Bortolin, K. (2016). Blending our practice: using online and face-to-face methods to sustain community among faculty in an extended length professional develop- ment program. Innovations in Education and Teaching International, 53(6), 605–615. https://doi.org/10.1080/14703297.2015.1095646 [77] Predić, B., Dimić, G., Rančić, D., Štrbac, P., Maček, N., & Spalević, P. (2018). Improving final grade prediction accuracy in blended learning environment using voting ensembles. Computer Applications in Engineering Education, 26(6), 2294–2306. https://doi.org/10. 1002/cae.22042 [78] Reeve, J. (2012). A Self-determination Theory Perspective on Student Engagement. In S. Christenson, A. Reschly, & C. Wylie (Eds.), Handbook of Research on Student Engagement (pp. 149–172). Boston, MA: Springer US. https://doi.org/10.1007/978-1-4614-2018-7_7 [79] Rubio, F., Thomas, J. M., & Li, Q. (2018). The role of teaching presence and student partic- ipation in Spanish blended courses. Computer Assisted Language Learning, 31(3), 226–250. https://doi.org/10.1080/09588221.2017.1372481 [80] Saqr, M., Fors, U., & Tedre, M. (2017). How learning analytics can early predict under- achieving students in a blended medical education course. Medical Teacher, 39(7), 757–767. https://doi.org/10.1080/0142159x.2017.1309376 [81] Saqr, M., Fors, U., & Tedre, M. (2018). How the study of online collaborative learning can guide teachers and predict students’ performance in a medical course. BMC Medical Edu- cation, 18(1), 1–14. https://doi.org/10.1186/s12909-018-1126-1 [82] Scholes, V. (2016). The ethics of using learning analytics to categorize students on risk. Educational Technology Research and Development, 64(5), 939–955. https://doi.org/10. 1007/s11423-016-9458-1 [83] Siemens, G. (2013). Learning analytics: The emergence of a discipline. American Behav- ioral Scientist, 57(10), 1380-1400. https://doi.org/10.1177/0002764213498851 68 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … [84] Slade, S., & Prinsloo, P. (2013). Learning Analytics. American Behavioral Scientist, 57(10), 1510–1529. https://doi.org/10.1177/0002764213479366 [85] Snodgrass Rangel, V., Bell, E. R., Monroy, C., & Whitaker, J. R. (2015). Toward a New Approach to the Evaluation of a Digital Curriculum Using Learning Analytics. Journal of Research on Technology in Education, 47(2), 89–104. https://doi.org/10.1080/ 15391523.2015.999639 [86] Tempelaar, D. (2017). How Dispositional Learning Analytics Helps Understanding the Worked-Example Principle. International Association for Development of the Information Society, (Celda), 14. [87] Tempelaar, D., Rienties, B., Mittelmeier, J., & Nguyen, Q. (2018a). Student profiling in a dispositional learning analytics application using formative assessment. Computers in Hu- man Behavior, 78, 408–420. https://doi.org/10.1016/j.chb.2017.08.010 [88] Tempelaar, D., Rienties, B., & Nguyen, Q. (2018b). A multi-modal study into students’ tim- ing and learning regulation: time is ticking. Interactive Technology and Smart Education, 15(4), 298–313. https://doi.org/10.1108/itse-02-2018-0015 [89] Tempelaar, D., Rienties, B., & Nguyen, Q. (2018c). Investigating learning strategies in a dispositional learning analytics context, 201–205. https://doi.org/10.1145/3170358. [90] Tempelaar, D., Rienties, B., & Giesbers, B. (2015). In search for the most informative data for feedback generation: Learning analytics in a data-rich context. Computers in Human Behavior, 47, 157–167. https://doi.org/10.1016/j.chb.2014.05.038 [91] Tempelaar, D., Rienties, B., Nguyen, Q., Macan, T., & Hoffmacan, T. (2017). Towards Ac- tionable Learning Analytics Using Dispositions, 79(1), 381–391. doi:10.1037/0021- 9010.79.3.381 [92] Tempelaar, D. (2020). Supporting the less-adaptive student: the role of learning analytics, formative assessment and blended learning. Assessment & Evaluation in Higher Education, 45(4), 579-593. https://doi.org/10.1080/02602938.2019.1677855 [93] van Goidsenhoven, S., Bogdanova, D., Deeva, G., Broucke, S. V., De Weerdt, J., & Snoeck, M. (2020, March). Predicting student success in a blended learning environment. In Pro- ceedings of the Tenth International Conference on Learning Analytics & Knowledge (pp. 17-25). https://doi.org/10.1145/3375462.3375494 [94] Van Laer, S., & Elen, J. (2018). Adults’ Self-Regulatory Behaviour Profiles in Blended Learning Environments and Their Implications for Design. Technology, Knowledge and Learning, 1–31. https://doi.org/10.1007/s10758-017-9351-y [95] Van Leeuwen, A. (2019). Teachers’ perceptions of the usability of learning analytics reports in a flipped university course: when and how does information become actionable knowledge? Educational Technology Research and Development, 67(5), 1043-1064. https://doi.org/10.1007/s11423-018-09639-y [96] Vieira, C., Parsons, P., & Byrd, V. (2018). Visual learning analytics of educational data: A systematic literature review and research agenda. Computers and Education, 122(March), 119–135. https://doi.org/10.1016/j.compedu.2018.03.018 [97] Villamañe, M., Álvarez, A., Larrañaga, M., Hernández-Rivas, O., & Caballero, J. (2018). Using Visualizations to Improve Assessment in Blended Learning Environments, 165–169. https://doi.org/10.1145/3284179.3284209 [98] Winne, P. H., & Hadwin, A. F. (1998). Studying as self-regulated learning. In Metacognition in educational theory and practice. [99] Winne, P. H., & Perry, N. E. (2000). Measuring self-regulated learning. In Handbook of self- regulation (pp. 531-566). Academic Press. https://doi.org/10.1016/b978-012109890- 2/50045-7 iJAI ‒ Vol. 2, No. 2, 2020 69 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … [100] Wise AF, Shaffer DW. 2015. Why theory matters more than ever in the age of big data. JLA. 2:5–13. [101] Whitelock-Wainwright, A., Tsai, Y. S., Lyons, K., Kaliff, S., Bryant, M., Ryan, K., & Gašević, D. (2020, March). Disciplinary differences in blended learning design: a network analytic study. In Proceedings of the Tenth International Conference on Learning Analytics & Knowledge (pp. 579-588). https://doi.org/10.1145/3375462.3375488 [102] Zacharis, N. Z. (2015). A multivariate approach to predicting student outcomes in web-ena- bled blended learning courses. Internet and Higher Education, 27, 44–53. https://doi.org/10.1016/j.iheduc.2015.05.002 9 Authors Nina Bergdahl is a researcher in didactics and digitalisation at the Department of Adult and Upper Secondary Education, Malmoe, Sweden and affiliated with the De- partment of Computer and Systems Sciences (DSV) Stockholm university, Sweden. Email: ninabe@dsv.su.se Jalal Nouri is an associate professor at the Department of Computer and Systems Sciences (DSV) Stockholm university, Sweden. Thashmee Karunaratne is an associate professor at the Department of Computer and Systems Sciences (DSV) Stockholm university, Sweden Muhammad Afzaal is a PhD student at the Department of Computer and Systems Sciences (DSV) Stockholm university, Sweden. Mohammad Saqr is a postdoc at KTH Royal Institute of Technology, School of Electrical Engineering and Computer Science, Stockholm, Sweden. Article submitted 2020-08-19. Resubmitted 2020-09-16. Final acceptance 2020-09-16. Final version pub- lished as submitted by the authors. 70 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … 10 Appendix A Obser- Dis- Inter- SN Sur- Class- Perfor- S Study LMS Trace MML vation/ cours views A vey room mance video e Agnihotri, L., Essa, A., & Baker, R. (2017). Impact of student choice of con- tent adoption delay on course outcomes. In Proceedings of the Seventh Interna- 1 Y Y N N N N N N N Y tional Learning Analytics & Knowledge Conference (pp. 16–20). ACM. https://doi.org/10.1145/3027385.302743 Akhtar, S., Warburton, S., & Xu, W. (2017). The use of an online learning and teaching system for monitoring computer aided design student participa- 2 tion and predicting student success. In- Y Y Y N Y N Y N Y Y ternational Journal of Technology and Design Education, 27(2), 251–270. https://doi.org/10.1007/s10798-015- 9346-8 Andergassen, M., & Mödritscher, F. (2014). Practice and repetition during 3 exam preparation in blended learning M Y N N N N N D N Y courses: Correlations with learning re- sults. Of Learning Analytics, 1, 48–74. Ayub, M., Toba, H., Yong, S., & Wi- janto, M. C. (2017). Modelling students’ 4 activities in programming subjects Y Y N N N N N N N Y through educational data mining. Global Journal of Engineering Education, 19(3) Baik, E. J., & Reynolds, R. B. (2013). Contribution of wiki trace and wiki re- source use variables towards quality of game design in a guided discovery- 5 based program of game design learning. Wiki Y N N N N N Y N Y Proceedings of the ASIST Annual Meet- ing, 50(1). https://doi.org/10.1002/meet.145050011 Barata, G., Gama, S., Jorge, J., & Gon- çalves, D. (2017). Studying student dif- ferentiation in gamified education: A 6 long-term study. Computers in Human Y Y N N N N N D N Y Behavior, 71, 550–585. https://doi.org/10.1016/j.chb.2016.08.04 Bos, N. (n.d.). Student differences in regulation strategies and their use of learning resources: implications for edu- 7 Y Y N N N N N D Y Y cational design. Learning Analytics and Knowledge (LAK’16), 27-29 April 2016, 344–353. Cheng, G., & Chau, J. (2016). Exploring the relationships between learning styles, online participation, learning achievement and course satisfaction: An 8 Y Y N N N N N D N Y empirical study of a blended learning course. British Journal of Educational Technology, 47(2), 257–278. https://doi.org/10.1111/bjet.12243 iJAI ‒ Vol. 2, No. 2, 2020 71 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Chen, J., & Foung, D. (2020). A Moti- vational Story in Hong Kong: Generat- ing Goals for Language Learners and Blended Learning Designers from a Mixed-Method Learning Analytics Ap- 9 Y Y N N N N N D N Y proach in English for Academic Pur- poses. In Technology and the Psychol- ogy of Second Language Learners and Users (pp. 491-516). Palgrave Macmil- lan, Cham. Chetlur, M., Tamhane, A., Reddy, V. K., Sengupta, B., Jain, M., Sukjunnimit, P., & Wagh, R. (2014). EduPaL: Ena- bling Blended Learning in Resource Constrained Environments. In Proceed- 10 M Y N N N N N D N Y ings of the Fifth ACM Symposium on Computing for Development (pp. 73– 82). ACM. https://doi.org/10.1145/2674377.267438 Cicchinelli, A., Veas, E., Pardo, A., Pammer-Schindler, V., Fessl, A., Bar- reiros, C., & Lindstädt, S. (2018). Find- ing traces of self-regulated learning in Cus- activity streams. In Proceedings of the 11 tom Y N N N N N Y N Y 8th International Conference on Learn- LMS ing Analytics and Knowledge (pp. 191– 200). ACM. https://doi.org/10.1145/3170358.317038 Conijn, R., Snijders, C., Kleingeld, A., & Matzat, U. (2017). Predicting student performance from LMS data. IEEE 12 Transactions on Learning Technologies, Y Y N N N N N N N Y 10(1), 17–29. https://doi.org/10.1109/TLT.2016.26163 Conijn, R., Van den Beemt, A., & Cuijpers, P. (2018). Predicting student performance in a blended MOOC. Jour- 13 M Y N N N N N N N Y nal of Computer Assisted Learning, 34(5), 615–628. https://doi.org/10.1111/jcal.12270 Dimić, G., Predić, B., Rančić, D., Pe- trović, V., Maček, N., & Spalević, P. (2018). Association analysis of moodle 14 e-tests in blended learning educational Y Y N N N N N N N Y environment. Computer Applications in Engineering Education, 26(3), 417–430. https://doi.org/10.1002/cae.21894 De-Marcos, L., Garciá-López, E., Garciá-Cabot, A., Medina-Merodio, J. A., Domínguez, A., Martínez-Herraíz, J. J., & Diez-Folledo, T. (2016). Social network analysis of a gamified e-learn- 15 ing course: Small-world phenomenon M N N N N N Y N N Y and network metrics as predictors of ac- ademic performance. Computers in Hu- man Behavior, 60, 312–321. https://doi.org/10.1016/j.chb.2016.02.05 Dobashi, K. (2016). Development and 16 Trial of Excel Macros for Time Series Y Y N N N N N N N N Cross Section Monitoring of Student 72 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Engagement: Analyzing Students’ Page Views of Course Materials. Procedia Computer Science, 96, 1086–1095. https://doi.org/10.1016/j.procs.2016.08. Gamulin, J., Gamulin, O., & Kermek, D. (2016). Using Fourier coefficients in time series analysis for student perfor- 17 mance prediction in blended learning Y Y N N N N N N N Y environments. Expert Systems, 33(2), 189–200. https://doi.org/10.1111/exsy.12142 Gašević, D., Dawson, S., Rogers, T., & Gasevic, D. (2016). Learning analytics should not promote one size fits all: The effects of instructional conditions in pre- 18 Y Y N N N N N N N Y dicting academic success. Internet and Higher Education, 28, 68–84. https://doi.org/10.1016/j.iheduc.2015.10 Gašević, D., Jovanović, J., Pardo, A., & Dawson, S. (2017). Detecting Learning Strategies with Analytics: Links with 19 Self-Reported Measures and Academic Y Y N N N N N D N Y Performance. Journal of Learning Ana- lytics, 4(2), 113–128. https://doi.org/10.18608/jla.2017.42.10 Gelan, A., Fastré, G., Verjans, M., Mar- tin, N., Janssenswillen, G., Creemers, M., … Thomas, M. (2018). Affordances and limitations of learning analytics for computer-assisted language learning: a 20 Y Y N N N N N N N Y case study of the VITAL project. Com- puter Assisted Language Learning, 31(3), 294–319. https://doi.org/10.1080/09588221.2017. Gewerc-Barujel, A., Montero-Mesa, L., & Lama-Penín, M. (2013). Collabora- 21 tion and Social Networking in Higher Y Y N N N N N N N N Education. Comunicar, 21(42), 55–63. https://doi.org/10.3916/c42-2014-05 Gong, L., Liu, Y., & Zhao, W. (2018). Using Learning Analytics to Promote Student Engagement and Achievement in Blended Learning: An Empirical 22 Study. In Proceedings of the 2nd Inter- Y N N N N N N D N Y national Conference on E-Education, E- Business and E-Technology. https://doi.org/10.1145/3241748.324176 Harrak, F., Bouchet, F., Luengo, V., & Gillois, P. (2018). Profiling students from their questions in a blended learn- ing environment. In Proceedings of the 23 8th International Conference on Learn- Y N N N N N N N Y N ing Analytics and Knowledge (pp. 102– 110). ACM. https://doi.org/10.1145/3170358.317038 Hecking, T., Ziebarth, S., & Hoppe, H. U. (2014). Analysis of Dynamic Re- 24 Y Y N N N N Y N N N source Access Patterns in Online Courses. Journal of Learning Analytics, iJAI ‒ Vol. 2, No. 2, 2020 73 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … 1(3), 34–60. https://doi.org/10.18608/jla.2014.13.4 Hernández-Nanclares, N., García-Mu- ñiz, A. S., & Rienties, B. (2017). Mak- ing the most of “external” group mem- bers in blended and online environ- 25 Y N N N N N Y N N N ments. Interactive Learning Environ- ments, 25(4), 467–481. https://doi.org/10.1080/10494820.2016. Hill, T., Chidambaram, L., & Summers, J. D. (2017). Playing ‘catch up’ with blended learning: performance impacts 26 Y Y N N N N N N N Y of augmenting...: Discovery Service for De La Salle Univ. Behaviour & Infor- mation Technology, 36(1), 54–62. Holstein, K., McLaren, B. M., & Ale- ven, V. (2017). SPACLE: investigating learning across virtual and physical spaces using spatial replays. Proceed- 27 ings of the Seventh International Learn- Y Y Y Y N N N D Y Y ing Analytics & Knowledge Conference on - LAK ’17, 358–367. https://doi.org/10.1145/3027385.302745 Hui, Y. K., Mai, B., Qian, S., & Kwok, L. F. (2018). Cultivating better learning attitudes: a preliminary longitudinal 28 Y Y N N Y N Y N N Y study. Open Learning, 33(2), 155–170. https://doi.org/10.1080/02680513.2018. Jovanović, J., Gašević, D., Dawson, S., Pardo, A., & Mirriahi, N. (2017). Learn- ing analytics to unveil learning strate- 29 gies in a flipped classroom. Internet and Y Y N N N N N N N Y Higher Education, 33, 74–85. https://doi.org/10.1016/j.iheduc.2017.02 Kim, D., Park, Y., Yoon, M., & Jo, I. H. (2016). Toward evidence-based learning analytics: Using proxy variables to im- prove asynchronous online discussion 30 Y Y N N N Y N N Y environments. Internet and Higher Edu- cation, 30, 30–43. https://doi.org/10.1016/j.iheduc.2016.03 Kovanović, V., Gašević, D., Dawson, S., Joksimović, S., Baker, R. S., & Hatala, M. (2015). Does time-on-task estimation matter? Implications for the 31 Y Y N N N N N N N Y validity of learning analytics findings. Journal of Learning Analytics, 2(3), 81– https://doi.org/10.18608/jla.2015.23.6 Li, L. Y., & Tsai, C. C. (2017). Access- ing online learning material: Quantita- tive behavior patterns and their effects on motivation and learning performance. 32 Y Y N N N N N D N Y Computers and Education, 114(300), 286–297. https://doi.org/10.1016/j.compedu.2017. 07.007 Lu, O. H. T., Huang, A. Y. Q., Huang, J. 33 Y Y N N N N N N N Y C. H., Lin, A. J. Q., Ogata, H., & Yang, 74 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … S. J. H. (2018). Applying learning ana- lytics for the early prediction of stu- dents’ academic performance in blended learning. Educational Technology and Society, 21(2), 220–232. Lukarov, V., Verbert, K., & Schroeder, U. (2019). Scaling up learning analytics 34 in blended learning scenarios (Doctoral Y Y N N N N N N N Y dissertation, Universitätsbibliothek der RWTH Aachen). Manzanares, M. C. S., Sánchez, R. M., García Osorio, C. I., & Díez-Pastor, J. F. (2017). How do B-learning and learning patterns influence learning outcomes? 35 Y Y N N N N N D N Y Frontiers in Psychology, 8(MAY), 1– https://doi.org/10.3389/fpsyg.2017.0074 McKenzie, W. A., Perini, E., Rohlf, V., Toukhsati, S., Conduit, R., & Sanson, G. (2013). A blended learning lecture de- livery model for large and diverse un- 36 M Y N N N N N D N Y dergraduate cohorts. Computers and Ed- ucation, 64, 116–126. https://doi.org/10.1016/j.compedu.2013. 01.009 Melero, J., Hernández-Leo, D., Sun, J., Santos, P., & Blat, J. (2015). How was the activity? A visualization support for Gam 37 a case of location-based learning design. Y N N Y N N Y N Y British Journal of Educational Technol- ogy, 46(2), 317–329. https://doi.org/10.1111/bjet.12238 Mirriahi, N., Liaqat, D., Dawson, S., & Gašević, D. (2016). Uncovering student learning profiles with a video annotation Vide tool: reflective learning with and with- 38 out instructional norms. Educational Y N N N N N N N Y soft- Technology Research and Development, ware 64(6), 1083–1106. https://doi.org/10.1007/s11423-016- 9449-2 Mödritscher, F., Andergassen, M., & Neumann, G. (2013). Dependencies be- tween E-Learning Usage Patterns and Learning Results. In Proceedings of the 13th International Conference on 39 M N N N N N N N N Y Knowledge Management and Knowledge Technologies (pp. 1–8). ACM. https://doi.org/10.1145/2494188.249420 Montgomery, A. P., Mousavi, A., Car- bonaro, M., Hayward, D. V., & Dunn, W. (2019). Using learning analytics to explore self-regulated learning in 40 M Y N N N N N N N Y flipped blended learning music teacher education. British Journal of Educa- tional Technology, 50(1), 114–127. https://doi.org/10.1111/bjet.12590 Musabirov, I., & Bakhitova, A. (2017). Trajectories of student interaction with 41 Y Y N N N N N N N Y learning resources in blended learning. In Proceedings of the 17th Koli Calling iJAI ‒ Vol. 2, No. 2, 2020 75 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … International Conference on Computing Education Research (pp. 191–192). ACM. https://doi.org/10.1145/3141880.314190 Nespereira, C. G., Dai, K., Redondo, R. P. D., & Vilas, A. F. (2014). Is the LMS access frequency a sign of students’ suc- cess in face-to-face higher education? In Proceedings of the Second International 42 Conference on Technological Ecosys- Y Y N N N N N N N Y tems for Enhancing Multiculturality - TEEM ’14 (s. 283–290). Salamanca, Spain: ACM Press. https://doi.org/10.1145/2669711.266991 Nguyen, Q., Rienties, B., Tempelaar, D. T., & Giesbers, B. (2016). What learn- ing analytics-based prediction models 43 Y Y N N N N N D N Y tells us about feedback preferences of students. Quarterly Review of Distance Education, 17(3), 13–33. Nguyen, V. A., Nguyen, Q. B., & Nguyen, V. T. (2018). A Model to Fore- cast Learning Outcomes for Students in Blended Learning Courses Based On Learning Analytics. In Proceedings of 44 the 2nd International Conference on E- Y Y N N N N N N N Y Society, E-Education and E-Technology - ICSET 2018 (s. 35–41). Taipei, Tai- wan: ACM Press. https://doi.org/10.1145/3268808.326882 Nouri, J., Saqr, M & Fors, U. (2019). Predicting performance of students in a flipped classroom using machine learn- 45 Y Y N N N N N D N Y ing: towards automated data-driven formative feedback. Journal of System- ics, Cybernetics and Informatics. 17(2). Pardo, A., Han, F., & Ellis, R. A. (2016). Exploring the relation between self-regulation, online activities, and ac- ademic performance: a case study. In Proceedings of the Sixth International 46 Y Y N N N N N D N Y Conference on Learning Analytics & Knowledge - LAK ’16 (s. 422–429). Ed- inburgh, United Kingdom: ACM Press. https://doi.org/10.1145/2883851.288388 Pardo, A., Han, F., & Ellis, R. A. (2017). Combining University Student Self-Regulated Learning Indicators and 47 Engagement with Online Learning Y N N N N N N D N Y Events to Predict Academic Perfor- mance. IEEE Transactions on Learning Technologies, 10(1), 82–92. Pardo, A., Jovanovic, J., Dawson, S., Gašević, D., & Mirriahi, N. (2019). Us- ing learning analytics to scale the provi- 48 Y Y N N N N N D N Y sion of personalised feedback. British Journal of Educational Technology, 50(1), 128–138. Pardo, A., Mirriahi, N., Dawson, S., Vide 49 Y N N N N N D N Y Zhao, Y., Zhao, A., & Gašević, D. o 76 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … (2015). Identifying learning strategies soft- associated with active use of video an- ware notation software. In Proceedings of the Fifth International Conference on Learning Analytics And Knowledge - LAK ’15 (s. 255–259). Poughkeepsie, New York: ACM Press. https://doi.org/10.1145/2723576.272361 Park, Y., Yu, J. H., & Jo, I.-H. (2016). Clustering blended learning courses by online behavior data case study in a Ko- 50 rean higher education institute. Internet Y Y N N N N N N N N and Higher Education, 29, 1–11. https://doi.org/10.1016/j.iheduc.2015.11 Paskevicius, M., & Bortolin, K. (2016). Blending our practice: using online and face-to-face methods to sustain commu- 51 nity among faculty in an extended M Y N N N Y Y D N N length professional development pro- gram. Innovations in Education & Teaching International, 53(6), 605–615. Predić, B. ( 1 ), Rančić, D. ( 1 ), Dimić, G. ( 2 ), Štrbac, P. ( 2 ), Maček, N. ( 2 ), & Spalević, P. ( 3 ). (2018). Improving final grade prediction accuracy in 52 Y Y N N N N N N N Y blended learning environment using vot- ing ensembles. Computer Applications in Engineering Education, 26(6), 2294– 2306. https://doi.org/10.1002/cae.22042 Snodgrass Rangel, V., Bell, E. R., Mon- roy, C., & Whitaker, J. R. (2015). To- ward a New Approach to the Evaluation of a Digital Curriculum Using Learning 53 Y Y N Y Y N N D Y Y Analytics. Journal of Research on Tech- nology in Education, 47(2), 89–104. https://doi.org/10.1080/15391523.2015. Cerezo, R., Esteban, M., Sánchez-San- tillán, M., & Núñez, J. C. (2017). Pro- crastinating behavior in computer-based 54 learning environments to predict perfor- Y Y N N N N N N N Y mance: A case study in Moodle. Fron- tiers in Psychology, 8, 1–11. doi:10.3389/fpsyg.2017.01403 Rubio, F., Thomas, J. M., & Li, Q. (2018). The role of teaching presence and student participation in Spanish 55 blended courses. Computer Assisted Y Y N Y N Y N N Y Y Language Learning, 31(3), 226–250. https://doi.org/10.1080/09588221.2017. Saqr, M., Fors, U., & Tedre, M. (2018). How the study of online collaborative learning can guide teachers and predict 56 students’ performance in a medical Y Y N N N N Y N N Y course. BMC Medical Education, 18(1). https://doi.org/10.1186/s12909-018- 1126-1 Saqr, Mohammed, Fors, U., & Tedre, 57 M. (2017). How learning analytics can Y Y N N N N N N N Y early predict under-achieving students in iJAI ‒ Vol. 2, No. 2, 2020 77 Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … a blended medical education course. Medical Teacher, 39(7), 757–767. Tempelaar, D. ( 1 ), Rienties, B. ( 2 ), & Nguyen, Q. ( 2 ). (2018). A multi-modal study into students’ timing and learning 58 regulation: time is ticking. Interactive Y Y N N N N N D N Y Technology and Smart Education, 15(4), 298–313. https://doi.org/10.1108/ITSE- 02-2018-0015 Tempelaar, D. T., Rienties, B., & Gies- bers, B. (2015). In search for the most informative data for feedback genera- 59 M Y N N N N N N N Y tion: Learning analytics in a data-rich context. Computers in Human Behavior, 47, 157–167. Tempelaar, D. T., Rienties, B., & Nguyen, Q. (2017). Towards Actionable 60 Learning Analytics Using Dispositions. M Y N N N N N D N Y IEEE Transactions on Learning Tech- nologies, 10(1), 6–16. Tempelaar, Dirk. (2017). How Disposi- tional Learning Analytics Helps Under- 61 standing the Worked-Example Principle. Y Y N N N N N D N Y International Association for Develop- ment of the Information Society, 14. Tempelaar, Dirk, Rienties, B., Mittel- meier, J., & Nguyen, Q. (2018). Student profiling in a dispositional learning ana- 62 M Y N N N N N D N Y lytics application using formative as- sessment. Computers in Human Behav- ior, 78, 408–420. Tempelaar, Dirk, Rienties, B., & Ngu- yen, Q. (2018). Investigating learning strategies in a dispositional learning ana- lytics context: the case of worked exam- ples. In Proceedings of the 8th Interna- 63 tional Conference on Learning Analytics M Y N N N N N D N Y and Knowledge - LAK ’18 (s. 201–205). Sydney, New South Wales, Australia: ACM Press. https://doi.org/10.1145/3170358.317038 Tempelaar, D. (2020). Supporting the less-adaptive student: the role of learn- ing analytics, formative assessment and 64 M Y N N N N N N N Y blended learning. Assessment & Evalua- tion in Higher Education, 45(4), 579- Van Goidsenhoven, S., Bogdanova, D., Deeva, G., Broucke, S. V., De Weerdt, J., & Snoeck, M. (2020, March). Pre- dicting student success in a blended 65 Y Y N N N N Y N N Y learning environment. In Proceedings of the Tenth International Conference on Learning Analytics & Knowledge (pp. 17-25). van Laer, S., & Elen, J. (2018). Adults’ Self-Regulatory Behaviour Profiles in Blended Learning Environments and 66 Their Implications for Design. Technol- M Y N N N N N D N Y ogy, Knowledge and Learning, 1–31. https://doi.org/10.1007/s10758-017- 9351-y 78 http://www.i-jai.org Paper—Learning Analytics for Blended Learning A Systematic Review of Theory, Methodology, … Van Leeuwen, A. (2019). Teachers’ per- ceptions of the usability of learning ana- lytics reports in a flipped university 67 course: when and how does information M N N N N N N N N Y become actionable knowledge? Educa- tional Technology Research and Devel- opment, 67(5), 1043-1064. Villamañe, M., Álvarez, A., Larrañaga, M., Hernández-Rivas, O., & Caballero, J. (2018). Using Visualizations to Im- prove Assessment in Blended Learning Environments. In Proceedings of the 68 Sixth International Conference on Tech- M Y N N Y N N N N Y nological Ecosystems for Enhancing Multiculturality - TEEM’18 (s. 165– 169). Salamanca, Spain: ACM Press. https://doi.org/10.1145/3284179.328420 Whitelock-Wainwright, A., Tsai, Y. S., Lyons, K., Kaliff, S., Bryant, M., Ryan, K., & Gašević, D. (2020, March). Disci- plinary differences in blended learning 69 M Y N N N N N D N Y design: a network analytic study. In Pro- ceedings of the Tenth International Con- ference on Learning Analytics & Knowledge (pp. 579-588) Zacharis, N. Z. (2015). A multivariate approach to predicting student outcomes in web-enabled blended learning 70 courses. Internet and Higher Education, Y Y N N N N N N N Y 27, 44–53. https://doi.org/10.1016/j.iheduc.2015.05 iJAI ‒ Vol. 2, No. 2, 2020 79

Journal

International Journal of Learning Analytics and Artificial Intelligence for Education (iJAI)Unpaywall

Published: Oct 29, 2020

There are no references for this article.