Student Attitudes toward Learning Analytics in Higher Education: “The Fitbit Version of the Learning World”

ORIGINAL RESEARCH published: 19 December 2016, doi: 10.3389/fpsyg.2016.01959

Lynne D. Roberts 1,2 *, Joel A. Howell 1, Kristen Seaman 1 and David C. Gibson 3

1 School of Psychology and Speech Pathology, Curtin University, Perth, WA, Australia; 2 Faculty of Health Sciences, Curtin University, Perth, WA, Australia; 3 Curtin Institute for Computation; UNESCO Chair of Data Science in Higher Education Learning & Teaching, Curtin University, Perth, WA, Australia

Edited by: Douglas Kauffman, Boston University School of Medicine, USA
Reviewed by: Feifei Li, Educational Testing Service, USA; Phil Newton, Swansea University School of Medicine, UK
*Correspondence: Lynne D. Roberts, lynne.roberts@curtin.edu.au
Specialty section: This article was submitted to Educational Psychology, a section of the journal Frontiers in Psychology
Received: 03 September 2016; Accepted: 30 November 2016; Published: 19 December 2016
Citation: Roberts LD, Howell JA, Seaman K and Gibson DC (2016) Student Attitudes toward Learning Analytics in Higher Education: "The Fitbit Version of the Learning World". Front. Psychol. 7:1959. doi: 10.3389/fpsyg.2016.01959

Increasingly, higher education institutions are exploring the potential of learning analytics to predict student retention, understand learning behaviors, and improve student learning through providing personalized feedback and support. The technical development of learning analytics has outpaced consideration of ethical issues surrounding their use. Of particular concern is the absence of the student voice in decision-making about learning analytics. We explored higher education students' knowledge, attitudes, and concerns about big data and learning analytics through four focus groups (N = 41). Thematic analysis of the focus group transcripts identified six key themes. The first theme, "Uninformed and Uncertain," represents students' lack of knowledge about learning analytics prior to the focus groups. Following the provision of information, viewing of videos, and discussion of learning analytics scenarios, three further themes, "Help or Hindrance to Learning," "More than a Number," and "Impeding Independence," represented students' perceptions of the likely impact of learning analytics on their learning. "Driving Inequality" and "Where Will it Stop?" represent ethical concerns raised by the students about the potential for inequity, bias and invasion of privacy, and the need for informed consent. A key tension to emerge was how "personal" vs. "collective" purposes or principles can intersect with "uniform" vs. "autonomous" activity. The findings highlight the need to engage students in the decision-making process about learning analytics.

Keywords: learning analytics, higher education, student attitudes, dashboards, big data

INTRODUCTION

Higher education institutions collect a wide range of electronic data ("big data") from students (Picciano, 2012; Daniel, 2015). "Big data" may include information on student demographics, enrolments, university learning management systems, surveys, library usage, student performance, and external data sets (de Freitas et al., 2015). The collection, analysis and reporting of big data on students to predict student retention, understand learning behaviors, and improve learning through providing personalized feedback and support is referred to as learning analytics (Siemens, 2013). Big data can be used for learning analytics purposes at a range of levels within the university, from university-wide models predicting retention (e.g., de Freitas et al., 2015 modeled retention based on 1272 measures of behavior), through to course level data providing feedback on learning on a particular subject to individual students (Arnold and Pistilli, 2012).
The majority of universities are investigating, or are already using, learning analytics, typically with a focus on predicting student retention (Arnold and Pistilli, 2012; Corrin and de Barba, 2014; de Freitas et al., 2015). The use of learning analytics for predictive purposes is projected to expand to university- and system-wide projects (Heath and Leinonen, 2016; Roberts et al., 2016). However, at the current time the application of big data to learning analytics for the purposes of learning instruction is less common (Dede et al., 2016), typically involving small-scale projects with a focus on understanding learning and teaching practices (Siemens et al., 2013; Colvin et al., 2015). The disproportionate focus on prediction over learning highlights the gap between the use of big data and learning analytics for prediction and its application to enhancing learning (Dede et al., 2016). As argued by Dede et al. (2016), the criterion for learning analytics should be the impact on student learning, with research required into how teachers and students could use learning analytic tools to increase learning. In order to develop tools to facilitate student learning, an important first step is to understand student attitudes toward, and concerns about, learning analytics.

In this article we first describe the current learning analytics landscape in relation to student involvement in learning analytics research and development. Next we outline the posited benefits and risks to students associated with learning analytics, before describing what is currently known about student attitudes toward learning analytics from the limited research that has been conducted. We then present our research on student attitudes toward learning analytics based on a series of focus groups with undergraduate and postgraduate students.

The rapid adoption and expansion of learning analytics in the higher education sector has occurred at a faster pace than the consideration of ethical issues surrounding their use (Slade and Prinsloo, 2013; Swenson, 2014). Within the Australian higher education context, the "relative silence" (Colvin et al., 2015) on ethical issues has been noted. Of particular concern is the absence of the student voice in decision-making about learning analytics. Involving students as collaborators in decisions about big data and learning analytics has been recommended as a general ethical principle (Slade and Prinsloo, 2013; Roberts et al., 2016) but is seldom realized. Neglecting student involvement in the decision making process may pose challenges to the acceptability of learning analytics systems (Beattie et al., 2014). Learning analytics systems may be seen as a risk to academic freedom where students perceive they no longer have the ability to autonomously negotiate their learning environment, instead feeling forced to use a system designed by undisclosed "others" (Beattie et al., 2014). Not valuing student input also serves to foster skewed power relationships within higher education settings and frames learning analytics as a means to achieving institutional aims rather than serving students' learning (Slade and Prinsloo, 2013; Beattie et al., 2014). Neglecting the student voice also undermines transparency, autonomy and informed consent (Slade and Prinsloo, 2013; Beattie et al., 2014; Prinsloo and Slade, 2014).

To satisfy ethical guidelines and create a respectful learning environment, student involvement in the decision making process is necessary (Slade and Prinsloo, 2013; Beattie et al., 2014; Prinsloo and Slade, 2014). Students should have an active voice in determining what data is collected about themselves, how it is used and stored, who will have access to the data and how student identities will be protected (Slade and Prinsloo, 2013). Despite this necessity, there are few studies that have extended beyond surface level collaborations (Liu et al., 2015), predominately focusing on examining student preferences for analytic features (Atif et al., 2015; Reimers and Neovesky, 2015; McPherson et al., 2016).

The increasingly competitive nature of higher education and the pressure to quickly fulfill government demands in creating nationally and globally competitive graduates may serve as an explanation for the rapid expansion of learning analytics without student involvement (Daniel, 2015). Furthermore, decreased government funding, increased tuition costs and declining admission rates combine to pressure universities to exceed their competitors and entice students with the provision of new and "best teaching methods," in this case learning analytics (Thornton, 2013).

To date, universities have predominately focused on the role of learning analytics in fulfilling institutional aims such as institutional performance assessment, financial planning, recruitment and admissions tracking, and student retention (Daniel, 2015; Hoel et al., 2015). Learning analytic data is used by universities to enact informed change to improve institutional efficacy and effectiveness (Drachsler and Greller, 2012; Greller and Drachsler, 2012; Daniel, 2015). Despite the institutional focus, a range of benefits of learning analytics for students have been posited (Siemens and Long, 2011; Greller and Drachsler, 2012; Pardo and Siemens, 2014). Learning analytics have the potential to provide students with insight into their own learning habits, with the self-evaluation of data considered critical in obtaining self-knowledge (Greller and Drachsler, 2012). Higher education learning analytic systems can facilitate informed decision-making by students, allowing them to alter their learning strategies accordingly (Slade and Prinsloo, 2013). Learning analytic systems are also proposed to improve the feasibility of effective early intervention strategies (Greller and Drachsler, 2012; Pardo and Siemens, 2014), with predictive analytics enabling timely and personalized interventions to support struggling students before negative outcomes such as failing occur (Slade and Prinsloo, 2013). Interventions may include specific recommendations for improvement (Siemens and Long, 2011) facilitated by the mapping of student activity and student profiles. Analytics could form the basis for directing resources relevant to students' learning goals and current knowledge of the topic (Siemens and Long, 2011). Such an approach provides personalized learning (Drachsler and Greller, 2012).
Despite these posited benefits, there are also risks for students associated with learning analytics. Perhaps the most important of these is that the prediction of at-risk students risks profiling students and creating self-fulfilling prophecies (Greller and Drachsler, 2012; Beattie et al., 2014; Willis and Pistilli, 2014). While there has always been the potential for teachers to profile students based on observable characteristics, learning analytics provides a wider range of student characteristics for profiling. Making judgments based on a limited set of parameters creates a context for profiling, and profiling can result in limiting students' potential and damaging self-efficacy (Greller and Drachsler, 2012). For example, data showing that students from a particular suburb struggle with comprehension skills could be used to facilitate appropriate support interventions or could result in stereotyping and discrimination based on student demographics (Greller and Drachsler, 2012). Further, while the results from predictive analytics can be used to "nudge" students toward learning activities that increase the probabilities of learning success (see, for example, Martinez, 2014), there is the potential for nudges to turn into "shoves" (increasing requirements) or "smacks" (restricting activities), decreasing student autonomy over their learning (Desouza and Smith, 2016). At-risk identification also positions the students as being "wrong" (Liu et al., 2015) and may create self-fulfilling prophecies where students withdraw (Willis and Pistilli, 2014). At-risk identification may also foster negative student constructions and prevent the identification of teaching and institutional deficiencies (Liu et al., 2015).

Learning analytics also poses risks to student privacy and sparks debate over issues such as data ownership (Greller and Drachsler, 2012). Questions posed include: what data is collected? Who has access? How will data be de-identified? And how long does the data remain accessible? (Slade and Prinsloo, 2013). Limited research has been conducted with students concerning privacy in learning analytics (Drachsler et al., 2015), but theorized risks are linked to profiling, stereotyping and poor acceptability of learning analytic systems (Greller and Drachsler, 2012). Students are also at risk of being involved in learning analytics without their consent, or upon providing uninformed consent. In one study, none of the nine students interviewed recalled providing consent for their university to use student-generated data from the learning management system (Fisher et al., 2014). It appears students may be overwhelmed with administrative paperwork when beginning university, transparency of university data usage is poor, or potentially both. Each creates a context where students may provide uninformed consent to participation in learning analytics.
As discussed, limited research has examined student perceptions of learning analytics. Research in this area to date has focused on student attitudes toward dashboards and alert systems (Corrin and de Barba, 2014; Atif et al., 2015; Reimers and Neovesky, 2015). Learning analytics are typically displayed to students through a dashboard. A dashboard provides a consolidated view of multiple sources of data used to deliver feedback, direct students toward resources and provide performance indicators (Corrin and de Barba, 2014). It is theorized that dashboards can be used by students to self-regulate learning based on feedback (Corrin and de Barba, 2014). Feedback enables students to monitor the progress of their learning goals and, if needed, adjust their strategies for achieving those goals (Butler and Winne, 1995). Dashboards provide students with timely, or depending on the system, real-time feedback (Pardo and Siemens, 2014), providing students with increased opportunities for feedback compared to traditional methods such as waiting for assignment feedback. Dashboards are used to create more opportunities to engage in self-regulated learning.

Research to date provides some support for the role of dashboards in promoting self-regulated learning and motivating students. In a longitudinal study, Arnold and Pistilli (2012) tracked three groups of first year university students using the Course Signals ("traffic light") dashboard via anonymous user feedback surveys and focus groups. The majority of students reported a positive experience (89%), increased motivation (74%), and a desire for the system to be expanded to other units (58%). However, student feedback also indicated a desire for more detailed feedback updated in real-time and communicated through other media such as emails or text (Arnold and Pistilli, 2012). Another study sought to examine the usefulness of their dashboard on students' self-reflection, awareness and sense making (Santos et al., 2013). Students reported the dashboard helped them assess how they were performing in the course and their position in the cohort; however, it did not aid with time management or direction toward needed resources. However, contrary to Arnold and Pistilli (2012), students' motivation did not increase (Santos et al., 2013). Two further studies have reported that dashboards improved students' self-assessment, self-efficacy, and satisfaction with the course, but did not affect grades (Kosba et al., 2005; Kerly et al., 2008). Differences in dashboard features and designs may account for differences in findings. In their review of 15 dashboards, Verbert et al. (2013) noted that only four dashboards have undergone evaluations linked to learning processes, highlighting the need for further longitudinal research in this area (Verbert et al., 2013; Gašević et al., 2015). A further body of research has focused on dashboard features (e.g., Reimers and Neovesky, 2015; McPherson et al., 2016), which is outside the scope of this article.

Dashboard systems can be complemented by early alert systems that provide information to teaching staff and students of potential difficulties faced by the student (Atif et al., 2015). Three studies have examined student attitudes toward early alert systems. Atif et al. (2015) surveyed 85 predominately first year university students, reporting that the majority (90%) wanted to be contacted immediately when their performance in a unit became unsatisfactory, an assignment was missed or their participation was low. Students preferred contact via email rather than face to face contact and wanted to be informed of where to seek help (Atif et al., 2015). Similarly, Reimers and Neovesky (2015) reported students supported the use of automated alerts in their survey of university students. Corrin and de Barba (2014) examined how students interpret and act upon early alerts/feedback delivered via dashboards. Survey and interview data indicated most students used the dashboard as a means to reflect on their performance, as a way to create new or amended study plans and as a source of motivation. Students also reported that they liked the ability to compare their performance with peers. However, at times this would obscure success goals; for example, those desiring a high distinction would be satisfied with a distinction if it was above the class average (Corrin and de Barba, 2014).

As described above, the limited research on student attitudes toward learning analytics has largely focused on student support for dashboards and early alert systems (Arnold and Pistilli, 2012). It is important to note the novelty of the field (de Freitas et al., 2015; Slade and Prinsloo, 2015), the reported recruitment difficulties and low response rates to surveys (Corrin and de Barba, 2014; Atif et al., 2015) and the focus on first year students (Arnold and Pistilli, 2012; Corrin and de Barba, 2014; Atif et al., 2015; Sclater, 2015b). Little is known about how attitudes may vary across years of higher education, or student attitudes toward potential ethical issues associated with the use of learning analytics.
Key ethical issues related to the use of big data and learning analytics are privacy, consent, and how data is used, stored, protected and acted upon (Alexander and Brown, 1998; Cumbley and Church, 2013; Rubel and Jones, 2016). Slade and Prinsloo (2015) hosted an online forum posting nine questions designed to elicit discussions related to these ethical issues. Fifty university student representatives engaged in the discussion. Generating the most posts was the issue of transparency: students indicated the university could make an increased effort to inform them of what data is collected, for what purpose, how it is used and who would have access to this. Students demonstrated a clear desire to be, and to remain, informed and expressed the need for governance with a strong ethics base. Students viewed their data as valuable and needing protection via mechanisms such as opt in/out options and informed choices. Students also expressed concern about learning analytics used alongside personal tutor support during a discussion about how to best support the student experience. Students were concerned tutor involvement could lead to (mis)labeling and bias that could impact negatively upon tutor-student relationships. These findings highlight the importance of involving students early in the decision making process about big data and learning analytics in order to develop "student-centric" approaches that meet students' learning needs (Kruse and Pongsajapan, 2012; Slade and Prinsloo, 2013; Gašević et al., 2015). Slade and Prinsloo (2015) acknowledged the views expressed in the forum cannot be taken as representative of all students, but the rich contextual data found highlights the need for further research in this area.

The current study builds on the limited previous research to explore students' knowledge, attitudes and concerns about big data and learning analytics. To address the previously noted limitation of research focusing on first year students, separate focus groups were conducted with first, second, and third year students, enabling an examination of similarities and differences across year groups. The results from this research can be used to inform the development and implementation of learning analytics programs in higher education, ensuring learning analytics are developed and delivered in a manner that is acceptable to students.

METHODS

Participants
To better understand student perceptions of learning analytics, four focus groups with current undergraduate and postgraduate students were conducted at a large metropolitan university in Australia. Across the focus groups there were 38 undergraduate students and 3 postgraduate students from Curtin University aged 18–47 (M = 23.63, SD = 6.88). The first focus group involved five female first year psychology students aged 18–24 (M = 21.2, SD = 3.03). The second focus group comprised 15 second year psychology students aged 18–47 (12 women and 3 men, M = 24.4, SD = 7.87). Participants in the third focus group were 14 third year psychology students aged 19–44 (10 women and 4 men, M age = 24.21, SD = 7.74). The final focus group involved seven students, from a range of disciplines and years across the university, aged 18–30 (3 women and 4 men, M age = 22.57, SD = 5.16). Participants for the first three focus groups were recruited through an undergraduate psychology research participant pool and received participation points. Participants for the final focus group were recruited via posters and flyers distributed around the university campus and snowballing. To recompense the time commitment required, focus group four participants were provided a cash payment of $25.00.

Materials and Procedure
The research was approved by the Curtin University Human Research Ethics Committee (RDHS-37-16/AR01). Data was collected through four audio-recorded focus groups conducted by the research team and transcribed verbatim. After providing written informed consent and a definition of learning analytics, participants were asked about their current knowledge of learning analytics prior to watching brief videos on learning analytics and student dashboards in higher education (Teaching with Technology, 2013; Sclater, 2015a). The videos were presented as examples of learning analytics systems. Students were also provided with information on the current state of learning analytics within their own university. Participants discussed reactions, perceived advantages, and concerns about learning analytics in response to the videos, information on dashboards, and a series of learning analytics scenarios that depicted dashboards and possible automated or teacher-generated learning analytics alerts. As participants discussed their reactions and perceptions about learning analytics, the facilitators (LR, JH, and KS) also used prompts such as, "what would that [concept] mean for you?" or "can you tell me a bit more about that [concept]?" to better understand student views without changing the potential meaning of the students' discussion. Focus groups lasted approximately one and a half hours. After each focus group, LR and JH discussed the key findings emerging.

Once transcribed, focus group data were imported to NVivo (Castleberry, 2014) and subject to a thematic analysis, according to the procedure outlined by Braun and Clarke (2006). Following data familiarization, data was sorted into starting nodes of attitudes, preferences, misconceptions, and concerns, with further child nodes (representing codes) generated using an inductive process during coding. These codes were then grouped to develop overarching themes. The initial coding and theme development was conducted by KS. Themes were further refined through revision of transcripts and team discussions (LR, JH, and KS). During these discussions, relationships between themes were identified and a series of thematic maps depicting these relationships were created to aid the discussion and finalization of themes.

There are two indicators of the adequacy of the sample and the themes developed. First, four focus groups comprising 41 participants were conducted. Previous research has suggested that 80% of all themes can be identified in two to three focus groups (Guest et al., 2016). Second, we systematically sampled first, second, and third year students respectively from one degree for the first three focus groups to ensure we could identify possible similarities and differences across cohorts, and followed this with a focus group comprising students from varying degrees and years. No new themes emerged from this final focus group, suggesting that we were approaching saturation.

Findings
Six key themes emerged from the analysis. The first theme, "Uninformed and Uncertain," represents students' views on learning analytics at the commencement of the focus group. The remaining themes emerged following the provision of information, viewing of videos, and discussion of learning analytics scenarios. Three of these themes, "Help or Hindrance to Learning," "More than a Number," and "Impeding Independence," relate to students' perceptions of the likely impact of learning analytics on their learning. The two remaining themes, "Driving Inequality" and "Where Will it Stop?," represent ethical concerns raised by the students. Each of the themes is expanded upon below.
Theme: Uninformed and Uncertain
The theme "Uninformed and Uncertain" reflects that most students were unaware or unsure of what big data and learning analytics were at the start of the focus groups. This was reflected in one student's comment that, "I hadn't heard of it until today." Not only did some students explicitly state they were unsure of what learning analytics was, even those students who offered ideas of what it might be relied on speculative language when offering responses, for example: "Well, like, it might show what services are needed, so if you have, like, a large population in certain areas; you could get, like, extra help in these areas." Learning analytics was seen as aligned with improvements in technology, "Well, that's the way the world is going. It's become—technology is making analytics so relevant."

Although students were uncertain about what learning analytics was, a few students tentatively suggested that there may be benefits for higher education students and institutions. For example, one student reported learning analytics might be useful "in designing how to teach certain units like, to suit everyone's learning styles," while another student reflected that "I think they [Universities] use it to improve the student experience." One view expressed was that learning analytics may be used to benefit the institution economically: "Or if you can fit, sort of, extra people in, that's more people paying for the class and that sort of thing." Other students thought that the higher education institution could use learning analytics as a way to determine where the institution should allocate resources. One student reported:

You can like, look at it and apply what sort of, facilities are more needed than others so instead of putting a ton of money into one area that normally we use, and you can't take away from them into an area that needs more funding.

As students speculated about learning analytics, two concerns about the use of learning analytics emerged. Students were concerned about who would have access to their information; "I think that the main concern would probably be privacy"; and that learning analytics could bias their treatment in a higher education institution, for example: "[if] a person who's marking your work gets your results—your blackboard, log in amounts, and stuff like that. And it's—and it's like, 'Oh this person doesn't do enough from blackboard.' ... that could affect their marking." Even when students were not certain about what learning analytics entailed, once students began discussing how learning analytics could be used, many were quick to consider the functional impact on their educational experience.

Once students were provided with more information about learning analytics, their ideas developed and they were able to articulate a range of views and concerns about learning analytics, reflected in the remaining themes. The concerns about privacy and bias evident in this theme are explored again later in the themes "Where Will it Stop?" and "Driving Inequality."

Theme: More than a Number
The theme "More than a Number" captures students' reflections on the potential for learning analytics to provide a more personalized experience. Students reported currently feeling relatively anonymous within their courses: "I already feel like, there so many students in every course, they're already like a faceless number to some extent." The presentation of individualized learning analytics was viewed by some as having the potential to acknowledge a student as a person rather than a number. Students perceived that if teaching staff are able to identify students who are doing well, this may aid in the establishment of personalized relationships.

...it kind of helps the tutor identify people that like, have the potential to do really well, they'll say, "Oh look, you're already doing really, really well on your own." You're not being shown any favouritism but—at this point I'm like, just gonna, yeah, can send you a message, and I'm like, "Oh, yeah. I think you can really get something out of this book."

This was not seen as restricted to only those who are doing well. The importance of establishing a relationship is also reflected when students considered those performing poorly:

I think any measure to personalize a learning experience especially in the big university like Curtin, it can only be a good thing, I think if I was in danger of failing a unit and I had a tutor send a personalized message going, "I see you're struggling, come and see me," that would be a good thing, I think.

Here the student suggests a personalized message would help create a relationship between themselves and the tutor, thus encouraging them to seek assistance.
Theme: Help or Hindrance to Learning?
This theme represents students' views on how learning analytics might impact on their learning. Students' positive attitudes toward learning analytics were underlined by an attitude that collecting more information could only be of benefit: "I think, big data, I think it increases the chance of accuracy." Students identified that learning analytics may help teaching staff identify students who have not performed well in previous or current units, and that they could use this information to offer more support: "Helpful to the lecturer to kind of go, 'Okay so there's a group of students that are doing really badly in these areas.'" It was also suggested that emails about poor performance include information on support services: "an automated email could be sent out to those people saying you've been identified in this zone, we're here to help you. These are the options available to help you, feel free to come and see us." Interestingly, the majority of students reported a preference for automated emails over emails from teaching staff, and this was seen as an equity issue: "you shouldn't be getting like a personalized message when other students aren't."

Students anticipated that the unit coordinators would see not only what resources were accessed, but how frequently these resources were accessed, allowing them to continually improve the content offered in their unit. However, students also noted the potential for learning analytics to collect, display and use information that did not accurately reflect their learning activities. As one student commented, "there's information how long you've been on Blackboard and how the books you got out. There—there's like a risk of the data not being accurate," which identified that a student's Blackboard login or borrowing of a book does not mean the student actually engaged with activities on Blackboard or read the book. Students also expressed concern about their performance being predicted based on past cohorts of students, "I think each student is different so I don't know if it's right to predict from past students."

Conversation focused on the personal gains each student may experience as a result of learning analytics. In particular, learning analytics was seen to have the potential to improve motivation: "It's kind of like, the fitbit version of the learning world that it's tracking your progress and rewarding you for, you know, for doing well, and telling you to keep it up." The ability for learning analytics to be used to target opportunities to students based on performance was largely supported: "There's a feeling of being awarded." Similarly, some students expressed that it could be useful to identify when they need to do more work, for example:

It's a good wakeup call. If you haven't been going to classes whatever and you're like, "That's fine, that's fine." And then you look at that and you see a correlation between, "Oh man, my grades have dropped down. I haven't been going, like and when you see it in paper, that's when you sort of like, 'Oh, okay, yeah.'"

First year students, in particular, viewed learning analytics as providing a directed learning experience, providing feedback on how they are going, where they needed to focus their efforts and referral to appropriate resources. This arguably reflects their transitioning phase from high school to university. However, a third year psychology student also noted the advantages of directed learning through learning analytics,

For example if you're back and you're struggling a bit and you have a meeting with the lecturer and they say, "Well we can actually see what you've accessed and perhaps we can explain it. It's because you missed tutes four, five, and six and lectures one, two and three that you've struggled. We could think that the way for you to improve is to attend your lectures and tutorials perhaps" or "This piece of vital information was presented in this tutorial and you didn't go."

Students appreciated the role of learning analytics to keep them informed. For example, when discussing the potential for alerts to be sent to students who are eligible to apply for scholarships, one student stated,

I think that's a really good thing. Because there's a lot of scholarships that are available that students aren't aware of like—unless you actually go and look for it. There's a lot that don't even get claimed just because people are unaware that they are eligible for them.

The potential for learning analytics to provide data enabling students to compare their academic performance with their peers was more contentious. Some students would value this opportunity: "...you might not have much of a clue on how you're going, so that would, I guess, demystify that area." This was seen as important for students who do not attend campus frequently: "... it can feel pretty isolated at times where you don't know what's going on or how everybody else is traveling." The ability to clarify where the individual sits within their cohort appeared to be valued toward the end of students' degrees:

Especially if you want to get into honors, you'd be like, "You know maybe I need to be putting more up into this unit."

However, not all students were in favor of receiving information that compared their performance to the performance of peers. As one student stated, "I don't think that peer aspect is necessary. I think it should be more directed at your performance and really it's like an individual assessment and not so a comparison between everybody doing this." This view was most widely held by the first year students, who related it back to their high school experience, "I think you get a little bit tired of ranking actually after Year 12. That was all anyone ever cared about—the ranking—just no, had enough of it." Concern was raised that this practice could be divisive; "it isolates like an upper tier of students, there's kind of like that competitive fire between the students rather than a sense of community"; and work against the current student culture that was accepting of diversity: "One of the main incentives for me coming to Curtin was it is more accepting, it has a wider demographic of students."

Students discussed how learning analytics information displayed on dashboards or sent to students through alerts may have unintended negative consequences. Receiving information that a student is not doing well in their studies may impact negatively on their emotions, student identity and future learning: "Probably like dejected...might give up and drop the whole course or unit" and "like maybe I'm not fit for the course or something." Even where support or additional resources are suggested the result may be negative: "For someone who is not doing well, and to get told about things, ... can be too overwhelming." The likelihood of a negative reaction was seen as more likely for students who were working hard in their studies:

For someone who struggles with concepts and is putting a lot effort and yet still not making the grade. It's—it could probably quite disheartening and in turn make it a lot harder for them to have the commitment to try even harder to reach that grade.

Students also raised the possibility that learning analytics could pressure students who are not suited to a particular degree or studying to remain with the university:

I can see there is a potential for universities to use this just to keep students on as long as possible while accruing [funding] and having tax payers pay the university degree, when maybe they just might not be suited to university.

Potential negative consequences were also suggested in circumstances where students were performing well, with suggestions that motivation, studying behavior, and performance might suffer. Some students suggested this might take the form of reduced effort: "you might slack off a little bit." Other students reported they would feel pressured to perform, particularly when the information came directly from the unit coordinator (rather than an automated message) or contained suggestions for further work, and one student commented they might feel a "Bit pressured maybe, to keep up to that standard." Students also noted that if multiple messages were received "The lecturer's continuously watching you is pressuring."

It is not only students who are doing extremely well or poorly that might be affected by learning analytics. Some students predicted they would experience pressure from the continual display of grades and participation in dashboards, "I think I'll be really stressed, like reflecting my attendance and participation and every single score." Personality was also suggested as a factor that may influence how students react to learning analytics information:

I would imagine an anxious person receiving a bad signal on that, like, I know someone that I study with now, and she's just a stress head –... She'd flip out, she wouldn't sleep for days.

However, it should be noted that not all students expected to have any emotional or behavioral reaction that differed from the current situation when students find out how they are doing in comparison to other students: "I don't think I'd be—I'd feel any different to what I would feel now when people talk about their marks."
Theme: Impeding Independence
The theme "Impeding Independence" represented a tension students expressed that while they appreciate the additional supports that learning analytics could offer, the students valued being in charge of their own education. Several comments reflected this view, such as, "I can handle my own education" and "Education needs to be—going on your own merit yourself." The desire to have control over one's education was fostered by the differentiation between secondary and tertiary learning expectations:

I think that's fair in primary and secondary education but when you go to a tertiary institution you presume that because you wanna learn... you shouldn't have people say, "Oh you need to this, you need to do that," like, you should be—we're adults, you should be held accountable for your actions.

Students appeared to be concerned learning analytics would diminish the expectation to be self-reliant and create an environment where students are no longer treated as adults:

We're not here to be babied, we're all like you've got to be self-motivated. There's got to be an element of initiative when you are at university. You can't expect somebody to hold your hand all the way through it.

Students reported they were aware of what was expected and it was their own responsibility to manage the work; "I don't feel like you need to constantly be told about it. You need to watch this lecture, you need to attend this tutorial. It's common sense, we're adults essentially"; and seek further assistance if required: "I know what the reading is ... if I wanna do further reading, I will pursue that myself or I will ask my lecturer for what information that I could read." Each quote demonstrates students' reluctance to be "babied" and the desire to self-direct their learning; something they fear will be removed if learning analytics becomes a way to "micromanage" students.

Students further noted that much of the learning analytics information, such as grades and comparison to peers, was already available to them through other means, making learning analytics redundant. As one student commented when looking at an example dashboard:

We can see that anyway with the line information that they give you on the bell chart [when marks are released], so you can see if you're in the top ten percent, and it's no different to what information is already out there.

Students were wary that the over-dependence on learning analytic systems at university could then become a problem when students enter the workforce where similar systems may not exist:

... in all likelihood if you have a professional job, you're not going to be having someone hovering over your desk telling you about, you know looking at your every keystroke seeing whether or not you're doing any good, and sort of every month pulling you aside and telling you specifically what exactly you... you know you do have to sort of gain a level of self-awareness and responsibility to sort of tell for yourself how you are doing.

Theme: Driving Inequality
The theme "Driving Inequality" stemmed from students noting that learning analytics may result in only some students being advantaged. Students considered potential ethical implications learning analytics may pose. Specifically, students raised concerns about equity and bias.
Equity
Students highlighted an underlying tension regarding the use of learning analytics. Although students identified that they would appreciate personalized or automated messages indicating they are performing well in comparison to their cohort or providing information on additional resources, this was seen as inequitable: "you shouldn't be getting like a personalized message when other students aren't." Students were also concerned extra guidance from coordinators would unfairly impact on student grades, "if he gets that email and that influences his overall grade, did everyone else get that email?" Students described how they would feel annoyed; "I would be complaining"; or discouraged if they found out others had received an email and they had not: "Could be a self-fulling prophecy. ...—Oh, I didn't get the extra readings—oh, I think I'm dumb, I must be dumb."

Bias
The greatest concern raised by students was the potential for staff to form preconceived judgments of students and biased opinions based on learning analytics. Students were particularly concerned these biases would affect how staff interacted with them and their chances of future studies: "If they start a class knowing that someone is likely to fail, they might not just bother putting as much effort into that because they got such a track record of having low grades" and "there could be preconceived judgment about my abilities to be able to complete or do something, which may inadvertently make me singled out from being available to do something." Concern extended to students who have performed well monopolizing teaching staff's attention: "if a teacher can see your grades they might just pay attention to the one who's getting high grades and not everyone else." Students clearly disliked staff being able to link their identity with their grades and online activity in fear of being treated differently or affecting future study opportunities.

Theme: Where Will It Stop?
The theme "Where Will it Stop" reflects students' concerns that learning analytics may represent an invasion of privacy, and the perceived importance of obtaining informed consent from students for the use of their data.

Invasion of privacy
A prominent sub-theme resonating throughout the focus groups was the potential for learning analytics to compromise students' privacy: "I kind of feel just it's a bit. It's a bit too much. Like, it's a bit—it's very personal, it's like it's—you're—yeah encroaching on personal space." The invasion of privacy sub-theme was particularly evident when discussing the potential range of data that could be included in data-analytics in the future: "like if I'm in my personal time, I don't really want that to be recorded." Students also expressed a level of discomfort with learning analytics, "I'll be like, a little bit, sort of weirded out, because that I know, like, everything is being watched like, calculated I guess." It is clear students are wary and apprehensive about how much data is collected from them and who may have access to this.

Students considered receiving alerts about specific learning activities not completed as unnecessary: "It just seems a bit invasive." Students linked the reminder emails with unnecessary paternalism, "it's just like when your parents all hover over you to do every single homework."

Informed consent
Students highlighted the need for informed consent for the use of their data for learning analytics: "you'd have to explain to every single student exactly what Learning Analytics is, what you're doing with all of their data otherwise they can't get properly informed consent." They noted the difficulties in assuming informed consent from documents used for other purposes such as admission:

I signed-up for uni[versity] four years ago, I signed a document, four years, I don't know anything on that document. So I imagine even if I'm fully informed, the day you actually signed up for—I imagine a week later you've probably completely forgotten what's in that.

Students discussed the importance of providing opt-in or opt-out consent options for learning analytics, expressing their desire to make independent decisions concerning their involvement in learning analytics. They recognized that while some students would be keen to obtain comparative data from the whole cohort; "the people that opt in are obviously wanting to know how they're progressing"; others may not share this interest: "ignorance is bliss, I—just take me out of the equation, like, I don't want to know anything about it."

DISCUSSION

The aim of this research was to explore students' knowledge, attitudes and concerns about big data and learning analytics. We found the majority of students engaged in the focus groups had little, if any, knowledge of learning analytics (theme "Uninformed and Uncertain"). The lack of knowledge extended to the types of data collected by the university and its use, supporting previous findings that students are unaware of having consented to the use of their data for learning analytic purposes (Fisher et al., 2014). This finding is not unexpected given the provision of learning analytics feedback to students is in its infancy at this university. It does however point to the absence of the student voice in the development of learning analytics, a recommended ethical principle (Slade and Prinsloo, 2013; Roberts et al., 2016) that is seldom realized and a potential threat to the acceptability of learning analytics systems (Beattie et al., 2014). The absence of student involvement is perhaps not surprising given involving students as co-creators of teaching approaches, course design, and curricula is a recommended, but infrequently implemented, practice in the higher education sector generally (Bovill et al., 2011).
When students were provided with further information and time for reflection, their attitudes toward learning analytics seem to fall into "personal" vs. "collective" purposes or principles, which intersect and cross between "uniform" vs. "autonomous" activity. The intersections give rise to some troublesome areas where conflicting purposes and audiences arise. For example, in the theme "Help or Hindrance to Learning" students acknowledge that they might want to know how they compare to others and how they are progressing (reflecting previous findings that most students are interested in receiving this information; Corrin and de Barba, 2014; Atif et al., 2015; Reimers and Neovesky, 2015), while other students might not want to know. So on the principle of personal autonomy, every student should be able to choose whether to see this information or receive messages about their relative performance. Yet, as highlighted in the theme "Driving Inequality," out of fairness, the students also want all students to be treated equitably with messages and resources, not just a selected subgroup. So the principle of personal autonomous activity needed for independence conflicts with a collective uniform activity needed for equity.

Students supporting anonymous automated emails that are triggered and unseen by the instructors illustrates an equity goal that is in accord with personal concerns. In this case, bias cannot build up in instructors; everyone hears the same messages and gets the same access to resources as everyone else. However, as indicated above, if via autonomy, some students turn off those messages or choose not to participate, inequity may follow as some subgroups get more information and resources than others. Perhaps this form of inequity is more tolerable because it has arisen due to the choices of the students rather than to structural inequities of an impersonal uniform system.

If autonomy is supported through personal choices of the student, then some inequities are likely to be formed with only some subgroups getting certain messages and resources. This raises the question of whether there is a benchmark for the line between equity and inequity that is tolerable by the uniform system in order to not impede independence. For example, if all students have the right to not participate in seeing their information or messages and their choice leads to missing out on messages, resources and help and they become disadvantaged due to their own actions, is that a tolerable inequity?

Seeing and acting on information places all actors (e.g., instructors, unit coordinators, administrators as well as students) in this same intersecting network of personal vs. collective purposes and uniform vs. autonomous action. For example, as represented in the theme "Help or Hindrance to Learning," students see the benefit of giving instructors anonymous group information that would help them teach better to all groups (e.g., students who are struggling as well as those who are high achieving). But while some students welcome the opportunity for this to enable personalized relationships (theme "More than a Number"), others do not want the instructor to know who specifically is in those groups for fear of bias and preconceptions (theme "Driving Inequality").

The tension resulting from the intersection of students' preferences for personal vs. collective purposes with uniform vs. autonomous activity highlights the difficulty in developing uniform policies concerning the uniform application of rules and processes that can also allow for autonomous and personalized decision-making and action by each individual student. Students held concerns about invasion of privacy (theme "Where Will it Stop"), echoing "creepy" concerns held more widely about big data (Cumbley and Church, 2013). Further, some students rejected the need for learning analytics, viewing it as a retrospective step away from independence (theme "Impeding Independence"). This echoes Beattie et al.'s (2014) concern that learning analytics can pose a risk to students autonomously navigating their learning.

LIMITATIONS AND FUTURE DIRECTIONS FOR RESEARCH

The findings from our research should be interpreted within the context of its limitations. First, this research was conducted primarily with undergraduate students in the health sciences. It is possible that these students may have less knowledge and understanding of learning analytics than students in other disciplines such as information technology, and may be more concerned with issues of equity and fair representation across students. Disciplinary differences in the type and frequency of assessments may also influence how students respond to learning analytics. Research across disciplines is required to understand disciplinary differences in student attitudes toward learning analytics.

Second, students were shown videos on two learning analytics systems (JISC and Purdue), and while it was noted that these were examples, it may have been difficult for students to conceptualize other learning analytic approaches. Using these two learning analytic examples may have biased student discussion toward these particular systems rather than to learning analytics in general. It would be of interest to explore if student perceptions of learning analytics differ if students are presented with other learning analytic approaches. Illuminating the similarities or differences between findings when students have different learning analytic examples may also provide universities with a clearer understanding of what students view as beneficial or potentially harmful.

Third, the focus of the current research has been on student attitudes to learning analytics, predicated on the relative absence of the student voice in decision-making about learning analytics. The other "voice" largely absent from learning analytics decisions in universities is that of the academics who teach. Along with students, academics are an intended "end-user" of learning analytics, and further research is warranted into attitudes to learning analytics held by academics with teaching responsibilities.

APPLICATION OF FINDINGS

Whilst there are no easy options in developing policies and systems that address the intersecting and conflicting attitudes held by students, the starting point needs to be engaging students in the decision making process. We echo previous calls for student engagement in decision making to ensure the acceptability of the learning analytics systems developed (Slade and Prinsloo, 2013; Beattie et al., 2014; Prinsloo and Slade, 2014). This may take the form of representation from student guilds or related organizations that represent the wider student body. The findings from this research also highlight the need to inform students about big data and learning analytics activities that are planned or taking place within the university. Related to this is the need for each university to develop policy and procedures for obtaining student consent for the collection and use of their data. Ideally, this will occur as part of developing a university-wide code of practice/ethics for learning analytics, such as that developed by Jisc (2015).
While students expressed an appreciation that learning analytics could provide more personalized learning experiences, they held reservations about the functional impact of learning analytics on FUNDING their education and sought the ability to make autonomous and personalized decisions about their learning. Further, they were This project was funded by Curtin University Teaching concerned about the potential inequities resulting from learning Excellence Development Fund. REFERENCES Drachsler, H., Hoel, T., Scheffel, M., Kismihók, G., Berg, A., Ferguson, R., et al. (2015). “Ethical and privacy issues in the application of learning analytics,” Alexander, P., and Brown, S. (1998). “Attitudes toward information privacy: in Proceedings of the Fifth International Conference on Learning Analytics and differences among and between faculty and students,” in AMCIS Proceedings, Knowledge (Poughkeepsie, NY). 17 (Baltimore, MA). Fisher, J., Valenzuela, F.-R., and Whale, S. (2014). Learning Analytics: A Bottom- Arnold, K. E., and Pistilli, M. D. (2012). “Course signals at Purdue: using learning Up Approach to Enhancing and Evaluating Students’ Online Learning. Available analytics to increase student success,” in Proceedings of the 2nd International online at: http://www.olt.gov.au/project-learning-analytics-bottom-approach- Conference on Learning Analytics and Knowledge (Vancouver, BC). enhancing-and-evaluating-studentsapos-online-learning-201 Atif, A., Bilgin, A., and Richards, D. (2015). Student Preferences and Attitudes to Gaševic´, D., Dawson, S., and Siemens, G. (2015). Let’s not forget: learning the use of Early Alerts. Puerto Rico: Paper presented at Twenty-first Americas analytics are about learning. Techtrends 59, 64–71. doi: 10.1007/s11528-014- Conference on Information Systems. 0822-x Beattie, S., Woodley, C., and Souter, K. (2014). “Creepy analytics and learner data Greller, W., and Drachsler, H. (2012). Translating learning into numbers: a generic rights,” in Rhetoric and Reality: Critical Perspectives on Educational Techology- framework for learning analytics. Educ. Technol. Soc. 15, 42–57. Available Conference Proceedings (Dunedin). online at: http://www.jstor.org/stable/jeductechsoci.15.3.42 Bovill, C., Cook-Sather, A., and Felten, P. (2011). Students as co-creators of Guest, G., Namey, E., and McKenna, K. (2016). How many focus groups are teaching approaches, course design, and curricula: implications for academic enough? Building an evidence base for nonprobability sample sizes. Field developers. Int. J. Acad. Dev. 16, 133–145. doi: 10.1080/1360144X.2011.568690 Methods. doi: 10.1177/1525822X16639015. [Epub ahead of print]. Braun, V., and Clarke, V. (2006). Using thematic analysis in psychology. Qual. Res. Heath, J., and Leinonen, E. (2016). “An institution wide approach to learning Psychol. 3, 77–101. doi: 10.1191/1478088706qp063oa analytics,” in Developing Effective Educational Experiences through Learning Butler, D. L., and Winne, P. H. (1995). Feedback and self-regulated Analytics, ed M. Anderson (Hershey, PA: IGI Global), 73–87. learning: a theoretical synthesis. Rev. Educ. Res. 65, 245–281. Hoel, T., Mason, J., and Chen, W. (2015). “Data sharing for learning analytics– doi: 10.3102/00346543065003245 Questioning the risks and benefits,” in Proceedings of the 23rd International Castleberry, A. (2014). NVivo qualitative data analysis Software; QSR International Conference on Computers in Education. China: Asia-Pacific Society for Pty Ltd. Version 10, 2012. Am. J. Pharm. Educ. 78. 
doi: 10.5688/ajpe78125 Computers in Education (Hangzhou). Colvin, C., Rogers, T., Wade, A., Dawson, S., Gaševic´, D., Buckingham Shum, S., Jisc (2015). Code of Practice for Learning Analytics. Available online at: et al. (2015). Student Retention and Learning Analytics: A Snapshot of Australian https://www.jisc.ac.uk/guides/code-of-practice-for-learning-analytics Practices and a Framework for Advancement. Sydney, NSW: Australian Office Kerly, A., Ellis, R., and Bull, S. (2008). CALMsystem: a conversational for Learning and Teaching. agent for learner modelling. Knowl. Based Syst. 21, 238–246. Corrin, L., and de Barba, P. (2014). “Exploring students’ interpretation of feedback doi: 10.1016/j.knosys.2007.11.015 delivered through learning analytics dashboards,” in Proceedings of the Ascilite Kosba, E., Dimitrova, V., and Boyle, R. (2005). “Using student and group models to 2014 Conference (Dunedin). support teachers in web-based distance education,” in International Conference Cumbley, R., and Church, P. (2013). Is “big data” creepy? Comp. Law Sec. Rev. 29, on User Modeling (Edinburgh). 601–609. doi: 10.1016/j.clsr.2013.07.007 Kruse, A., and Pongsajapan, R. (2012). Student-centered learning analytics. CNDLS Daniel, B. (2015). Big data and analytics in higher education: opportunities and Thought Papers, 1–9. challenges. Br. J. Educ. Technol. 46, 904–920. doi: 10.1111/bjet.12230 Liu, D. Y.-T., Rogers, T., and Pardo, A. (2015). “Learning analytics-are we at risk of Dede, C., Ho, A., and Mitros, P. (2016). Big data analysis in higher education: missing the point,” in Proceedings of the 32nd Ascilite Conference (Perth, WA). promises and pitfalls. EDUCAUSE Rev. 51, 22–34. Martinez, I. (2014). The Effects of Nudges on Students’ Effort and Performance: de Freitas, S., Gibson, D., Du Plessis, C., Halloran, P., Williams, E., Ambrose, Lessons from a MOOC. Available online at: http://curry.virginia.edu/uploads/ M., et al. (2015). Foundations of dynamic learning analytics: using university resourceLibrary/19_Martinez_Lessons_from_a_MOOC.pdf student data to increase retention. Br. J. Educ. Technol. 46, 1175–1188. McPherson, J., Tong, H. L., Fatt, S. J., and Liu, D. Y. (2016). “Student perspectives doi: 10.1111/bjet.12212 on data provision and use: starting to unpack disciplinary differences,” in Desouza, K. C., and Smith, K. L. (2016). Predictive analytics: nudging, shoving, and Proceedings of the Sixth International Conference on Learning Analytics and smacking behaviors in higher education. EDUCAUSE Rev. 51, 10–20. Knowledge (Edinburgh). Drachsler, H., and Greller, W. (2012). “The pulse of learning analytics Pardo, A., and Siemens, G. (2014). Ethical and privacy principles for learning understandings and expectations from the stakeholders,” in Proceedings of analytics. Br. J. Educ. Technol. 45, 438–450. doi: 10.1111/bjet.12152 the 2nd International Conference on Learning Analytics and Knowledge Picciano, A. G. (2012). The evolution of big data and learning analytics in (Vancouver, BC). American higher education. J. Async. Learn. Networks 16, 9–20. Frontiers in Psychology | www.frontiersin.org 10 December 2016 | Volume 7 | Article 1959 Roberts et al. Student Attitudes toward Learning Analytics Prinsloo, P., and Slade, S. (2014). Educational triage in open distance learning: Slade, S., and Prinsloo, P. (2015). Student perspectives on the use of their data: walking a moral tightrope. Int. Rev. Res. Open Distrib. Learn. 15, 306–331. between intrusion, surveillance and care. Eur. J. Open Dist. E-Learn. 18. 
doi: 10.19173/irrodl.v15i4.1881 Available online at: http://www.eurodl.org/index.php?p=special&sp=articles& Reimers, G., and Neovesky, A. (2015). “Student focused dashboards,” in 7th inum=6&abstract=672&article=679 International Conferenceon Computer Supported Education (Lisbon). Swenson, J. (2014). “Establishing an ethical literacy for learning analytics,” in Roberts, L., Chang, V., and Gibson, D. (2016). “Ethical considerations in adopting a Proceedings of the Fourth International Conference on Learning Analytics and university- and system-wide approach to data and learning analytics,” Big Data Knowledge (Indianapolis). and Learning Analytics in Higher Education, ed B. Kei Daniel (Cham: Springer), Teaching with Technology (Producer) (2013). Course Signals Explanation. 89–108. Avaialble online at: http://www.youtube.com/watch?v=-BI9E7qP9jA Rubel, A., and Jones, K. M. (2016). Student privacy in learning Thornton, G. (2013). The State of Higher Education in 2013 Pressures, Changes and analytics: an information ethics perspective. Inform. Soc. 32, 143–159. New Priorities. Avaialble online at: https://www.grantthornton.com/$\sim$/ doi: 10.1080/01972243.2016.1130502 media/content-page-files/nfp/pdfs/2013/NFP-2013-05-state-of-higher- Santos, J. L., Verbert, K., Govaerts, S., and Duval, E. (2013). “Addressing learner education-in-2013.ashx issues with StepUp!: an evaluation,” in Proceedings of the Third International Verbert, K., Duval, E., Klerkx, J., Govaerts, S., and Santos, J. L. (2013). Conference on Learning Analytics and Knowledge (Leuven). Learning analytics dashboard applications. Am. Behav. Sci. 57, 1500–1509. Sclater, N. (Producer) (2015a). Jisc Learning Analytics Architecture. Available doi: 10.1177/0002764213479363 online at: https://www.youtube.com/watch?v=PoH0NXUbrjw Willis, J. E. III., and Pistilli, M. D. (2014). Ethical Discourse: Guiding the Future Sclater, N. (2015b). What Do Students Want from a Learning Analytics App? of Learning Analytics. EDUCAUSE Review Online. Avaialble online at: http:// Available online at: http://analytics.jiscinvolve.org/wp/2015/04/29/what-do- er.educause.edu/articles/2014/4/ethical-discourse-guiding-the-future-of- students-want-from-a-learning-analyticsapp/ learning-analytics Siemens, G. (2013). Learning analytics: the emergence of a discipline. Am. Behav. Sci. 57, 1380–1400. doi: 10.1177/0002764213498851 Conflict of Interest Statement: The authors declare that the research was Siemens, G., Dawson, S., and Lynch, G. (2013). Improving the Quality and conducted in the absence of any commercial or financial relationships that could Productivity of the Higher Education Sector. Policy and Strategy for Systems- be construed as a potential conflict of interest. Level Deployment of Learning Analytics. Canberra, ACT: Society for Learning Analytics Research for the Australian Office for Learning and Teaching. Copyright © 2016 Roberts, Howell, Seaman and Gibson. This is an open-access Siemens, G., and Long, P. (2011). Penetrating the fog: analytics in learning and article distributed under the terms of the Creative Commons Attribution License (CC education. EDUCAUSE Rev. 46, 30. BY). The use, distribution or reproduction in other forums is permitted, provided the Slade, S., and Prinsloo, P. (2013). Learning analytics ethical issues and original author(s) or licensor are credited and that the original publication in this dilemmas. Am. Behav. Sci. 57, 1510–1529. doi: 10.1177/00027642134 journal is cited, in accordance with accepted academic practice. 
No use, distribution 79366 or reproduction is permitted which does not comply with these terms. Frontiers in Psychology | www.frontiersin.org 11 December 2016 | Volume 7 | Article 1959 http://www.deepdyve.com/assets/images/DeepDyve-Logo-lg.png Frontiers in Psychology Pubmed Central

When students were provided with further information and time for reflection, their attitudes toward learning analytics seemed to fall into “personal” vs. “collective” purposes or principles, which intersect with “uniform” vs. “autonomous” activity. The intersections give rise to some troublesome areas where conflicting purposes and audiences arise. For example, in the theme “Help or Hindrance to Learning” students acknowledged that they might want to know how they compare to others and how they are progressing (reflecting previous findings that most students are interested in receiving this information; Corrin and de Barba, 2014; Atif et al., 2015; Reimers and Neovesky, 2015), while other students might not want to know. So, on the principle of personal autonomy, every student should be able to choose whether to see this information or receive messages about their relative performance. Yet, as highlighted in the theme “Driving Inequality,” out of fairness, the students also want all students to be treated equitably with messages and resources, not just a selected subgroup. So the principle of personal autonomous activity needed for independence conflicts with the collective uniform activity needed for equity.

Students supporting anonymous automated emails that are triggered and unseen by the instructors illustrates an equity goal that is in accord with personal concerns. In this case, bias cannot build up in instructors; everyone hears the same messages and gets the same access to resources as everyone else. However, as indicated above, if, via autonomy, some students turn off those messages or choose not to participate, inequity may follow as some subgroups get more information and resources than others. Perhaps this form of inequity is more tolerable because it has arisen due to the choices of the students rather than to structural inequities of an impersonal uniform system.

If autonomy is supported through personal choices of the student, then some inequities are likely to be formed, with only some subgroups getting certain messages and resources. This raises the question of whether there is a benchmark for the line between equity and inequity that is tolerable by the uniform system in order not to impede independence. For example, if all students have the right not to participate in seeing their information or messages, and their choice leads to missing out on messages, resources and help, and they become disadvantaged due to their own actions, is that a tolerable inequity?

Seeing and acting on information places all actors (e.g., instructors, unit coordinators, administrators as well as students) in this same intersecting network of personal vs. collective purposes and uniform vs. autonomous action. For example, as represented in the theme “Help or Hindrance to Learning,” students see the benefit of giving instructors anonymous group information that would help them teach better to all groups (e.g., students who are struggling as well as those who are high achieving). But while some students welcome the opportunity for this to enable personalized relationships (theme “More than a Number”), others do not want the instructor to know who specifically is in those groups, for fear of bias and preconceptions (theme “Driving Inequality”).

The tension resulting from the intersection of students’ preferences for personal vs. collective purposes with uniform vs. autonomous activity highlights the difficulty in developing uniform policies concerning the application of rules and processes that can also allow for autonomous and personalized decision-making and action by each individual student. Students held concerns about invasion of privacy (theme “Where Will it Stop”), echoing “creepy” concerns held more widely about big data (Cumbley and Church, 2013). Further, some students rejected the need for learning analytics, viewing it as a retrograde step away from independence (theme “Impeding Independence”). This echoes Beattie et al.’s (2014) concern that learning analytics can pose a risk to students autonomously navigating their learning.

LIMITATIONS AND FUTURE DIRECTIONS FOR RESEARCH
The findings from our research should be interpreted within the context of its limitations. First, this research was conducted primarily with undergraduate students in the health sciences. It is possible that these students may have less knowledge and understanding of learning analytics than students in other disciplines such as information technology, and may be more concerned with issues of equity and fair representation across students. Disciplinary differences in the type and frequency of assessments may also influence how students respond to learning analytics. Research across disciplines is required to understand disciplinary differences in student attitudes toward learning analytics.

Second, students were shown videos on two learning analytics systems (Jisc and Purdue), and while it was noted that these were examples, it may have been difficult for students to conceptualize other learning analytic approaches. Using these two learning analytic examples may have biased student discussion toward these particular systems rather than toward learning analytics in general. It would be of interest to explore whether student perceptions of learning analytics differ if students are presented with other learning analytic approaches. Illuminating the similarities or differences between findings when students are given different learning analytic examples may also provide universities with a clearer understanding of what students view as beneficial or potentially harmful.

Third, the focus of the current research has been on student attitudes to learning analytics, predicated on the relative absence of the student voice in decision-making about learning analytics. The other “voice” largely absent from learning analytics decisions in universities is that of the academics who teach. Along with students, academics are an intended “end-user” of learning analytics, and further research is warranted into attitudes to learning analytics held by academics with teaching responsibilities.

APPLICATION OF FINDINGS
Whilst there are no easy options in developing policies and systems that address the intersecting and conflicting attitudes held by students, the starting point needs to be engaging students in the decision-making process. We echo previous calls for student engagement in decision making to ensure the acceptability of the learning analytics systems developed (Slade and Prinsloo, 2013; Beattie et al., 2014; Prinsloo and Slade, 2014). This may take the form of representation from student guilds or related organizations that represent the wider student body. The findings from this research also highlight the need to inform students about big data and learning analytics activities that are planned or taking place within the university. Related to this is the need for each university to develop policy and procedures for obtaining student consent for the collection and use of their data. Ideally, this will occur as part of developing a university-wide code of practice/ethics for learning analytics, such as that developed by Jisc (2015).
SUMMARY
Our research highlights the limited knowledge students have about big data and learning analytics within higher education. While students expressed an appreciation that learning analytics could provide more personalized learning experiences, they held reservations about the functional impact of learning analytics on their education and sought the ability to make autonomous and personalized decisions about their learning. Further, they were concerned about the potential inequities resulting from learning analytics and the invasion of personal privacy. The findings highlight the need to engage students in the decision-making process about learning analytics.

AUTHOR CONTRIBUTIONS
LR and JH contributed to all stages of the research project and writing. KS contributed to the focus groups and writing. DG contributed to the research design, interpretation of findings and writing.

FUNDING
This project was funded by the Curtin University Teaching Excellence Development Fund.

REFERENCES
Alexander, P., and Brown, S. (1998). “Attitudes toward information privacy: differences among and between faculty and students,” in AMCIS Proceedings, 17 (Baltimore, MD).
Arnold, K. E., and Pistilli, M. D. (2012). “Course signals at Purdue: using learning analytics to increase student success,” in Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (Vancouver, BC).
Atif, A., Bilgin, A., and Richards, D. (2015). Student Preferences and Attitudes to the Use of Early Alerts. Paper presented at the Twenty-First Americas Conference on Information Systems, Puerto Rico.
Beattie, S., Woodley, C., and Souter, K. (2014). “Creepy analytics and learner data rights,” in Rhetoric and Reality: Critical Perspectives on Educational Technology. Conference Proceedings (Dunedin).
Bovill, C., Cook-Sather, A., and Felten, P. (2011). Students as co-creators of teaching approaches, course design, and curricula: implications for academic developers. Int. J. Acad. Dev. 16, 133–145. doi: 10.1080/1360144X.2011.568690
Braun, V., and Clarke, V. (2006). Using thematic analysis in psychology. Qual. Res. Psychol. 3, 77–101. doi: 10.1191/1478088706qp063oa
Butler, D. L., and Winne, P. H. (1995). Feedback and self-regulated learning: a theoretical synthesis. Rev. Educ. Res. 65, 245–281. doi: 10.3102/00346543065003245
Castleberry, A. (2014). NVivo qualitative data analysis software; QSR International Pty Ltd. Version 10, 2012. Am. J. Pharm. Educ. 78. doi: 10.5688/ajpe78125
Colvin, C., Rogers, T., Wade, A., Dawson, S., Gašević, D., Buckingham Shum, S., et al. (2015). Student Retention and Learning Analytics: A Snapshot of Australian Practices and a Framework for Advancement. Sydney, NSW: Australian Office for Learning and Teaching.
Corrin, L., and de Barba, P. (2014). “Exploring students’ interpretation of feedback delivered through learning analytics dashboards,” in Proceedings of the Ascilite 2014 Conference (Dunedin).
Cumbley, R., and Church, P. (2013). Is “big data” creepy? Comp. Law Sec. Rev. 29, 601–609. doi: 10.1016/j.clsr.2013.07.007
Daniel, B. (2015). Big data and analytics in higher education: opportunities and challenges. Br. J. Educ. Technol. 46, 904–920. doi: 10.1111/bjet.12230
Dede, C., Ho, A., and Mitros, P. (2016). Big data analysis in higher education: promises and pitfalls. EDUCAUSE Rev. 51, 22–34.
de Freitas, S., Gibson, D., Du Plessis, C., Halloran, P., Williams, E., Ambrose, M., et al. (2015). Foundations of dynamic learning analytics: using university student data to increase retention. Br. J. Educ. Technol. 46, 1175–1188. doi: 10.1111/bjet.12212
Desouza, K. C., and Smith, K. L. (2016). Predictive analytics: nudging, shoving, and smacking behaviors in higher education. EDUCAUSE Rev. 51, 10–20.
Drachsler, H., and Greller, W. (2012). “The pulse of learning analytics: understandings and expectations from the stakeholders,” in Proceedings of the 2nd International Conference on Learning Analytics and Knowledge (Vancouver, BC).
Drachsler, H., Hoel, T., Scheffel, M., Kismihók, G., Berg, A., Ferguson, R., et al. (2015). “Ethical and privacy issues in the application of learning analytics,” in Proceedings of the Fifth International Conference on Learning Analytics and Knowledge (Poughkeepsie, NY).
Fisher, J., Valenzuela, F.-R., and Whale, S. (2014). Learning Analytics: A Bottom-Up Approach to Enhancing and Evaluating Students’ Online Learning. Available online at: http://www.olt.gov.au/project-learning-analytics-bottom-approach-enhancing-and-evaluating-studentsapos-online-learning-201
Gašević, D., Dawson, S., and Siemens, G. (2015). Let’s not forget: learning analytics are about learning. TechTrends 59, 64–71. doi: 10.1007/s11528-014-0822-x
Greller, W., and Drachsler, H. (2012). Translating learning into numbers: a generic framework for learning analytics. Educ. Technol. Soc. 15, 42–57. Available online at: http://www.jstor.org/stable/jeductechsoci.15.3.42
Guest, G., Namey, E., and McKenna, K. (2016). How many focus groups are enough? Building an evidence base for nonprobability sample sizes. Field Methods. doi: 10.1177/1525822X16639015. [Epub ahead of print].
Heath, J., and Leinonen, E. (2016). “An institution wide approach to learning analytics,” in Developing Effective Educational Experiences through Learning Analytics, ed M. Anderson (Hershey, PA: IGI Global), 73–87.
Hoel, T., Mason, J., and Chen, W. (2015). “Data sharing for learning analytics: questioning the risks and benefits,” in Proceedings of the 23rd International Conference on Computers in Education (Hangzhou). China: Asia-Pacific Society for Computers in Education.
Jisc (2015). Code of Practice for Learning Analytics. Available online at: https://www.jisc.ac.uk/guides/code-of-practice-for-learning-analytics
Kerly, A., Ellis, R., and Bull, S. (2008). CALMsystem: a conversational agent for learner modelling. Knowl. Based Syst. 21, 238–246. doi: 10.1016/j.knosys.2007.11.015
Kosba, E., Dimitrova, V., and Boyle, R. (2005). “Using student and group models to support teachers in web-based distance education,” in International Conference on User Modeling (Edinburgh).
Kruse, A., and Pongsajapan, R. (2012). Student-centered learning analytics. CNDLS Thought Papers, 1–9.
Liu, D. Y.-T., Rogers, T., and Pardo, A. (2015). “Learning analytics: are we at risk of missing the point?,” in Proceedings of the 32nd Ascilite Conference (Perth, WA).
Martinez, I. (2014). The Effects of Nudges on Students’ Effort and Performance: Lessons from a MOOC. Available online at: http://curry.virginia.edu/uploads/resourceLibrary/19_Martinez_Lessons_from_a_MOOC.pdf
McPherson, J., Tong, H. L., Fatt, S. J., and Liu, D. Y. (2016). “Student perspectives on data provision and use: starting to unpack disciplinary differences,” in Proceedings of the Sixth International Conference on Learning Analytics and Knowledge (Edinburgh).
Pardo, A., and Siemens, G. (2014). Ethical and privacy principles for learning analytics. Br. J. Educ. Technol. 45, 438–450. doi: 10.1111/bjet.12152
Picciano, A. G. (2012). The evolution of big data and learning analytics in American higher education. J. Async. Learn. Networks 16, 9–20.
Prinsloo, P., and Slade, S. (2014). Educational triage in open distance learning: walking a moral tightrope. Int. Rev. Res. Open Distrib. Learn. 15, 306–331. doi: 10.19173/irrodl.v15i4.1881
Reimers, G., and Neovesky, A. (2015). “Student focused dashboards,” in Proceedings of the 7th International Conference on Computer Supported Education (Lisbon).
Roberts, L., Chang, V., and Gibson, D. (2016). “Ethical considerations in adopting a university- and system-wide approach to data and learning analytics,” in Big Data and Learning Analytics in Higher Education, ed B. Kei Daniel (Cham: Springer), 89–108.
Rubel, A., and Jones, K. M. (2016). Student privacy in learning analytics: an information ethics perspective. Inform. Soc. 32, 143–159. doi: 10.1080/01972243.2016.1130502
Santos, J. L., Verbert, K., Govaerts, S., and Duval, E. (2013). “Addressing learner issues with StepUp!: an evaluation,” in Proceedings of the Third International Conference on Learning Analytics and Knowledge (Leuven).
Sclater, N. (Producer) (2015a). Jisc Learning Analytics Architecture. Available online at: https://www.youtube.com/watch?v=PoH0NXUbrjw
Sclater, N. (2015b). What Do Students Want from a Learning Analytics App? Available online at: http://analytics.jiscinvolve.org/wp/2015/04/29/what-do-students-want-from-a-learning-analyticsapp/
Siemens, G. (2013). Learning analytics: the emergence of a discipline. Am. Behav. Sci. 57, 1380–1400. doi: 10.1177/0002764213498851
Siemens, G., Dawson, S., and Lynch, G. (2013). Improving the Quality and Productivity of the Higher Education Sector: Policy and Strategy for Systems-Level Deployment of Learning Analytics. Canberra, ACT: Society for Learning Analytics Research for the Australian Office for Learning and Teaching.
Siemens, G., and Long, P. (2011). Penetrating the fog: analytics in learning and education. EDUCAUSE Rev. 46, 30.
Slade, S., and Prinsloo, P. (2013). Learning analytics: ethical issues and dilemmas. Am. Behav. Sci. 57, 1510–1529. doi: 10.1177/0002764213479366
Slade, S., and Prinsloo, P. (2015). Student perspectives on the use of their data: between intrusion, surveillance and care. Eur. J. Open Dist. E-Learn. 18. Available online at: http://www.eurodl.org/index.php?p=special&sp=articles&inum=6&abstract=672&article=679
Swenson, J. (2014). “Establishing an ethical literacy for learning analytics,” in Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (Indianapolis, IN).
Teaching with Technology (Producer) (2013). Course Signals Explanation. Available online at: http://www.youtube.com/watch?v=-BI9E7qP9jA
Thornton, G. (2013). The State of Higher Education in 2013: Pressures, Changes and New Priorities. Available online at: https://www.grantthornton.com/~/media/content-page-files/nfp/pdfs/2013/NFP-2013-05-state-of-higher-education-in-2013.ashx
Verbert, K., Duval, E., Klerkx, J., Govaerts, S., and Santos, J. L. (2013). Learning analytics dashboard applications. Am. Behav. Sci. 57, 1500–1509. doi: 10.1177/0002764213479363
Willis, J. E. III, and Pistilli, M. D. (2014). Ethical Discourse: Guiding the Future of Learning Analytics. EDUCAUSE Review Online. Available online at: http://er.educause.edu/articles/2014/4/ethical-discourse-guiding-the-future-of-learning-analytics

Conflict of Interest Statement: The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Copyright © 2016 Roberts, Howell, Seaman and Gibson. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
