Conference Sessions – Thursday

Thursday, March 16, 2017

Morning Sessions (10:30 AM – 12:00 PM)


Session: Keynote Discussion and Panel

Keynote Q&A 1A1 (30 min)
Panel 1A2 Round Table (55 min)

Session: Understanding Discourse II

Presentation 1B1 (30 min): “How Effective is Your Facilitation? Group-Level Analytics of MOOC Forums”
(Full Research Paper)
by Oleksandra Poquet, Shane Dawson & Nia Dowell
Abstract:
The facilitation of interpersonal relationships within a respectful learning climate is an important aspect of teaching practice. However, in large-scale online contexts, such as MOOCs, the number of learners and the highly asynchronous nature of interaction militate against the development of a sense of belonging and dyadic trust. Given these challenges, instead of conventional instruments that reflect learners’ affective perceptions, we suggest a set of indicators that can be used to evaluate social activity in relation to the participation structure. These group-level indicators can then help teachers to gain insights into the evolution of social activity shaped by their facilitation choices. For this study, group-level indicators were derived from measuring information exchange activity between returning MOOC posters. By conceptualizing this group as an identity-based community, we can apply exponential random graph modelling to explain the network’s structure through the configurations of direct reciprocity, generalized exchange, and the effect of participants demonstrating super-posting behavior. The findings provide novel insights into network amplification, and highlight the differences between courses employing different facilitation strategies. Direct reciprocation was characteristic of non-facilitated groups. Generalized exchange was more prominent in highly facilitated online communities with instructor involvement. Super-posting activity was less pronounced in networks with higher generalized exchange, and more pronounced in networks with higher direct reciprocity.
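Exponential random graph models of the kind used in this paper are typically fitted with R’s statnet/ergm packages; as a rough, purely descriptive stand-in (not the authors’ method), the sketch below counts reciprocated ties and cyclic triads, a simple proxy for generalized exchange, in a toy reply network. The network and its interpretation are hypothetical.

```python
# Descriptive proxy only: reciprocity and cyclic-triad counts in a toy forum
# reply network (hypothetical data; a full ERGM fit would use R's statnet/ergm).
import networkx as nx

# An edge u -> v means participant u replied to participant v.
G = nx.DiGraph([("a", "b"), ("b", "a"), ("b", "c"), ("c", "d"), ("d", "b"), ("a", "c")])

print("Reciprocity:", nx.reciprocity(G))        # share of mutual (reciprocated) ties
census = nx.triadic_census(G)
print("Cyclic triads (030C):", census["030C"])  # A->B->C->A exchange patterns
```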
Presentation 1B2 (30 min): “Words Matter: Automatic Detection of Questions in Classroom Discourse using Linguistics, Paralinguistics, and Context”
(Full Research Paper)
by Patrick J Donnelly, Nathaniel Blanchard, Andrew M Olney, Sean Kelly, Martin Nystrand & Sidney K D’Mello
Abstract:
We investigate automatic detection of teacher questions from audio recordings collected in live classrooms with the goal of providing automated feedback to teachers. Using a dataset of audio recordings from 11 teachers across 37 class sessions, we automatically segment the audio into individual teacher utterances and code each as containing a question or not. We train supervised machine learning models to detect the human-coded questions using high-level linguistic features extracted from automatic speech recognition (ASR) transcripts, acoustic and prosodic features from the audio recordings, as well as context features, such as timing and turn-taking dynamics. Models are trained and validated independently of the teacher to ensure generalization to new teachers. We are able to distinguish questions and non-questions with a weighted F1 score of 0.69. A comparison of the three feature sets indicates that a model using linguistic features outperforms those using acoustic-prosodic and context features for question detection, but the combination of features yields a 5% improvement in overall accuracy compared to linguistic features alone. We discuss applications for pedagogical research, teacher formative assessment, and teacher professional development.
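For a concrete sense of the evaluation setup described above, the sketch below shows a teacher-independent validation loop scored with a weighted F1; the data, feature matrices, and choice of classifier are hypothetical stand-ins, not the authors’ pipeline.

```python
# Illustrative sketch: teacher-independent validation of a question/non-question
# classifier on hypothetical linguistic, acoustic-prosodic, and context features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold, cross_val_predict
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)
n_utterances = 500
X_linguistic = rng.normal(size=(n_utterances, 20))   # stand-in ASR-based features
X_acoustic = rng.normal(size=(n_utterances, 12))     # stand-in prosodic features
X_context = rng.normal(size=(n_utterances, 5))       # stand-in timing/turn-taking features
X = np.hstack([X_linguistic, X_acoustic, X_context])
y = rng.integers(0, 2, size=n_utterances)            # 1 = question, 0 = non-question
teacher_ids = rng.integers(0, 11, size=n_utterances) # 11 teachers in the dataset

# Leave-teachers-out folds so no teacher appears in both training and validation.
cv = GroupKFold(n_splits=5)
y_pred = cross_val_predict(RandomForestClassifier(random_state=0), X, y,
                           cv=cv, groups=teacher_ids)
print("Weighted F1:", f1_score(y, y_pred, average="weighted"))
```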
Presentation 1B3 (20 min): “Towards Mining Sequences and Dispersion of Rhetorical Moves in Student Written Texts”
(Short Research Paper)
by Simon Knight, Roberto Martinez-Maldonado, Andrew Gibson & Simon Buckingham Shum
Abstract:
There is an increasing interest in the analysis of both students’ writing and the temporal aspects of learning data. The analysis of higher-level learning features in writing contexts requires analyses of data that could be characterised in terms of the sequences and processes of textual features present. This paper (1) discusses the extant literature on sequential and process analyses of writing; and, based on this and our own first-hand experience with sequential analysis, (2) proposes a number of approaches to both preprocess and analyse sequences in whole texts. We illustrate how the approaches could be applied to examples drawn from our own datasets of ‘rhetorical moves’ in written texts, and the potential each approach holds for providing insight into that data. Work is in progress to apply this model to provide empirical insights. Although similar sequence or process mining techniques have not yet been applied to student writing, techniques applied to event data could readily be operationalised to uncover patterns in texts.
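As a toy illustration of sequence analysis over coded texts (not the authors’ method), the sketch below counts the most frequent bigrams of rhetorical-move labels across a handful of hypothetical per-sentence codings.

```python
# Illustrative sketch: mining frequent bigrams of rhetorical-move labels from
# hypothetical per-sentence codings of student texts.
from collections import Counter

# Each text is a sequence of rhetorical-move labels, one per sentence.
texts = [
    ["Background", "Claim", "Evidence", "Claim", "Evidence", "Conclusion"],
    ["Claim", "Evidence", "Evidence", "Conclusion"],
    ["Background", "Claim", "Counterclaim", "Evidence", "Conclusion"],
]

bigram_counts = Counter()
for moves in texts:
    bigram_counts.update(zip(moves, moves[1:]))

for (a, b), count in bigram_counts.most_common(5):
    print(f"{a} -> {b}: {count}")
```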

Session: LA Policies

Presentation 1C1 (30 min): “Learning Analytics in Higher Education – Challenges and Policies: A Review of Eight Learning Analytics Policies”
(Full Research Paper)
by Yi-Shan Tsai & Dragan Gašević
Abstract:
This paper presents the results of a review of eight policies for learning analytics of relevance for higher education, and discusses how these policies have tried to address prominent challenges in the adoption of learning analytics, as identified in the literature. The results show that more consideration needs to be given to establishing communication channels among stakeholders and adopting pedagogy-based approaches to learning analytics. It also reveals a shortage of guidance for developing data literacy among end-users and evaluating the progress and impact of learning analytics. Moreover, the review highlights the need to establish formalised guidelines to monitor the soundness, effectiveness, and legitimacy of learning analytics. As interest in learning analytics among higher education institutions continues to grow, this review will provide insights into policy and strategic planning for the adoption of learning analytics.
Presentation 1C2 (30 min): “The Influence of Data Protection and Privacy Frameworks on the Design of Learning Analytics Systems”
(Full Research Paper)
by Tore Hoel, David Griffiths & Weiqin Chen
Abstract:
Learning analytics open up a complex landscape of privacy and policy issues, which will influence how learning analytics systems and practices are designed. Research and development is governed by regulations for data storage and management, and by research ethics. Consequently, when moving solutions out of the research labs, implementers meet constraints defined in national laws and justified in privacy frameworks. This paper explores how the OECD, APEC and EU privacy frameworks seek to regulate data privacy, with significant implications for the discourse of learning, and ultimately, an impact on the design of tools, architectures and practices that are now on the drawing board. A detailed list of requirements for learning analytics systems is developed, based on the new legal requirements defined in the European General Data Protection Regulation, which from 2018 will be enforced as European law. The paper also gives an initial account of how the privacy discourse in Europe, Japan, South Korea and China is developing and reflects upon the possible impact of the different privacy frameworks on the design of LA privacy solutions in these countries. This research contributes to knowledge of how concerns about privacy and data protection related to educational data can drive a discourse on new approaches to privacy engineering based on the principles of Privacy by Design. For the LAK community, this study represents the first attempt to conceptualise the issues of privacy and learning analytics in a cross-cultural context. The paper concludes with a plan to follow up this research on privacy policies and learning analytics systems development with a new international study.
Presentation 1C3 (20 min): “An Information Policy Perspective on Learning Analytics”
(Short Research Paper)
by Caroline Haythornthwaite
Abstract:
Policy for learning analytics joins a stream of initiatives aimed at understanding the expanding world of information collection, storage, processing and dissemination being driven by computing technologies. This paper offers an information policy perspective on learning analytics, joining work by others on ethics and privacy in the management of learning analytics data [8], but extending to consider how issues play out across the information lifecycle and in the formation of policy. Drawing on principles from information policy both informs learning analytics and brings learning analytics into the information policy domain. The resulting combination can help inform policy development for educational institutions as they implement and manage learning analytics policy and practices. The paper begins with a brief summary of the information policy perspective, then addresses learning analytics with attention to various categories of consideration for policy development.

Session: Teacher Support Tools I

Presentation 1D1 (30 min): “Intelligent Tutors as Teachers’ Aides: Exploring Teacher Needs for Real-time Analytics in Blended Classrooms”
(Full Research Paper)
by Kenneth Holstein, Bruce M. McLaren & Vincent Aleven
Abstract:
Intelligent tutoring systems (ITSs) are commonly designed to enhance student learning. However, they are not typically designed to meet the needs of teachers who use them in their classrooms. ITSs generate a wealth of analytics about student learning and behavior, opening a rich design space for real-time teacher support tools such as dashboards. Whereas real-time dashboards for teachers have become popular with many learning technologies, we are not aware of projects that have designed dashboards for ITSs based on a broad investigation of teachers’ needs. We conducted design interviews with ten middle school math teachers to explore their needs for on-the-spot support during blended class sessions, as a first step in a user-centered design process of a real-time dashboard. Based on multi-methods analyses of this interview data, we identify several opportunities for ITSs to better support teachers’ needs. We highlight key tensions and tradeoffs in the design of such real-time supports for teachers, as revealed by “Speed Dating” possible futures with teachers. This paper has implications for our ongoing co-design of a real-time dashboard for ITSs, as well as broader implications for the design of ITSs that can effectively collaborate with teachers in classroom settings.
Presentation 1D2 (30 min): “Supporting classroom instruction with data visualization”
(Practitioner Presentation)
by Pei Xian Chia, Chee Hing Thong, Yiru Qiu & Zachary Kang
Abstract:
This paper elaborates on our understanding of data visualization and efforts to empower teachers by providing information to support just-in-time instruction in the context of an inquiry-based lesson with a game on identifying the names and formulae of ionic compounds (wRiteFormula). It highlights the process of identifying the data teachers need to gauge student learning and determining how to visualize the data appropriately. It also describes the efforts taken to ensure a high degree of accuracy in the data visualization, and usability and customizability for teachers. It concludes with suggestions on how teachers can use the visualized data in class.
Presentation 1D3 (20 min): “Implementing Predictive Learning Analytics on a Large Scale: The Teacher’s Perspective”
(Short Research Paper)
by Christothea Herodotou, Bart Rienties, Avinash Boroowa, Zdenek Zdrahal, Martin Hlosta & Galina Naydenova
Abstract:
In this paper, we describe a large-scale study of the use of predictive learning analytics data with 240 teachers in 10 modules at a distance learning higher education institution. The aim of the study was to illuminate teachers’ uses and practices of predictive data, in particular to identify how predictive data was used to support students at risk of not completing or failing a module. Data were collected from statistical analysis of 17,033 students’ performance by the end of the intervention, teacher usage statistics, and five individual semi-structured interviews with teachers. Findings revealed that teachers endorse the use of predictive data to support their practice, yet in diverse ways, and highlighted the need for devising appropriate intervention strategies to support students at risk.

Afternoon Sessions (02:00 PM – 03:40 PM)


Session: Skill Assessment 

Presentation 2A1 (30 min): “Scientific Modeling: Using learning analytics to examine student practices and classroom variation”
(Full Research Paper)
by David Quigley, Jonathan Ostwald & Tamara Sumner
Abstract:
Modeling has a strong focus in current science learning frameworks as a critical skill for students to learn. However, understanding students’ scientific models and their modeling practices at scale is a difficult task that has not been taken up by the research literature. The complex variables involved in classroom learning, such as teacher differences, increase the difficulty of understanding this problem. This work begins with an exploration of the methods used to explore students’ scientific modeling in the learning sciences space and the frameworks developed to characterize student modeling practices. Learning analytics can be used to leverage these frameworks of scientific modeling practices to explore questions around students’ scientific models and their modeling practices. These analyses are focused around the use of EcoSurvey, a collaborative, digital tool used in high-school biology classrooms to model the local ecosystem. This tool was deployed in ten biology classrooms and used with varying degrees of success. There are significant teacher-level differences found in the activity sequences of students using the EcoSurvey tool. The theoretical metrics around scientific modeling practices and automatically extracted feature sequences were also used in a classification task to automatically determine a particular student’s teacher. These results underline the power of learning analytics methods to give insight into how modeling practices are realized in the classroom. This work also informs changes to modeling tools, associated curricula, and supporting professional development around scientific modeling.
Presentation 2A2 (30 min): “MAP: Multimodal Assessment Platform for Interactive Communication Competency”
(Practitioner Presentation)
by Saad Khan
Abstract:
We describe a prototype system for automated human communication assessment. The system analyzes multimodal data of human-human and human-computer interactions to assess verbal and non-verbal communication competencies including speech delivery, language use, social affect and engagement.
Presentation 2A3 (30 min): “Predicting Math Performance Using Natural Language Processing Tools”
(Full Research Paper)
by Scott Crossley, Ran Liu & Danielle McNamara
Abstract:
A number of studies have demonstrated links between linguistic knowledge and performance in math. Studies examining these links in first language speakers of English have traditionally relied on correlational analyses between linguistic knowledge tests and standardized math tests. For second language (L2) speakers, the majority of studies have compared math performance between proficient and non-proficient speakers of English. In this study, we take a novel approach and examine the linguistic features of students’ language while they are engaged in collaborative problem solving within an on-line math tutoring system. We transcribe the students’ speech and use natural language processing tools to extract linguistic information related to text cohesion, lexical sophistication, and sentiment. Our criterion variables are individuals’ pretest and posttest math performance scores. In addition to examining relations between linguistic features of student language production and math scores, we also control for a number of non-linguistic factors including gender, age, grade, school, and content focus (procedural versus conceptual). Linear mixed effect modeling indicates that non-linguistic factors are not predictive of math scores. However, linguistic features related to cohesion, affect, and lexical proficiency explained approximately 30% of the variance (R² = .303) in the math scores.
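The sketch below shows the general shape of a linear mixed-effects analysis of this kind in statsmodels, with hypothetical column names and simulated data; it is not the authors’ model specification.

```python
# Illustrative sketch: linear mixed-effects model with linguistic predictors as
# fixed effects and a random intercept per school (hypothetical data/columns).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 120
df = pd.DataFrame({
    "posttest": rng.normal(70, 10, n),
    "cohesion": rng.normal(0, 1, n),
    "affect":   rng.normal(0, 1, n),
    "lexical":  rng.normal(0, 1, n),
    "gender":   rng.choice(["F", "M"], n),
    "school":   rng.choice(["A", "B", "C"], n),  # random-effect grouping factor
})

model = smf.mixedlm("posttest ~ cohesion + affect + lexical + gender",
                    data=df, groups=df["school"])
result = model.fit()
print(result.summary())
```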

Session: Student Support Tools

Presentation 2B1 (30 min): “Widget, widget as you lead, I am performing well indeed! – Using results from a formative offline study to inform an empirical online study about a learning analytics widget in a collaborative learning environment”
(Full Research Paper)
by Maren Scheffel, Hendrik Drachsler, Karel Kreijns, Joop de Kraker & Marcus Specht
Abstract:
The collaborative learning processes of students in online learning environments can be supported by providing learning analytics-based visualisations that foster awareness and reflection about an individual’s as well as the team’s behaviour and their learning and collaboration processes. For this empirical study we implemented an activity widget into the online learning environment of a live five-month Master’s course and investigated the predictive power of the widget indicators towards the students’ grades, and compared the results to those from an exploratory study with data collected in previous runs of the same course where the widget had not been in use. Together with information gathered from a quantitative as well as a qualitative evaluation of the activity widget during the course, the findings of this current study show that there are indeed predictive relations between the widget indicators and the grades, especially those regarding responsiveness, and indicate that some of the observed differences in the last run could be attributed to the implemented activity widget.
Presentation 2B2 (30 min): “Building a Transcript of the Future”
(Full Research Paper)
by Benjamin Koester, James Fogel, William Murdock, Galina Grom & Timothy McKay
Abstract:
The pathways and learning outcomes of university students are the culmination of numerous experiences inside and outside of the classroom, with faculty and with other students, in both formal and casual settings. These interactions are guided by the general education requirements of the university and by the learning goals of the student. The only official record and representation of each student’s education is captured by their academic transcript: typically a list of courses described by name and number, grades recorded on an A-F scale and summarized by GPA, degrees awarded, and honors received. This limited approach reflects the technological affordances of a 20th century industrial age.
In recent years, scholars have begun to imagine a transcript of the future, perhaps combining a richer record of the student experience with a portfolio of authentic products of student work. In this paper, we concentrate on the first, and develop analytic methods for improving measures of both classroom performance and intellectual breadth. In each case, this is done by placing elements of individual transcripts in context using information about their peers. We frame the study by addressing basic questions. Were the courses taken by the student difficult on average? Did the individual stand out from their peers? Were the courses representative of a broad intellectual experience, or did the student delve into detail in the chosen field of study? And with whom did they take courses? We hope that these new approaches will inspire further investigation into multi-dimensional approaches to characterizing the student experience.
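As a toy example of placing transcript elements in peer context (one possible metric, not necessarily the authors’), the sketch below computes each grade’s percentile rank among classmates in the same course.

```python
# Illustrative sketch: percentile rank of a grade relative to classmates in the
# same course (hypothetical records).
import pandas as pd

records = pd.DataFrame({
    "student": ["s1", "s2", "s3", "s1", "s2", "s4"],
    "course":  ["MATH101", "MATH101", "MATH101", "PHYS140", "PHYS140", "PHYS140"],
    "grade":   [3.7, 3.0, 2.3, 4.0, 3.3, 3.7],
})

# Peer-contextualized measure: where each grade sits within its course.
records["peer_percentile"] = records.groupby("course")["grade"].rank(pct=True)
print(records)
```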
Presentation 2B3 (30 min): “Community Building around a Shared History: Rebooting Academic Reporting Tools at the University of Michigan”
(Practitioner Presentation)
by August Evrard & Chris Teplovs
Abstract:
In spring 2016, the central student government, faculty senate and provost at the University of Michigan agreed to release student evaluations of teaching (SET) back to active students. Coincidentally, a dashboard service known as Academic Reporting Tools, which had served visual summaries of historical course information to faculty and staff since 2006, was being rebooted as part of a campus-wide Digital Innovation Greenhouse initiative. We describe development of the rebooted service, known as ART 2.0, including rollout of a CourseProfile tool offering SET summaries (and more) and our ongoing efforts to build a sustainable community of practice around curricular information.

Session: Teacher Support Tools II

Presentation 2C1 (30 min): “An Instructor Dashboard for Real-Time Analytics in Interactive Programming Assignments”
(Full Research Paper)
by Nicholas Diana, Michael Eagle, John Stamper, Shuchi Grover, Marie Bienkowski & Satabdi Basu
Abstract:
Many introductory programming environments generate a large amount of log data, but making insights from these data accessible to instructors remains a challenge. This research demonstrates that student outcomes can be accurately predicted from student program states at various time points throughout the course, and integrates the resulting predictive models into an instructor dashboard. The effectiveness of the dashboard is evaluated by measuring how well the dashboard analytics correctly suggest that the instructor help students classified as most in need. Finally, we describe a method of matching low-performing students with high-performing peer tutors, and show that the inclusion of peer tutors not only increases the amount of help given, but the consistency of help availability as well.
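To make the dashboard idea concrete, the sketch below trains a simple classifier on hypothetical program-state features and ranks students by predicted risk of not completing the task; the features, model, and cut-off are illustrative assumptions, not the authors’ predictive models.

```python
# Illustrative sketch: predicting assignment outcomes from hypothetical
# program-state features and ranking students for instructor attention.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n_students = 60
# Stand-ins for features captured partway through an assignment,
# e.g. syntax-error counts, distinct program states visited, idle time.
X = rng.normal(size=(n_students, 3))
y = rng.integers(0, 2, size=n_students)   # 1 = eventually completed the task

clf = LogisticRegression().fit(X, y)
risk = 1 - clf.predict_proba(X)[:, 1]     # predicted probability of NOT completing

# Dashboard view: students ordered by predicted need for help.
most_in_need = np.argsort(risk)[::-1][:5]
print("Suggest helping students:", most_in_need,
      "risk:", np.round(risk[most_in_need], 2))
```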
Presentation 2C2 (30 min): “Connectivist Learning using SuiteC – Create, Connect, Collaborate, Compete!!”
(Practitioner Presentation)
by Sandeep Markondiah Jayaprakash, John Scott & Paul Kerschen
Presentation 2C3 (30 min): “Real-time Learning Analytics for C Programming Language Courses”
(Full Research Paper)
by Xinyu Fu, Atsushi Shimada, Yuta Taniguchi, Daiki Suehiro & Hiroaki Ogata
Abstract:
Many universities choose the C programming language (C) as the first one they teach their students, early on in their program. However, students often consider programming courses difficult, and these courses often have among the highest dropout rates of computer science courses offered. It is therefore critical to provide more effective instruction to help students understand the syntax of C and prevent them from losing interest in programming. Further, homework and paper-based exams are still the main assessment methods in the majority of classrooms, making it difficult for teachers to grasp students’ learning situation. To facilitate teaching and learning of C, in this article we propose a system—LAPLE (Learning Analytics in Programming Language Education)—that provides a learning dashboard to capture the behavior of students in the classroom and identify the different difficulties that different students face with different areas of knowledge. With LAPLE, teachers may better grasp students’ learning situation in real time and better improve educational materials using analysis results. For their part, novice undergraduate programmers may use LAPLE to locate syntax errors in C and get recommendations from educational materials on how to fix them.

Session: Feedback Systems

Presentation 2D1 (30 min): “Trends and Issues in Student-Facing Learning Analytics Reporting Systems Research”
(Full Research Paper)
by Robert Bodily & Katrien Verbert
Abstract:
We conducted a literature review on systems that track learning analytics data (e.g., resource use, time spent, assessment data, etc.) and provide a report back to students in the form of visualizations, feedback, or recommendations. This review included a rigorous article search process; 945 articles were identified in the initial search. After filtering out articles that did not fit within the scope of this article, 94 articles were included in the final analysis. Articles were coded on five categories chosen based on previous work done in this area: functionality, data sources, design analysis, perceived effects, and actual effects. The purpose of this review is to identify trends in the current student-facing learning analytics reporting system literature and provide recommendations for learning analytics researchers and practitioners for future work.
Presentation 2D2 (30 min): “Playing with Student Data: The Learning Analytics Report Card (LARC)”
(Practitioner Presentation)
by Jeremy Knox
Abstract:
While the field of Learning Analytics continues to develop, examples of student-focused projects have been in short supply. This paper will describe the design and implementation of the Learning Analytics Report Card (or LARC), an experimental project undertaken at the University of Edinburgh. The LARC involved postgraduate students as research partners and active participants in their own data analysis, and used data-to-text methods to generate a written textual report. The LARC allowed students to ‘play’ with their own data; choosing what to include or exclude, and when to generate the report. This paper will summarise key findings from the project.
Presentation 2D3 (30 min): “Uncovering Reviewing and Reflecting Behaviors From Paper-based Formal Assessment”
(Full Research Paper)
by Sharon Hsiao, Po-Kai Huang & Hannah Murphy
Abstract:
In this paper, we study students’ learning effectiveness through their use of a homegrown educational technology, Web Programming Grading Assistant (WPGA), which facilitates grading and feedback delivery of paper-based assessments. We designed a classroom study and collected data from a lower-division blended-instruction computer science class. We tracked and modeled students’ reviewing and reflecting behaviors from WPGA. Results show that students demonstrated an effort and desire to review assessments regardless of whether they were graded for academic performance or for attendance. Hardworking students achieved higher exam scores on average and were found to review their exams and the correct questions frequently. Additionally, student cohorts exhibited similar initial reviewing patterns, but different in-depth reviewing and reflecting strategies. Ultimately, the work contributes to multidimensional learning analytics aggregation across the physical and cyber spheres.

 

