
Towards Adaptive Educational Assessments: Predicting Student Performance using Temporal Stability and Data Analytics in Learning Management Systems

by Gautam Thakur, Mohammed M. Olama, Allen W. McNair III, Sreenivas R. Sukumar
Publication Type: Conference Paper
Conference Name: 20th ACM SIGKDD Conference on Knowledge Discovery and Data Mining - ACCESS
Conference Location: New York City, New York, United States of America

Data-driven assessments and adaptive feedback are becoming a cornerstone of research in educational data analytics, which involves developing methods for exploring the unique types of data that come from the educational context. For example, predicting college student performance is crucial for both students and educational institutions: it can support timely interventions to prevent students from failing a course, increase the efficacy of advising functions, and improve course completion rates. In this paper, we present our efforts in using data analytics to enable educators to design novel data-driven assessment and feedback mechanisms. To achieve this objective, we investigate the temporal stability of students' grades and perform predictive analytics on academic data collected from 2009 through 2013 in Moodle, one of the most commonly used learning management systems. First, we identify the data features useful for assessment and for predicting student outcomes, such as students' scores on homework assignments, quizzes, and exams, their activity in discussion forums, and their total Grade Point Average (GPA) in the term they enrolled in the course. Second, time series models in both the frequency and time domains are applied to characterize the progression of grades as well as their overall projections. In particular, these models analyze the stability and fluctuation of grades across the collegiate years (from freshman to senior) and across disciplines. Third, Logistic Regression and Neural Network predictive models are used to identify, as early as possible, students who are in danger of failing the course in which they are currently enrolled. These models compute the likelihood of any given student failing (or passing) the current course. The time series analysis indicates that assessments and continuous feedback are more critical for freshmen and sophomores (even in easy courses) than for seniors, and that such assessments may be provided using the predictive models. Numerical results are presented to evaluate and compare the performance of the developed models and their predictive accuracy.
Our results show strong associations between performance in the first few weeks of coursework and final course outcomes, which has implications for the design and distribution of individual course modules.
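
The first step, assembling assessment features from Moodle activity and registrar records, might look roughly like the sketch below. This is a minimal illustration only: the tables and column names (grades, forum, registrar, student_id, item_type, post_id, term_gpa) are hypothetical placeholders, not the paper's actual Moodle schema or pipeline.

```python
import pandas as pd

def build_feature_table(grades: pd.DataFrame, forum: pd.DataFrame,
                        registrar: pd.DataFrame) -> pd.DataFrame:
    """Aggregate the feature families named in the abstract: homework, quiz, and
    exam scores, discussion-forum activity, and term GPA (one row per student)."""
    # Mean score per assessment type (homework / quiz / exam) for each student.
    scores = (grades
              .pivot_table(index="student_id", columns="item_type",
                           values="score", aggfunc="mean")
              .rename(columns={"homework": "avg_homework",
                               "quiz": "avg_quiz",
                               "exam": "avg_exam"}))
    # Simple proxy for forum activity: number of posts per student.
    posts = forum.groupby("student_id")["post_id"].count().rename("forum_posts")
    # GPA for the term in which the student took the course.
    gpa = registrar.set_index("student_id")["term_gpa"]
    return scores.join(posts, how="left").join(gpa, how="left").fillna(0.0)
```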
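
For the second step, a time- and frequency-domain characterization of a single student's weekly grade trajectory could be sketched as follows. The particular statistics (variance, lag-1 autocorrelation, dominant FFT frequency) are assumptions chosen to illustrate "stability" and "fluctuation"; they are not necessarily the time series models used in the paper.

```python
import numpy as np

def characterize_trajectory(weekly_grades) -> dict:
    """Time-domain stability (mean, variance, lag-1 autocorrelation) and a
    frequency-domain view (dominant fluctuation frequency) of a weekly grade series."""
    g = np.asarray(weekly_grades, dtype=float)
    centered = g - g.mean()
    # Lag-1 autocorrelation: values near 1 indicate a stable, slowly drifting series.
    lag1 = float(np.corrcoef(g[:-1], g[1:])[0, 1]) if len(g) > 2 else np.nan
    # Magnitude spectrum of the centered series; a peak away from zero frequency
    # signals periodic swings (e.g., dips around exam weeks).
    spectrum = np.abs(np.fft.rfft(centered))
    freqs = np.fft.rfftfreq(len(g), d=1.0)  # cycles per week
    dominant = float(freqs[1:][np.argmax(spectrum[1:])]) if len(g) > 2 else np.nan
    return {"mean": float(g.mean()), "variance": float(g.var()),
            "lag1_autocorr": lag1, "dominant_freq": dominant}
```

Under this reading, a stable trajectory shows high lag-1 autocorrelation and little spectral energy away from zero frequency, while a fluctuating one shows the opposite.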
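
The third step fits both model families on features observed up to an early cutoff week and scores each enrollment's likelihood of failing. The sketch below assumes scikit-learn; the hidden-layer size, train/test split, and AUC metric are illustrative choices, not the paper's reported experimental setup.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import roc_auc_score

def early_warning_models(X: np.ndarray, y: np.ndarray, seed: int = 0) -> dict:
    """Fit Logistic Regression and a small Neural Network on features available
    up to a cutoff week. X: one row per enrollment; y: 1 = failed the course."""
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.25, stratify=y, random_state=seed)
    models = {
        "logistic_regression": make_pipeline(
            StandardScaler(), LogisticRegression(max_iter=1000)),
        "neural_network": make_pipeline(
            StandardScaler(), MLPClassifier(hidden_layer_sizes=(16,),
                                            max_iter=2000, random_state=seed)),
    }
    results = {}
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        p_fail = model.predict_proba(X_te)[:, 1]  # likelihood of failing
        results[name] = {"auc": roc_auc_score(y_te, p_fail), "p_fail": p_fail}
    return results
```

The resulting failure probabilities can then be thresholded to flag at-risk students for early intervention and continuous feedback.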