Analyzing students' perceptions to improve the design of an automated assessment tool in online distributed programming

Bibliographic Details
Published in: Computers & Education, Vol. 128, pp. 159–170
Main Authors: Daradoumis, Thanasis; Marquès Puig, Joan Manuel; Arguedas, Marta; Calvet Liñan, Laura
Format: Journal Article
Language: English
Published: Elsevier Ltd, 01.01.2019
ISSN: 0360-1315, 1873-782X
Description
Summary: Designing an automated assessment tool in online distributed programming can provide students with a meaningful distributed learning environment that improves their academic performance. However, it is a complex and challenging endeavor that, to the best of our knowledge, has not been investigated yet. To address this research gap, this work presents a new automated assessment tool in online distributed programming, called DSLab. The tool was evaluated in a real long-term online educational experience by analyzing students' perceptions with the aim of improving its design. A quantitative analysis method was employed to collect and analyze data on students' perceptions of whether using the DSLab tool was really a worthwhile experience. Our study shows that the DSLab tool offers acceptable utility and efficiency. It also identifies factors that influence the efficiency of the current design and suggests new functionalities and ideas for improving DSLab.

Highlights:
• A new automated assessment tool in online distributed programming is proposed.
• Students' and teachers' perceptions are analyzed in a real learning environment.
• The aim of the analysis is to improve the design of the automated assessment tool.
• The analysis focuses on tool utility for both students and teachers.
• Factors influencing current design efficiency are identified and new ideas suggested.
DOI: 10.1016/j.compedu.2018.09.021