Analyzing students' perceptions to improve the design of an automated assessment tool in online distributed programming


Detailed bibliography
Published in: Computers and Education, Volume 128, pp. 159-170
Main authors: Daradoumis, Thanasis; Marquès Puig, Joan Manuel; Arguedas, Marta; Calvet Liñan, Laura
Format: Journal Article
Language: English
Publication details: Elsevier Ltd, 01.01.2019
ISSN: 0360-1315, 1873-782X
Description
Summary: Designing an automated assessment tool in online distributed programming can provide students with a meaningful distributed learning environment that improves their academic performance. However, it is a complex and challenging endeavor that, as far as we know, has not been investigated yet. To address this research gap, this work presents a new automated assessment tool in online distributed programming, called DSLab. The tool was evaluated in a real long-term online educational experience by analyzing students' perceptions with the aim of improving its design. A quantitative analysis method was employed to collect and analyze data concerning students' perceptions as to whether using the DSLab tool was really a worthwhile experience. Our study shows that the DSLab tool offers acceptable utility and efficiency features. It also identifies factors that influence current design efficiency, with the aim of improving the DSLab design by suggesting new functionalities and ideas.

Highlights:
• A new automated assessment tool in online distributed programming is proposed.
• The students' and teachers' perceptions are analyzed in a real learning environment.
• The aim of the analysis is to improve the design of the automated assessment tool.
• The analysis focused on tool utility both for students and teachers.
• Factors were identified that influence current design efficiency and suggest new ideas.
DOI: 10.1016/j.compedu.2018.09.021