Improving Automated Programming Assessments: User Experience Evaluation Using FaSt-generator

Bibliographic Details
Published in:Procedia computer science Vol. 72; pp. 186 - 193
Main Authors: Romli, Rohaida, Sulaiman, Shahida, Zamli, Kamal Zuhairi
Format: Journal Article
Language:English
Published: Elsevier B.V 2015
ISSN:1877-0509
Abstract Automatic Programming Assessment (APA) is a method for automatically marking and grading students’ programming solutions. To realise APA as a tangible deliverable, a number of automated tools, called Automated Programming Assessment Systems (APAS), have been developed and tested over the decades. The common motivations for APAS are reducing lecturers’ workload, giving students timely feedback, and improving the accuracy of grading results. To carry out dynamic testing in APA, it is necessary to prepare an appropriate and adequate set of test data to judge the correctness of students’ programming solutions with respect to functional and/or structural testing. Manually preparing quality test data is a hard, time-consuming, and scarcely feasible task in the practice of both software testing and APA. Thus, automated test data generation is highly desirable to relieve humans of such repetitive work. This paper describes the design, implementation, and user experience evaluation of a tool developed to support APA as a test data generator, called FaSt-generator. The tool furnishes a test set that includes adequate test data to execute both functional and structural testing in APA. Results of the user experience evaluation of FaSt-generator reveal that all subjects held relatively positive opinions and strongly favoured the tool on the criteria of User Perception and End-User Computing Satisfaction (EUCS).
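The abstract’s core idea, judging a student solution by running it against generated test data, can be shown with a minimal sketch. Everything below (the `grade` helper, the reference and student functions, the boundary-value inputs) is hypothetical and illustrative only; it is not taken from FaSt-generator’s actual design.

```python
# Illustrative sketch only: functional testing of a student submission
# against test data derived from a reference solution. Names and the
# boundary-value strategy are assumptions, not FaSt-generator's own API.

def reference_abs(x: int) -> int:
    """Lecturer's reference solution: absolute value."""
    return x if x >= 0 else -x

def student_abs(x: int) -> int:
    """A (buggy) student submission to be assessed."""
    return x  # forgets to negate negative inputs

def generate_test_data() -> list:
    """Boundary-value style inputs around the branch point x == 0."""
    return [-100, -1, 0, 1, 100]

def grade(student, reference, test_data) -> float:
    """Return the fraction of test cases the student solution passes."""
    passed = sum(1 for x in test_data if student(x) == reference(x))
    return passed / len(test_data)

score = grade(student_abs, reference_abs, generate_test_data())
print(f"score: {score:.0%}")  # the buggy submission fails the negative inputs
```

The adequacy of the test set matters here: without the negative boundary inputs, the buggy submission would pass every case and be graded as correct.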
Author_xml – sequence: 1
  givenname: Rohaida
  surname: Romli
  fullname: Romli, Rohaida
  email: aida@uum.edu.my
  organization: School of Computing(SOC), College of Arts and Sciences, Universiti Utara Malaysia, 06010 UUM Sintok, Kedah, Malaysia
– sequence: 2
  givenname: Shahida
  surname: Sulaiman
  fullname: Sulaiman, Shahida
  organization: Faculty of Computing, Universiti Teknologi Malaysia, 81310 UTM Skudai, Johor, Malaysia
– sequence: 3
  givenname: Kamal Zuhairi
  surname: Zamli
  fullname: Zamli, Kamal Zuhairi
  organization: Faculty of Computer System and Engineering, Universiti Malaysia Pahang, Lebuhraya Tun Razak, 26300 Gambang, Kuantan, Pahang, Malaysia
ContentType Journal Article
Copyright 2015 The Authors
DOI 10.1016/j.procs.2015.12.120
Discipline Computer Science
EISSN 1877-0509
EndPage 193
ISSN 1877-0509
IsDoiOpenAccess true
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Keywords functional testing
automatic programming assessment
structural testing
user experience
test data generation
Language English
License http://creativecommons.org/licenses/by-nc-nd/4.0
OpenAccessLink https://dx.doi.org/10.1016/j.procs.2015.12.120
PageCount 8
PublicationCentury 2000
PublicationDate 2015
PublicationDecade 2010
PublicationTitle Procedia computer science
PublicationYear 2015
Publisher Elsevier B.V
StartPage 186
URI https://dx.doi.org/10.1016/j.procs.2015.12.120
Volume 72