Educational Behaviour Analysis Using Convolutional Neural Network and Particle Swarm Optimization Algorithm

Detailed bibliography
Published in: Advances in Multimedia, Volume 2022, pp. 1-10
Main author: Dong, Zhenjiang
Format: Journal Article
Language: English
Publication details: New York: Hindawi, 07.07.2022
John Wiley & Sons, Inc
Wiley
ISSN: 1687-5680, 1687-5699
Description
Summary: With the continuous development of online technology, online education has become a trend. To improve the quality of online education, a comprehensive and effective analysis of educational behaviour is necessary. In this paper, we propose a network model based on the ResNet50 network fused with a bilinear hybrid attention mechanism, together with an adaptive pooling weight algorithm based on the average pooling algorithm, to address the feature mutilation and blurring problems that traditional pooling algorithms cause during image feature extraction. At the same time, the hyperparameters of the convolutional neural network model are adaptively adjusted with the particle swarm optimization algorithm to further improve recognition accuracy. In experimental validation on the NTU-RGB+D and NTU-RGB+D 120 datasets, the recognition accuracy of the proposed method is 88.8% for cross-subject (CS) and 94.7% for cross-view (CV), and 82.8% for cross-subject (CSub) and 84.3% for cross-setup (CSet), respectively. The experimental results show that the proposed algorithm is an effective method for educational behaviour recognition.
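The abstract states that the CNN's hyperparameters are adaptively adjusted with a particle swarm optimization algorithm but gives no implementation details. The sketch below is only a generic, minimal illustration of that idea, not the paper's actual method: the tuned hyperparameters (learning rate and dropout rate), their bounds, and the fitness function are hypothetical stand-ins for training and validating the model on the datasets.

```python
# Minimal particle swarm optimization (PSO) sketch for hyperparameter search.
# Assumption: the hyperparameters [learning_rate, dropout_rate], their bounds,
# and the toy fitness surface are illustrative only; the paper would instead
# evaluate validation accuracy of the trained CNN.
import numpy as np

rng = np.random.default_rng(0)

# Search space bounds for [learning_rate, dropout_rate] (hypothetical).
lower = np.array([1e-4, 0.0])
upper = np.array([1e-1, 0.5])

def fitness(params):
    """Stand-in for validation accuracy of the model trained with these
    hyperparameters; replace with an actual train/evaluate call."""
    lr, dropout = params
    # Toy surface with a single optimum, used only to keep the sketch runnable.
    return -((np.log10(lr) + 2.0) ** 2) - (dropout - 0.3) ** 2

n_particles, n_iters = 10, 30
w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration coefficients

pos = rng.uniform(lower, upper, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest_pos = pos.copy()
pbest_val = np.array([fitness(p) for p in pos])
gbest_pos = pbest_pos[np.argmax(pbest_val)].copy()
gbest_val = pbest_val.max()

for _ in range(n_iters):
    r1, r2 = rng.random((2, n_particles, 2))
    # Standard PSO velocity and position update, clipped to the search bounds.
    vel = w * vel + c1 * r1 * (pbest_pos - pos) + c2 * r2 * (gbest_pos - pos)
    pos = np.clip(pos + vel, lower, upper)
    vals = np.array([fitness(p) for p in pos])
    improved = vals > pbest_val
    pbest_pos[improved], pbest_val[improved] = pos[improved], vals[improved]
    if vals.max() > gbest_val:
        gbest_val = vals.max()
        gbest_pos = pos[np.argmax(vals)].copy()

print("best hyperparameters (lr, dropout):", gbest_pos)
```

In a real setup, each fitness evaluation would train the ResNet50-based model with the candidate hyperparameters and return its validation accuracy, so the swarm size and iteration count would be kept small to bound the total training cost.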
DOI: 10.1155/2022/9449328