Human-computer interaction pattern recognition based on dynamic tactile images

Detailed bibliography
Published in: IEEE Advanced Information Management, Communicates, Electronic and Automation Control Conference (IMCEC ...) (Online), Volume 6; pp. 282-286
Main authors: Meng, Wujun; Kong, Yongkang; Yang, Fuping; Zhao, Yongting; Wei, Dapeng
Format: Conference paper
Language: English
Published: IEEE, 24.05.2024
ISSN: 2693-2776
Description
Summary: With the continuous development of technology, human-machine interaction (HMI) technology plays an increasingly important role in daily life and work and is widely used in applications such as robot control, augmented reality, and virtual reality. Human-computer interaction pattern recognition based on dynamic tactile images can be applied not only to traditional computing devices, such as smartphones and computers, but also to emerging fields such as virtual and augmented reality. By integrating tactile feedback into interaction design, users can perceive the status and response of the system more intuitively, improving the naturalness and immediacy of interaction. This paper proposes a convolutional neural network called TetNet, which uses depthwise separable convolution, a multi-scale feature extraction module, and asymmetric convolutional blocks to recognize human-computer interaction patterns in dynamic tactile images. Data from a total of 20 testers are collected for the experiments; the data are pre-processed with k-means keyframe extraction and fed into the network. Compared with several classical convolutional neural networks, the proposed model achieves a test accuracy of 95.12%.
DOI: 10.1109/IMCEC59810.2024.10575721
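
The summary above names a k-means keyframe extraction step for pre-processing the dynamic tactile sequences. The record gives no details of how it is implemented, so the following is only a minimal sketch of one common way to pick keyframes with k-means: the cluster count, per-frame feature representation (flattened pixel values), and the choice of the frame nearest each centroid are all assumptions, not the authors' method.

```python
# Hedged sketch of k-means keyframe selection for a dynamic tactile sequence.
# Cluster count, features, and selection rule are assumptions for illustration.
import numpy as np
from sklearn.cluster import KMeans

def select_keyframes(frames: np.ndarray, n_keyframes: int = 8) -> np.ndarray:
    """frames: (T, H, W) array of tactile frames; returns (n_keyframes, H, W)."""
    t, h, w = frames.shape
    flat = frames.reshape(t, h * w).astype(np.float32)   # one vector per frame
    km = KMeans(n_clusters=n_keyframes, n_init=10, random_state=0).fit(flat)
    chosen = []
    for c in range(n_keyframes):
        members = np.where(km.labels_ == c)[0]
        # keep the frame closest to each cluster centroid as that cluster's keyframe
        dists = np.linalg.norm(flat[members] - km.cluster_centers_[c], axis=1)
        chosen.append(members[np.argmin(dists)])
    chosen.sort()                                         # preserve temporal order
    return frames[chosen]
```

The summary also lists the building blocks of TetNet: depthwise separable convolution, a multi-scale feature extraction module, and asymmetric convolutional blocks. The sketch below only illustrates what those generic building blocks typically look like in PyTorch; it is not the authors' TetNet, and the channel counts, kernel sizes, branch layout, and the idea of stacking keyframes as input channels are assumptions.

```python
# Hedged sketch of the block types named in the abstract (not the paper's TetNet).
import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """Depthwise convolution followed by a 1x1 pointwise convolution."""
    def __init__(self, in_ch, out_ch, kernel=3):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel,
                                   padding=kernel // 2, groups=in_ch)
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1)
        self.bn = nn.BatchNorm2d(out_ch)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.bn(self.pointwise(self.depthwise(x))))

class AsymmetricConv(nn.Module):
    """Factor a k x k convolution into a 1 x k followed by a k x 1."""
    def __init__(self, in_ch, out_ch, k=3):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, (1, k), padding=(0, k // 2)),
            nn.Conv2d(out_ch, out_ch, (k, 1), padding=(k // 2, 0)),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.conv(x)

class MultiScaleBlock(nn.Module):
    """Parallel branches with different receptive fields, concatenated."""
    def __init__(self, in_ch, branch_ch=16):
        super().__init__()
        self.b1 = DepthwiseSeparableConv(in_ch, branch_ch, kernel=3)
        self.b2 = DepthwiseSeparableConv(in_ch, branch_ch, kernel=5)
        self.b3 = AsymmetricConv(in_ch, branch_ch, k=7)

    def forward(self, x):
        return torch.cat([self.b1(x), self.b2(x), self.b3(x)], dim=1)

# Example usage (shapes are assumptions): stack 8 selected keyframes as channels.
# x = torch.randn(4, 8, 32, 32)      # batch of 4 sequences, 8 keyframes, 32x32 taxels
# y = MultiScaleBlock(8)(x)          # -> (4, 48, 32, 32)
```

A classifier built this way would typically follow such blocks with global pooling and a fully connected layer over the interaction classes; how TetNet actually arranges and sizes these components is not described in this record.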