An intelligent tactile imaging-recognition sensor system enabled via a methoxynitrobenzene-salicylaldehyde fluorescent material


Published in: Materials Horizons, Vol. 12, No. 19, p. 8095
Main authors: Liu, Zihan; Zhao, Xinyi; Duan, Yuai; Li, Yaping; Wang, Zhijia; Wang, Zixuan; Zhang, Jiarong; Yuan, Jing; Geng, Hua; Han, Tianyu
Format: Journal Article
Language: English
Publication details: England, 29 September 2025
ISSN: 2051-6355
Description
Summary: Tactile sensors based on functional materials decode surface textures for object recognition. Herein, we engineer a donor-acceptor fluorescent material, MNIMP, that synergizes aggregation-induced emission (AIE) and twisted intramolecular charge transfer (TICT) mechanisms. Contact-induced nanoflake assembly on the MNIMP film triggers fluorescence amplification mediated by the combined AIE and TICT effects, through which the surface morphology of textured objects can be accurately visualized as fluorescent patterns. MNIMP maps the micro-textures of materials such as rubber, fabrics, and elastic polymers under tactile pressure with kPa-level sensitivity, seamlessly integrating visual and tactile perception. These fluorescent signatures can be recognized by a deep-learning model with >98% accuracy. Hardware integration with the embedded algorithm creates an intelligent tactile sensor system that performs concurrent contact imaging, data analysis, and classification. This intelligent platform demonstrates micron-scale resolution and cost-effective manufacturability while maintaining high signal fidelity across diverse target objects.
DOI: 10.1039/d5mh00731c