Enhancing low-frequency stitch code generation for knitted fabrics: an LFSCG-E-Net approach.

Bibliographic Details
Title: Enhancing low-frequency stitch code generation for knitted fabrics: an LFSCG-E-Net approach.
Authors: Liang, Jinxing, Han, Kaifang, Li, Dongsheng, Gao, Ruixin, Peng, Jiajia, Peng, Tao, Hu, Xinrong
Source: Visual Computer; Aug 2025, Vol. 41, Issue 10, p7925-7938, 14p
Subject Terms: KNITTING patterns, FEATURE extraction, IMAGE intensifiers, SOURCE code, MANUFACTURING processes
Abstract: Stitch code generation technology plays a crucial role in enhancing designers' efficiency during the production process and shortening the production cycle. However, existing methods suffer from low accuracy when generating low-frequency stitch codes, which are vital for complex knitting patterns. In this paper, we propose LFSCG-E-Net, an enhanced model based on InverseKnit that aims to improve the generation accuracy of low-frequency stitch codes. By integrating a feature pyramid network into RefinerNet, we improve the texture detail enhancement of input images. Furthermore, we introduce residual attention blocks with convolutional block attention modules into InferNet to boost feature extraction capabilities. Additionally, a pyramidal feature hierarchy module is developed to extract and merge multi-scale feature maps by combining spatial pyramid depth and atrous spatial pyramid pooling. To tackle class imbalance, we incorporate the Dice coefficient into the loss function. Experiments on a public dataset demonstrate that our model achieves an overall accuracy of 94.9% and a foreground accuracy of 82.87%, outperforming state-of-the-art methods, especially in low-frequency stitch code generation. This work not only enhances the precision of stitch code generation but also contributes to the automation of knitting pattern design, meeting the industry's demand for efficiency and cost-effectiveness. The source code is available at https://github.com/1033216625/LFSCG-E-Net-. [ABSTRACT FROM AUTHOR]
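As a worked illustration of the abstract's class-imbalance remedy, the sketch below shows one common way to fold a Dice term into a segmentation-style loss over per-pixel stitch-code labels. It is a minimal sketch under assumed conventions (PyTorch, softmax probabilities, a smoothing constant, and a hypothetical cross-entropy pairing), not the authors' implementation from the linked repository.

import torch
import torch.nn.functional as F

def soft_dice_loss(logits, targets, num_classes, smooth=1.0):
    """Multi-class soft Dice loss.
    logits:  (N, C, H, W) raw scores over C stitch-code classes
    targets: (N, H, W) integer stitch-code labels
    smooth:  additive constant guarding against empty classes (assumed value)
    """
    probs = F.softmax(logits, dim=1)
    one_hot = F.one_hot(targets, num_classes).permute(0, 3, 1, 2).float()
    dims = (0, 2, 3)  # reduce over batch and spatial axes, keep per-class terms
    intersection = (probs * one_hot).sum(dims)
    cardinality = probs.sum(dims) + one_hot.sum(dims)
    dice = (2.0 * intersection + smooth) / (cardinality + smooth)
    # Averaging per-class Dice gives rare (low-frequency) codes the same
    # weight as common ones, which is the point of adding the Dice term.
    return 1.0 - dice.mean()

def combined_loss(logits, targets, num_classes, dice_weight=0.5):
    # Hypothetical pairing with cross-entropy; the abstract does not state
    # the paper's exact weighting.
    ce = F.cross_entropy(logits, targets)
    return ce + dice_weight * soft_dice_loss(logits, targets, num_classes)

Because the Dice term normalizes each class by its own cardinality, a stitch code covering a handful of pixels contributes as much gradient as a dominant background class, which plain cross-entropy does not provide.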
Database: Complementary Index
ISSN: 0178-2789
DOI: 10.1007/s00371-025-03846-4