SSNFNet: An Enhanced Few-Shot Learning Model for Efficient Poultry Farming Detection.

Bibliographic Details
Title: SSNFNet: An Enhanced Few-Shot Learning Model for Efficient Poultry Farming Detection.
Authors: Wang, Bingli; Liu, Daixian; Wu, Jinghua
Source: Animals (2076-2615); Aug 2025, Vol. 15, Issue 15, p2252, 25p
Subject Terms: POULTRY farming, OBJECT recognition (Computer vision), SUSTAINABLE agriculture, AGRICULTURAL technology, MACHINE learning, SMOOTHING (Numerical analysis)
Abstract: Simple Summary: Poultry farming struggles with inefficiency and disease control, while existing smart monitoring systems require large amounts of labeled data to detect new species, limiting their practical use. To address this, we developed SSNFNet, a method that works effectively with only a few training images, making it easier to adapt to different farms. Our approach improves detection in crowded environments where birds often overlap, and it performs well even on new species. We trained the model mostly on ducks and then used a small number of chicken images to teach it to recognize them, demonstrating its ability to learn quickly. We also tested it on goldfish to confirm its flexibility. Results show that our method outperforms standard detection systems and other few-shot learning models by 3.93%. Key improvements include Sharpness-Aware Minimization (SAM), which stabilizes training with limited data, and Soft-NMS, which reduces errors when detecting birds in dense groups. This technology helps farmers monitor poultry more accurately with less manual effort, making smart farming more accessible, especially where collecting large datasets is difficult. By improving efficiency and reducing reliance on extensive labeling, our method supports sustainable and cost-effective poultry farming.

Smart agriculture addresses inefficient resource utilization and disease control in poultry farming. Existing smart monitoring systems detect birds effectively. However, applying these models to new environments or to new species requires a large amount of labeled data and manual work, which limits their wide application. To address this limitation, this paper presents the SSNFNet method, an enhanced few-shot object detection framework tailored for poultry farming contexts. SSNFNet integrates Sharpness-Aware Minimization (SAM) to enhance model generalization by smoothing the loss landscape and improving training stability. To further improve detection in densely populated scenes, we incorporate the Soft Non-Maximum Suppression (Soft-NMS) algorithm to mitigate overlapping-bounding-box issues. Through quantitative analysis and comparison, exemplified by a five-shot scenario on the poultry farming dataset, our method performs significantly better than traditional object detection models. Specifically, it achieves a mean Average Precision (mAP) improvement of 3.93% over the state-of-the-art HTRPN model, raising the mAP from 78.00% to 81.93% while maintaining 8 FPS inference speed on Jetson Nano-class hardware. These results confirm the effectiveness and adaptability of our approach in real-world smart farming environments. [ABSTRACT FROM AUTHOR]
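To make the SAM component concrete, the following is a minimal sketch of a sharpness-aware optimization step in the spirit of Foret et al.'s SAM, written in PyTorch. It is not the paper's training code: the function name sam_step, the radius rho, and the loss_fn/base_optimizer arguments are all illustrative assumptions. The two-pass structure (perturb weights toward the worst-case point, take the gradient there, then update the original weights) is what "smoothing the loss landscape" refers to in the abstract.

import torch

def sam_step(model, loss_fn, batch, base_optimizer, rho=0.05):
    """One sharpness-aware minimization step (illustrative sketch)."""
    inputs, targets = batch

    # 1) Ordinary forward/backward pass to get the gradient direction.
    loss = loss_fn(model(inputs), targets)
    loss.backward()

    # 2) Climb to the approximate worst-case point inside an L2 ball of
    #    radius rho: w_adv = w + rho * g / ||g||.
    grads = [p.grad for p in model.parameters() if p.grad is not None]
    grad_norm = torch.norm(torch.stack([g.norm(p=2) for g in grads]), p=2)
    eps = []
    with torch.no_grad():
        for p in model.parameters():
            if p.grad is None:
                continue
            e = rho * p.grad / (grad_norm + 1e-12)
            p.add_(e)              # perturb the weights in place
            eps.append((p, e))
    model.zero_grad()

    # 3) The gradient at the perturbed weights is the sharpness-aware gradient.
    loss_fn(model(inputs), targets).backward()

    # 4) Restore the original weights, then step with the base optimizer
    #    using the sharpness-aware gradient computed above.
    with torch.no_grad():
        for p, e in eps:
            p.sub_(e)
    base_optimizer.step()
    base_optimizer.zero_grad()
    return loss.item()

The cost is two forward/backward passes per step; rho controls how aggressively the step probes for sharp minima, which is why SAM is commonly reported to help generalization in low-data regimes such as few-shot training.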
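The Soft-NMS component can be sketched similarly. Instead of deleting every box that overlaps the current top detection (as classical NMS does), Soft-NMS decays its score, which matters in dense flocks where heavily overlapping boxes are often genuinely different birds. Below is a small NumPy sketch of the Gaussian-penalty variant of Bodla et al.; the sigma and score_thresh defaults are illustrative, not the paper's settings.

import numpy as np

def iou(box, boxes):
    """IoU between one (x1, y1, x2, y2) box and an array of boxes."""
    x1 = np.maximum(box[0], boxes[:, 0])
    y1 = np.maximum(box[1], boxes[:, 1])
    x2 = np.minimum(box[2], boxes[:, 2])
    y2 = np.minimum(box[3], boxes[:, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    area = lambda b: (b[..., 2] - b[..., 0]) * (b[..., 3] - b[..., 1])
    return inter / (area(box) + area(boxes) - inter + 1e-12)

def soft_nms(boxes, scores, sigma=0.5, score_thresh=0.001):
    """Gaussian Soft-NMS: decay overlapping scores instead of removing boxes."""
    boxes, scores = boxes.copy(), scores.copy()
    keep = []
    while scores.size > 0:
        i = scores.argmax()
        keep.append((boxes[i], scores[i]))
        # Remove the selected box from the candidate pool.
        rest = np.ones(scores.size, dtype=bool)
        rest[i] = False
        boxes, scores = boxes[rest], scores[rest]
        if scores.size == 0:
            break
        # Gaussian decay: the more a box overlaps the kept detection,
        # the more its score is suppressed -- but it is not deleted outright.
        scores = scores * np.exp(-(iou(keep[-1][0], boxes) ** 2) / sigma)
        # Discard only boxes whose decayed score falls below the threshold.
        alive = scores > score_thresh
        boxes, scores = boxes[alive], scores[alive]
    return keep

Compared with hard NMS, the only change is the multiplicative score decay in the loop; overlapping detections survive with reduced confidence rather than vanishing, which is the behavior the abstract credits for fewer errors in dense groups.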
Copyright of Animals (2076-2615) is the property of MDPI and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
Database: Biomedical Index
ISSN: 2076-2615
DOI: 10.3390/ani15152252