Small targets are easily lost and misjudged in detection tasks because of their relatively low resolution in images. To address the large recognition error and weak generalisation that small goods exhibit in commodity recognition under complex conditions such as hand occlusion and poor lighting, this paper proposes an adaptive small-goods recognition method based on adversarial networks. First, an end-to-end network model adapted to the adversarial process is designed using priors from the adversarial and recognition processes: a super-resolution network serves as the generator, reconstructing the semantic information of small goods, while a recognition network serves as the discriminator, classifying the input image and judging its authenticity. In addition, a new target feature reconstruction function is designed to reconstruct discriminative feature information. To reduce the recognition error caused by differences in commodity appearance, a feature-map attention mechanism is proposed to enlarge the receptive field of the recognition network, so that both large and small goods are recognised more accurately. On the SVHN dataset, the CIFAR-10 dataset, and a homemade small-goods dataset, the adversarial structure improves recognition accuracy by 4.1% over a cascade structure. Ablation experiments further verify the effectiveness of the target feature reconstruction function for feature reconstruction and of the feature-map attention mechanism for small-goods recognition. The experimental results show that the method improves the accuracy of small-goods recognition and is robust to multi-scale goods and partial occlusion.
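The abstract does not give implementation details, so the following Python sketch (PyTorch assumed) only illustrates the described structure: a super-resolution generator, a recognition network used as the discriminator with two heads (authenticity and category), SE-style channel attention as one plausible reading of the feature-map attention mechanism, and a generator loss that combines an adversarial term, a classification term, and a feature reconstruction term. All module names, layer sizes, and loss weights (SRGenerator, RecognizerDiscriminator, ChannelAttention, generator_loss, lam_feat, lam_cls) are illustrative assumptions, not taken from the paper.

# Minimal sketch, assuming PyTorch; architecture details are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelAttention(nn.Module):
    # SE-style feature-map attention (an assumption about the paper's
    # mechanism): reweights channels using globally pooled context.
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())

    def forward(self, x):
        w = self.fc(x.mean(dim=(2, 3)))           # (B, C) global average pool
        return x * w.unsqueeze(-1).unsqueeze(-1)  # rescale each feature map

class SRGenerator(nn.Module):
    # Generator: reconstructs a higher-resolution (x4) commodity image.
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 3 * 16, 3, padding=1), nn.PixelShuffle(4))

    def forward(self, lr):
        return torch.tanh(self.body(lr))

class RecognizerDiscriminator(nn.Module):
    # The recognition network doubles as the discriminator: one head
    # judges real/fake, the other predicts the commodity category.
    def __init__(self, num_classes):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            ChannelAttention(64),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            ChannelAttention(128),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.adv_head = nn.Linear(128, 1)             # authenticity judgement
        self.cls_head = nn.Linear(128, num_classes)   # category prediction

    def forward(self, x):
        f = self.features(x)
        return self.adv_head(f), self.cls_head(f), f

def generator_loss(D, sr, hr, labels, lam_feat=1.0, lam_cls=1.0):
    # Adversarial + classification + target feature reconstruction terms.
    # The feature term pulls discriminator features of the reconstructed
    # image toward those of the real high-resolution image; treating the
    # paper's reconstruction function as feature matching is an assumption.
    adv_sr, cls_sr, feat_sr = D(sr)
    with torch.no_grad():
        _, _, feat_hr = D(hr)
    adv = F.binary_cross_entropy_with_logits(adv_sr, torch.ones_like(adv_sr))
    cls = F.cross_entropy(cls_sr, labels)
    feat = F.l1_loss(feat_sr, feat_hr)
    return adv + lam_cls * cls + lam_feat * feat

In a full training loop this generator step would alternate with a discriminator step that applies the usual real/fake loss to high-resolution images versus detached generator outputs, alongside the classification loss on labelled images.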
@article{f1342024ijsea13041004,
  author  = {Feng Liu and Shunli Li},
  title   = {Identification of Small Goods under the Adversarial Network},
  journal = {International Journal of Science and Engineering Applications (IJSEA)},
  volume  = {13},
  number  = {4},
  pages   = {17--21},
  year    = {2024}
}