Focus-RCNet: a lightweight recyclable waste classification algorithm based on focus and knowledge distillation

Dashun Zheng, Rongsheng Wang, Yaofei Duan, Patrick Cheong-Iao Pang, Tao Tan

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

Waste pollution is a significant environmental problem worldwide. As living standards rise and consumption patterns grow more diverse, the amount of domestic waste generated has increased dramatically, creating an urgent need for effective treatment. The rapid development of artificial intelligence offers an effective route to automated waste classification. However, the high computational cost and complexity of convolutional neural networks make them unsuitable for real-time embedded applications. In this paper, we propose a lightweight network architecture called Focus-RCNet, designed with reference to the sandglass structure of MobileNetV2, which uses depthwise separable convolutions to extract features from images. The Focus module is introduced to recyclable waste image classification to reduce the spatial dimensionality of features while retaining the relevant information. To make the model attend more closely to waste image features while keeping the number of parameters small, we introduce the SimAM attention mechanism. In addition, knowledge distillation is used to further compress the number of parameters in the model. Trained and tested on the TrashNet dataset, the Focus-RCNet model not only achieved an accuracy of 92% but also showed high deployment mobility.
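The Focus module mentioned in the abstract follows the space-to-channel slicing operation popularized by YOLOv5: each 2×2 spatial block is split into four channel groups, halving height and width while quadrupling channels, so downsampling discards no pixel information. A minimal NumPy sketch of that slicing step (function and variable names are illustrative, not taken from the paper):

```python
import numpy as np

def focus_slice(x):
    """Focus-style space-to-channel slicing.

    x: array of shape (C, H, W) with even H and W.
    Returns an array of shape (4*C, H//2, W//2); every input pixel
    is preserved, just rearranged into extra channels.
    """
    return np.concatenate(
        [x[:, ::2, ::2],     # top-left pixel of each 2x2 block
         x[:, 1::2, ::2],    # bottom-left pixels
         x[:, ::2, 1::2],    # top-right pixels
         x[:, 1::2, 1::2]],  # bottom-right pixels
        axis=0,
    )

x = np.arange(2 * 4 * 4, dtype=np.float32).reshape(2, 4, 4)
y = focus_slice(x)
print(y.shape)  # (8, 2, 2): channels x4, spatial dimensions halved
```

In a full network this slicing is typically followed by a convolution over the expanded channels; here only the lossless rearrangement is shown.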

Original language: English
Article number: 19
Journal: Visual Computing for Industry, Biomedicine, and Art
Volume: 6
Issue number: 1
Publication status: Published - Dec 2023

Keywords

  • Attention
  • Knowledge distillation
  • Lightweight
  • Waste classification
  • Waste recycling
