ARGA-Unet: Advanced U-net segmentation model using residual grouped convolution and attention mechanism for brain tumor MRI image segmentation

Siyi XUN, Yan ZHANG, Sixu DUAN, Mingwei WANG, Jiangang CHEN, Tong TONG, Qinquan GAO, Chantong LAM, Menghan HU, Tao TAN

Research output: Contribution to journal › Article › peer-review

Abstract

Background: Magnetic resonance imaging (MRI) plays an important role in medical imaging diagnostics, particularly in the diagnosis and treatment of brain tumors, owing to its non-invasive nature and superior soft-tissue contrast. However, because brain tumors are invasive and highly heterogeneous, they appear highly non-uniform in MRI images, with indistinct boundaries. In addition, manual labeling of tumor areas is time-consuming and laborious.

Methods: To address these issues, this study improves the classical U-net segmentation network with a residual grouped convolution module, a convolutional block attention module, and bilinear interpolation upsampling. The influence of network normalization, the loss function, and network depth on segmentation performance is also examined.

Results: In the experiments, the Dice score of the proposed segmentation model reached 97.581%, which is 12.438% higher than that of the traditional U-net, demonstrating effective segmentation of brain tumor MRI images.

Conclusions: The improved U-net network achieves good segmentation of brain tumor MRI images.
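The Dice score reported above is the standard overlap metric for segmentation masks: twice the intersection of the predicted and ground-truth tumor regions, divided by the sum of their sizes. A minimal sketch of this metric, assuming binary masks flattened to 0/1 sequences (the function name and toy masks here are illustrative, not from the paper):

```python
def dice_score(pred, target):
    """Dice coefficient for two binary masks given as flat 0/1 sequences.

    Dice = 2 * |pred ∩ target| / (|pred| + |target|),
    where |.| counts foreground (1) pixels.
    """
    intersection = sum(p * t for p, t in zip(pred, target))
    total = sum(pred) + sum(target)
    if total == 0:
        # Both masks empty: treat as perfect agreement by convention.
        return 1.0
    return 2.0 * intersection / total


# Toy 8-pixel masks: 3 overlapping foreground pixels, 4 foreground each.
pred = [1, 1, 0, 1, 0, 0, 1, 0]
target = [1, 0, 0, 1, 0, 1, 1, 0]
print(dice_score(pred, target))  # → 0.75
```

In practice the metric is computed over full 2D MRI slices or 3D volumes rather than toy lists, and it is also commonly used as a training loss (1 − Dice), which is likely what the paper's loss-function comparison considers.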

Original language: English
Pages (from-to): 203-216
Number of pages: 14
Journal: Virtual Reality and Intelligent Hardware
Volume: 6
Issue number: 3
DOIs
Publication status: Published - Jun 2024

Keywords

  • Attention mechanism
  • Brain tumor
  • Deep learning
  • MRI
  • Segmentation
  • U-net

