TransMRSR: transformer-based self-distilled generative prior for brain MRI super-resolution

Shan Huang, Xiaohong Liu, Tao Tan, Menghan Hu, Xiaoer Wei, Tingli Chen, Bin Sheng

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)


Magnetic resonance images (MRI) are often acquired with low through-plane resolution to save scan time and cost. However, the poor resolution along that orientation is insufficient to meet the high-resolution requirements of early brain-disease diagnosis and morphometric studies. Common single-image super-resolution (SISR) solutions face two main challenges: (1) combining local detail with global anatomical structural information; and (2) performing large-scale restoration when reconstructing thick-slice MRI into high-resolution (HR) isotropic data. To address these problems, we propose TransMRSR, a novel two-stage network for brain MRI SR that uses convolutional blocks to extract local information and transformer blocks to capture long-range dependencies. TransMRSR consists of three modules: shallow local feature extraction, deep non-local feature capture, and HR image reconstruction. In the first stage, we perform a generative task to encapsulate diverse priors into a generative adversarial network (GAN), which serves as the decoder sub-module of the deep non-local feature capture part. The pre-trained GAN is then used in the second stage for the SR task. We further eliminate the potential latent-space shift caused by the two-stage training strategy through a self-distilled truncation trick. Extensive experiments show that our method achieves performance superior to other SISR methods on both public and private datasets.
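The "truncation trick" mentioned in the abstract originates in GAN literature (e.g. StyleGAN): a latent code is pulled toward the mean latent to trade diversity for fidelity. TransMRSR's self-distilled variant is specific to the paper and not reproduced here; the sketch below only shows the generic truncation step it builds on, with hypothetical latent vectors represented as plain Python lists.

```python
def truncate_latent(w, w_mean, psi=0.7):
    """Generic truncation trick: interpolate a latent code `w`
    toward the average latent `w_mean` with strength `psi`
    (psi=1 leaves w unchanged, psi=0 collapses to the mean).
    The self-distilled variant in TransMRSR refines this to
    suppress the latent-space shift between its two training
    stages; see the paper for the exact procedure."""
    return [m + psi * (x - m) for x, m in zip(w, w_mean)]

# Hypothetical example: a 4-dim latent code and its running mean.
w = [1.0, -2.0, 0.5, 3.0]
w_mean = [0.0, 0.0, 0.0, 0.0]
w_trunc = truncate_latent(w, w_mean, psi=0.5)
# With a zero mean, truncation simply scales the code by psi.
print(w_trunc)  # [0.5, -1.0, 0.25, 1.5]
```

With a non-zero mean the same call shrinks only the deviation from the mean, which is why truncation stabilizes generation without discarding the learned prior.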

Original language: English
Pages (from-to): 3647-3659
Number of pages: 13
Journal: Visual Computer
Issue number: 8
Publication status: Published - Aug 2023


  • Generative prior
  • Magnetic resonance images
  • Super-resolution
  • Transformer


