MASK GUIDED SPATIAL-TEMPORAL FUSION NETWORK FOR MULTIPLE OBJECT TRACKING

Shuangye Zhao, Yubin Wu, Shuai Wang, Wei Ke, Hao Sheng

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

Multi-object trackers associate targets almost perfectly when no occlusion occurs between two or more targets. However, partial occlusion caused by a nearby object makes it hard to extract reliable features, which often leads to tracking failure. In this paper, we use a mask to guide the attention of the neural network so that it focuses on the visible part of the target, and we design a tracklet-level feature extraction method. We then propose a tracking framework based on a mask-guided fusion network and the multiple hypothesis tracking algorithm. Comprehensive evaluation on the MOT17 dataset shows that our approach achieves competitive results.
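
The following is a minimal sketch of the idea described in the abstract: a visibility mask modulates spatial attention over a detection crop's feature map, and per-detection embeddings are then fused into a tracklet-level feature. The module names, dimensions, backbone, and averaging-based fusion are assumptions for illustration only, not the network proposed in the paper.

```python
# Hypothetical sketch of mask-guided spatial attention plus tracklet-level fusion.
# All layer choices and shapes are illustrative assumptions, not the paper's model.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MaskGuidedFeatureExtractor(nn.Module):
    def __init__(self, in_channels=3, feat_channels=128):
        super().__init__()
        # Small conv stack standing in for a re-ID feature backbone.
        self.backbone = nn.Sequential(
            nn.Conv2d(in_channels, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, feat_channels, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        # Spatial attention map predicted from the feature map.
        self.attn = nn.Conv2d(feat_channels, 1, kernel_size=1)

    def forward(self, crop, mask):
        # crop: (B, 3, H, W) detection crop; mask: (B, 1, H, W) visibility mask in [0, 1].
        feat = self.backbone(crop)                               # (B, C, H', W')
        mask_ds = F.interpolate(mask, size=feat.shape[-2:],
                                mode="bilinear", align_corners=False)
        # Guide the learned attention toward the visible (unoccluded) region.
        attn = torch.sigmoid(self.attn(feat)) * mask_ds          # (B, 1, H', W')
        weighted = feat * attn
        # Masked global average pooling -> one embedding per detection.
        emb = weighted.sum(dim=(2, 3)) / attn.sum(dim=(2, 3)).clamp(min=1e-6)
        return emb                                               # (B, C)


def tracklet_embedding(per_frame_embs):
    # Simple temporal fusion: average the per-detection embeddings of one tracklet.
    # (A stand-in for the tracklet-level fusion network, whose details are not given here.)
    return torch.stack(per_frame_embs, dim=0).mean(dim=0)


if __name__ == "__main__":
    extractor = MaskGuidedFeatureExtractor()
    crops = torch.randn(4, 3, 128, 64)            # four detections of one tracklet
    masks = (torch.rand(4, 1, 128, 64) > 0.3).float()
    embs = extractor(crops, masks)                # (4, 128)
    track_emb = tracklet_embedding(list(embs))    # (128,) tracklet-level feature
    print(track_emb.shape)
```

In a tracking-by-detection pipeline, such tracklet-level embeddings would feed the association step (here, multiple hypothesis tracking), so that occluded pixels contribute little to the appearance similarity between tracklets and new detections.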

Original language: English
Title of host publication: 2022 IEEE International Conference on Image Processing, ICIP 2022 - Proceedings
Publisher: IEEE Computer Society
Pages: 3231-3235
Number of pages: 5
ISBN (Electronic): 9781665496209
DOIs
Publication status: Published - 2022
Event: 29th IEEE International Conference on Image Processing, ICIP 2022 - Bordeaux, France
Duration: 16 Oct 2022 – 19 Oct 2022

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
ISSN (Print): 1522-4880

Conference

Conference: 29th IEEE International Conference on Image Processing, ICIP 2022
Country/Territory: France
City: Bordeaux
Period: 16/10/22 – 19/10/22

Keywords

  • Multi-object tracking
  • feature extraction
  • mask guided network
  • multiple hypothesis tracking
  • tracking by detection
