HRM-CenterNet: A High-Resolution Real-time Fittings Detection Method

Ke Zhang, Kai Zhao, Xiwang Guo, Xiaohan Feng, Ying Tang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Scopus citations

Abstract

Most successful fittings detectors are anchor-based, which makes it challenging to meet the lightweight and real-time requirements of edge computing systems. We propose a high-resolution real-time network, HRM-CenterNet. First, the lightweight MobileNetV3 is used to extract multi-level features from images. Then, to improve the resolution of the feature maps and reduce the loss of spatial semantic information during image downsampling, a high-resolution feature fusion network based on iterative aggregation is introduced. Finally, we conduct experiments on the PASCAL VOC dataset and a fittings dataset. The results show that HRM-CenterNet improves accuracy as well as robustness, and meets the performance requirements of real-time edge detection.
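The "anchor-free" idea behind CenterNet-style detectors is that each object is represented by a peak in a center-point heatmap, with box width and height regressed at that peak; extracting local maxima replaces anchor matching and non-maximum suppression. The following is a minimal pure-Python sketch of that decoding step only (the function name, threshold, and stride here are illustrative assumptions, not code from the paper, whose pipeline also includes the MobileNetV3 backbone and iterative-aggregation fusion network):

```python
def decode_centers(heatmap, sizes, score_thresh=0.5, stride=4):
    """Turn a center-point heatmap into bounding boxes.

    heatmap: H x W grid (list of lists) of per-cell object-center scores.
    sizes:   H x W grid of (w, h) box sizes predicted per cell.
    Returns a list of (x1, y1, x2, y2, score) boxes in input-image
    coordinates (each grid cell covers `stride` input pixels).
    """
    H, W = len(heatmap), len(heatmap[0])
    boxes = []
    for y in range(H):
        for x in range(W):
            s = heatmap[y][x]
            if s < score_thresh:
                continue
            # Keep only local maxima: the cell must not be exceeded by any
            # of its 8 neighbours (this stands in for NMS in CenterNet).
            neighbours = [
                heatmap[ny][nx]
                for ny in range(max(0, y - 1), min(H, y + 2))
                for nx in range(max(0, x - 1), min(W, x + 2))
                if (ny, nx) != (y, x)
            ]
            if any(n > s for n in neighbours):
                continue
            w, h = sizes[y][x]
            cx, cy = x * stride, y * stride  # cell center in image coords
            boxes.append((cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2, s))
    return boxes

# Tiny usage example: one strong peak plus a weaker neighbour that gets
# suppressed as a non-maximum.
heat = [[0.0] * 5 for _ in range(5)]
heat[2][2] = 0.9
heat[2][3] = 0.6  # adjacent to the 0.9 peak, so it is suppressed
size_map = [[(8, 8)] * 5 for _ in range(5)]
print(decode_centers(heat, size_map))  # → [(4.0, 4.0, 12.0, 12.0, 0.9)]
```

In a real detector the heatmap and size map come from convolutional heads over the fused feature maps; the pure-Python grids here just make the decoding logic self-contained.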

Original language: English (US)
Title of host publication: 2021 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 564-569
Number of pages: 6
ISBN (Electronic): 9781665442077
DOIs
State: Published - 2021
Event: 2021 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2021 - Melbourne, Australia
Duration: Oct 17 2021 - Oct 20 2021

Publication series

Name: Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics
ISSN (Print): 1062-922X

Conference

Conference: 2021 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2021
Country/Territory: Australia
City: Melbourne
Period: 10/17/21 - 10/20/21

All Science Journal Classification (ASJC) codes

  • Electrical and Electronic Engineering
  • Control and Systems Engineering
  • Human-Computer Interaction
