Detection of Moving Target Direction for Ground Surveillance Radar Based on Deep Learning / Yer Gözetleme Radarı İçin Derin Öğrenme Tabanlı Hareketli Hedef Yönü Tespiti

ÖMEROĞLU A. N., Mohammed H. M. A., ORAL E. A., ÖZBEK İ. Y.

30th Signal Processing and Communications Applications Conference, SIU 2022, Safranbolu, Turkey, 15 - 18 May 2022

  • Publication Type: Conference Paper / Full Text
  • DOI Number: 10.1109/SIU55565.2022.9864847
  • City: Safranbolu
  • Country: Turkey
  • Keywords: DCGAN, detection of moving target direction, micro-Doppler signature
  • Ataturk University Affiliated: Yes


© 2022 IEEE. In defense and security applications, detecting the direction of a moving target is as important as target detection and/or target classification. In this study, a methodology for classifying different mobile targets as approaching or receding was proposed for ground surveillance radar data, and convolutional neural networks (CNNs) based on transfer learning were employed for this purpose. To improve classification performance, the use of two key concepts, namely the Deep Convolutional Generative Adversarial Network (DCGAN) and decision fusion, was proposed. With the DCGAN, the limited amount of data available for training was increased, creating a larger training dataset with a distribution identical to that of the original data for both moving directions. This synthetic data was then used along with the original training data to train three different pre-trained deep convolutional networks. Finally, the classification results obtained from these networks were combined with a decision fusion approach. To evaluate the performance of the proposed method, the publicly available RadEch dataset, consisting of eight ground target classes, was utilized. Experimental results showed that the combined use of the proposed DCGAN and decision fusion methods increased the moving-target direction detection accuracy for the person, vehicle, and group-of-persons classes, and for all target groups combined, by 13.63%, 10.01%, 14.82%, and 8.62%, respectively.
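The abstract does not specify which fusion rule combines the three networks' outputs; soft voting, i.e. averaging per-network class probabilities, is one common choice. A minimal sketch under that assumption (the network outputs below are illustrative values, not results from the paper):

```python
import numpy as np

# Hypothetical softmax outputs of three transfer-learned CNNs for one
# micro-Doppler spectrogram; classes: 0 = approaching, 1 = receding.
probs_net1 = np.array([0.70, 0.30])
probs_net2 = np.array([0.55, 0.45])
probs_net3 = np.array([0.40, 0.60])

def fuse_decisions(prob_list):
    """Soft-voting decision fusion: average the class-probability vectors
    across networks, then pick the class with the highest mean probability."""
    mean_probs = np.mean(prob_list, axis=0)
    return int(np.argmax(mean_probs)), mean_probs

label, fused = fuse_decisions([probs_net1, probs_net2, probs_net3])
print(label, fused)  # fused label (0 = approaching) and mean probabilities
```

Hard voting (majority of per-network argmax labels) would be an equally plausible alternative; soft voting is shown here because it retains each network's confidence in the final decision.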