Multi-class detection and orientation recognition of vessels in maritime surveillance

A. Ghahremani, Yitian Kong, Y. Bondarau, P.H.N. de With

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

5 Citations (Scopus)
2 Downloads (Pure)

Abstract

For maritime surveillance, collecting information about vessels and their behavior is of vital importance. This implies reliable vessel detection and determination of the viewing angle to a vessel, which helps in analyzing vessel behavior and in re-identification. This paper presents a vessel classification and orientation recognition system for maritime surveillance. For this purpose, we have established two novel datasets, one for multi-class vessel detection and one for vessel orientation, both made publicly available. Each dataset contains 10,000 training and 1,000 evaluation images with 31,078 vessel labels (10 vessel types and 5 orientation classes). We deploy the VGG/SSD architecture to train two separate CNN models, one for multi-class vessel detection and one for orientation recognition. The two trained models achieve reliable F1 scores of 82% and 76%, respectively.
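The page carries no implementation details beyond the abstract. Purely as an illustrative sketch of how a VGG/SSD setup of this kind can be instantiated, the snippet below uses torchvision's ssd300_vgg16 builder (SSD with a VGG-16 backbone). The choice of this particular builder, the 300x300 input size, and the extra background class are our assumptions; only the class counts (10 vessel types, 5 orientation classes) and the two-model design follow the abstract.

    # Hedged sketch, not the authors' code: SSD with a VGG-16 backbone
    # via torchvision. Class counts follow the abstract; SSD convention
    # reserves index 0 for a background class.
    import torch
    from torchvision.models.detection import ssd300_vgg16

    NUM_VESSEL_CLASSES = 10 + 1  # 10 vessel types + background
    NUM_ORIENT_CLASSES = 5 + 1   # 5 orientation classes + background

    # Two separate detectors, as described in the abstract: one for
    # multi-class vessel detection, one for orientation recognition.
    detector = ssd300_vgg16(num_classes=NUM_VESSEL_CLASSES)
    orienter = ssd300_vgg16(num_classes=NUM_ORIENT_CLASSES)

    # Training-step sketch: in train mode, torchvision detection models
    # take a list of images plus ground-truth targets and return a dict
    # of losses (classification and box regression).
    images = [torch.rand(3, 300, 300)]  # dummy image batch
    targets = [{"boxes": torch.tensor([[20.0, 30.0, 200.0, 180.0]]),
                "labels": torch.tensor([1])}]  # dummy vessel label
    detector.train()
    loss_dict = detector(images, targets)
    loss = sum(loss_dict.values())
    loss.backward()  # an optimizer step would follow in real training

The same loop, run with orientation labels instead of vessel-type labels, would train the second model; the paper's actual hyperparameters and training schedule are not given here.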

Original language: English
Title of host publication: IS&T International Symposium on Electronic Imaging 2019, Image Processing: Algorithms and Systems XVII
Number of pages: 5
DOIs
Publication status: Published - 13 Jan 2019
Event: IS&T International Symposium on Electronic Imaging 2019, Image Processing: Algorithms and Systems XVII - Burlingame, United States
Duration: 13 Jan 2019 - 17 Jan 2019
Conference number: XVII
http://www.imaging.org/site/IST/IST/Conferences/EI/EI_2019/Conference/C_IPAS.aspx

Conference

Conference: IS&T International Symposium on Electronic Imaging 2019, Image Processing: Algorithms and Systems XVII
Abbreviated title: IPAS2019
Country/Territory: United States
City: Burlingame
Period: 13/01/19 - 17/01/19
