Published online by Cambridge University Press: 14 June 2019
Superimposing Electronic Navigational Chart (ENC) data on marine radar images can enrich information for navigation. However, direct image superposition is affected by the performance of instruments such as Global Navigation Satellite System (GNSS) receivers and compasses, and errors in these instruments can undermine the effectiveness of the resulting information. We propose a data fusion algorithm based on deep learning to extract robust features from radar images. By deep learning in this context we mean a class of machine learning algorithms, including artificial neural networks, that use multiple layers to progressively extract higher-level features from raw input. We first exploit the target-detection capability of deep learning to identify targets in marine radar images. Then, image processing is performed on the identified targets to determine reference points for consistent data fusion of ENC and marine radar information. Finally, a fusion algorithm is built to merge the marine radar and ENC data according to the determined reference points. The proposed fusion is verified through simulations using ENC data and marine radar images recorded on real ships in narrow waters over a continuous period. The results suggest suitable shoreline edge-matching performance and real-time applicability. The fused image can provide comprehensive information to support navigation, thus enhancing important aspects such as safety.
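The sketch below illustrates, in simplified form, the pipeline outlined above: detect targets in the radar image, use them as reference points, estimate a transform that aligns the ENC raster to the radar frame, and blend the two layers. It is not the authors' implementation; the functions detect_radar_targets and fuse_enc_and_radar are hypothetical stand-ins, the deep-learning detector is replaced by a simple intensity-based placeholder, and both rasters are assumed to be same-sized single-channel OpenCV images.

```python
# Minimal sketch of ENC/radar overlay via reference points; assumes OpenCV
# and same-sized, single-channel uint8 images. Not the paper's algorithm.
import numpy as np
import cv2


def detect_radar_targets(radar_img: np.ndarray) -> list[tuple[int, int]]:
    """Hypothetical placeholder for the deep-learning target detector:
    bright-blob centroids stand in for detected radar targets."""
    _, mask = cv2.threshold(radar_img, 200, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    points = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            points.append((int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])))
    return points


def fuse_enc_and_radar(enc_img: np.ndarray, radar_img: np.ndarray,
                       enc_ref_pts, radar_ref_pts, alpha: float = 0.5) -> np.ndarray:
    """Align the ENC layer to the radar image using matched reference points,
    then blend the two layers for display."""
    src = np.asarray(enc_ref_pts, dtype=np.float32)
    dst = np.asarray(radar_ref_pts, dtype=np.float32)
    # Estimate a similarity transform (rotation, scale, translation) from the
    # reference-point correspondences; at least two matched pairs are needed.
    transform, _ = cv2.estimateAffinePartial2D(src, dst)
    h, w = radar_img.shape[:2]
    enc_aligned = cv2.warpAffine(enc_img, transform, (w, h))
    # Weighted blend so both the chart and the radar echoes remain visible.
    return cv2.addWeighted(radar_img, alpha, enc_aligned, 1.0 - alpha, 0)
```

A similarity transform is used here only as a simple alignment model; the correspondence between ENC and radar reference points, which the paper derives from the detected targets, is assumed to be given.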