Geometrical Matching of SAR and Optical Images Utilizing ASIFT Features for SAR-based Navigation Aided Systems

Sensors ◽  
2019 ◽  
Vol 19 (24) ◽  
pp. 5500 ◽  
Author(s):  
Jakub Markiewicz ◽  
Karol Abratkiewicz ◽  
Artur Gromek ◽  
Wojciech Ostrowski ◽  
Piotr Samczyński ◽  
...  

This article presents a new approach to estimating the shift and rotation between two images acquired by different kinds of imaging sensors. The first image is an orthophotomap created from optical sensors with georeference information; the second is produced by a Synthetic Aperture Radar (SAR) sensor. The proposed solution can be mounted on a flying platform: during flight, the acquired SAR images are compared with the reference optical images, allowing the shift and rotation between the two images, and hence the direct georeferencing error, to be calculated. Since both images carry georeference information, a navigation correction can be computed in cases where drift of the calculated trajectory is expected. The method can be used on platforms where no satellite navigation signal is available and the trajectory is calculated from an inertial navigation system, which is characterized by significant error. The proposed method of estimating the navigation error using the Affine Scale-Invariant Feature Transform (ASIFT) and Structure from Motion (SfM) is described, and techniques for improving the quality of SAR imaging using despeckling filters are presented. The methodology was tested and verified on real-life SAR images, and the differences between the results obtained with several selected despeckling methods were compared and discussed. A deep investigation of the nature of the SAR imaging technique and its noise characteristics allows new algorithms to be developed and implemented on flying platforms to support existing navigation systems in which trajectory error occurs.
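As a rough illustration of the shift-and-rotation estimation step, the sketch below solves the 2D least-squares (Procrustes) alignment problem for a set of already-matched point pairs, such as ASIFT correspondences between the SAR image and the orthophotomap. The function name and the rigid (rotation plus shift, no scale) model are assumptions for illustration, not the paper's exact formulation.

```python
import math

def estimate_rigid_transform(src, dst):
    """Least-squares 2D rotation + translation mapping src onto dst.

    src, dst: lists of matched (x, y) feature coordinates, e.g. ASIFT
    correspondences between a SAR image and a reference orthophotomap.
    Returns (theta_radians, (tx, ty)).
    """
    n = len(src)
    # Centroids of both point sets.
    cx_s = sum(p[0] for p in src) / n
    cy_s = sum(p[1] for p in src) / n
    cx_d = sum(p[0] for p in dst) / n
    cy_d = sum(p[1] for p in dst) / n
    # Cross-covariance terms of the centred point sets.
    sxx = sxy = syx = syy = 0.0
    for (xs, ys), (xd, yd) in zip(src, dst):
        xs, ys = xs - cx_s, ys - cy_s
        xd, yd = xd - cx_d, yd - cy_d
        sxx += xs * xd
        sxy += xs * yd
        syx += ys * xd
        syy += ys * yd
    # Closed-form optimal rotation angle for the 2D Procrustes problem.
    theta = math.atan2(sxy - syx, sxx + syy)
    # Translation that maps the rotated source centroid onto the target centroid.
    tx = cx_d - (cx_s * math.cos(theta) - cy_s * math.sin(theta))
    ty = cy_d - (cx_s * math.sin(theta) + cy_s * math.cos(theta))
    return theta, (tx, ty)
```

For noiseless correspondences the recovered angle and shift are exact; with real ASIFT matches the same least-squares fit would typically be wrapped in an outlier-rejection loop such as RANSAC.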

2011 ◽  
Vol 57 (1) ◽  
pp. 37-42
Author(s):  
Krzysztof Kulpa ◽  
Mateusz Malanowski ◽  
Jacek Misiurewicz ◽  
Piotr Samczynski

Radar and Optical Images Fusion Using Stripmap SAR Data with Multilook Processing

The paper presents real-life data results of SAR and optical image fusion. The fusion was carried out for SAR images obtained in stripmap SAR mode using multilook processing with different methods of final image creation. The aim of the fusion was to enhance target recognition capabilities on the Earth's surface for a simple single-channel SAR receiver.
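The multilook step referred to above can be sketched as an incoherent average of single-look SAR intensity images, which trades azimuth resolution for speckle reduction. This is a minimal illustration under that standard definition, not the authors' specific processing chain.

```python
def multilook(looks):
    """Incoherently average L single-look SAR intensity images.

    looks: list of L images with identical dimensions, each a list of
    rows of intensity values. Averaging L independent looks reduces the
    speckle variance roughly by a factor of L, at the cost of the
    resolution spent on forming the independent looks.
    """
    n_looks = len(looks)
    rows, cols = len(looks[0]), len(looks[0][0])
    return [[sum(looks[k][i][j] for k in range(n_looks)) / n_looks
             for j in range(cols)]
            for i in range(rows)]
```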


Author(s):  
A. Masiero ◽  
A. Guarnieri ◽  
A. Vettore ◽  
F. Pirotti

The continuous technological improvement of mobile devices opens the frontiers of Mobile Mapping systems to very compact systems, i.e. a smartphone or a tablet. This motivates the development of efficient 3D reconstruction techniques based on the sensors typically embedded in such devices, i.e. imaging sensors, GPS and an Inertial Navigation System (INS). Such methods usually exploit photogrammetric techniques (structure from motion) to estimate the geometry of the scene.

3D reconstruction techniques such as structure from motion rely on features properly matched across different images to compute the 3D positions of objects by means of triangulation. Hence, correct feature matching is of fundamental importance to ensure good-quality 3D reconstructions.

Matching methods are based on the appearance of features, which can change as a consequence of variations in camera position and orientation, and in environment illumination. For this reason, several methods have been developed in recent years to provide feature descriptors that are robust (ideally invariant) to such variations, e.g. the Scale-Invariant Feature Transform (SIFT), Affine SIFT, the Hessian-affine and Harris-affine detectors, and Maximally Stable Extremal Regions (MSER).

This work deals with the integration of information provided by the INS into the feature matching procedure: a previously developed navigation algorithm is used to constantly estimate the device position and orientation. This information is then exploited to estimate the transformation of feature regions between two camera views. Regions from different images associated with the same feature can thus be compared as if seen from the same point of view, significantly easing the comparison of feature characteristics and, consequently, improving matching. SIFT-like descriptors are used to ensure good matching results in the presence of illumination variations and to compensate for the approximations related to the estimation process.
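A minimal sketch of pose-aided matching of this kind, under assumed simplifications (a 2D rotation-plus-shift prior standing in for the full INS-derived region transformation, and plain sum-of-squared-differences on descriptor vectors; all names are hypothetical):

```python
import math

def match_with_pose_prior(feats_a, feats_b, theta, shift, gate):
    """Nearest-neighbour descriptor matching gated by a pose prior.

    feats_a, feats_b: lists of (x, y, descriptor) with descriptor a list
    of floats (SIFT-like). theta, shift: predicted image-to-image
    rotation (radians) and translation from the navigation filter
    (a simplified stand-in for the estimated region transformation).
    gate: maximum pixel distance between the predicted and the observed
    feature position for a pair to be considered at all.
    Returns a list of (index_in_a, index_in_b) matches.
    """
    c, s = math.cos(theta), math.sin(theta)
    matches = []
    for i, (xa, ya, da) in enumerate(feats_a):
        # Predict where this feature should appear in the second image.
        px = c * xa - s * ya + shift[0]
        py = s * xa + c * ya + shift[1]
        best, best_dist = None, float("inf")
        for j, (xb, yb, db) in enumerate(feats_b):
            if math.hypot(xb - px, yb - py) > gate:
                continue  # outside the pose-predicted search region
            d = sum((u - v) ** 2 for u, v in zip(da, db))  # descriptor SSD
            if d < best_dist:
                best, best_dist = j, d
        if best is not None:
            matches.append((i, best))
    return matches
```

The gate is what the INS prior buys: descriptor distances are only computed inside a small predicted region, which both speeds up matching and suppresses ambiguous matches elsewhere in the image.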


2020 ◽  
Vol 12 (18) ◽  
pp. 2928
Author(s):  
Jan Mortier ◽  
Gaël Pagès ◽  
Jordi Vilà-Valls

Global Navigation Satellite Systems (GNSS) are the technology of choice for outdoor positioning but have many limitations when used in safety-critical applications such as Intelligent Transportation Systems (ITS) and Unmanned Autonomous Systems (UAS). Namely, their performance clearly degrades in harsh propagation conditions, and they are not reliable in the face of possible attacks or interference. Moreover, GNSS signals may not be available in so-called GNSS-denied environments, such as deep urban canyons or indoors, and standard GNSS architectures do not provide the precision needed in ITS. Among the different alternatives, cellular signals (LTE/5G) may provide coverage in constrained urban environments, and Ultra-Wideband (UWB) ranging is a promising solution for achieving high positioning accuracy. The key points impacting any time-of-arrival (TOA)-based navigation system are (i) the transmitters' geometry, (ii) perfectly known transmitter positions, and (iii) the environment. In this contribution, we analyze the performance loss of alternative TOA-based navigation systems in real-life applications where transmitter-position mismatch, harsh propagation environments, and GNSS-denied conditions may occur together. In addition, we propose new robust filtering methods able to cope with these effects up to a certain extent. Illustrative results in realistic scenarios are provided to support the discussion and show the performance improvement brought by the new methodologies with respect to the state of the art.
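A TOA-based position fix of the kind discussed can be sketched as a Gauss-Newton least-squares solve over range residuals. The 2D setting, the anchor layout, and the function interface are illustrative assumptions; the point is that any transmitter-position mismatch shows up directly as a range residual, which is what robust filters must absorb.

```python
import math

def toa_position(anchors, ranges, x0=(0.0, 0.0), iters=20):
    """2D TOA multilateration by Gauss-Newton least squares.

    anchors: known transmitter positions (x, y); ranges: TOA-derived
    distances to each anchor. Iteratively refines the estimate x0 by
    solving the 2x2 normal equations of the linearized range model.
    """
    x, y = x0
    for _ in range(iters):
        # Accumulate J^T J and J^T r for residuals r_i = d_i - ||p - a_i||.
        h11 = h12 = h22 = g1 = g2 = 0.0
        for (ax, ay), d in zip(anchors, ranges):
            dist = math.hypot(x - ax, y - ay) or 1e-12
            ux, uy = (x - ax) / dist, (y - ay) / dist  # unit vector = Jacobian row
            r = d - dist
            h11 += ux * ux; h12 += ux * uy; h22 += uy * uy
            g1 += ux * r; g2 += uy * r
        det = h11 * h22 - h12 * h12
        if abs(det) < 1e-12:
            break  # degenerate geometry (anchors collinear with the estimate)
        dx = (h22 * g1 - h12 * g2) / det
        dy = (h11 * g2 - h12 * g1) / det
        x, y = x + dx, y + dy
        if math.hypot(dx, dy) < 1e-10:
            break  # converged
    return x, y
```

A robust variant of the kind the abstract alludes to would down-weight large residuals (e.g. with a Huber loss) inside the same loop instead of trusting every range equally.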


2021 ◽  
Vol 13 (7) ◽  
pp. 1295
Author(s):  
Massimo Selva

The need to observe and characterize the environment leads to a constant increase of the spatial, spectral, and radiometric resolution of new optical sensors [...]

