Laser and LIDAR in a System for Visibility Distance Estimation in Fog Conditions

Sensors ◽  
2020 ◽  
Vol 20 (21) ◽  
pp. 6322
Author(s):  
Razvan-Catalin Miclea ◽  
Ciprian Dughir ◽  
Florin Alexa ◽  
Florin Sandru ◽  
Ioan Silea

Visibility is a critical factor for transportation, whether air, water, or ground transportation. The biggest trend in the automotive industry is autonomous driving; the number of autonomous vehicles will increase exponentially, prompting changes in both the industry and the user segment. Unfortunately, these vehicles still have some drawbacks, and one that remains topical is treated in this paper: the visibility distance issue in bad weather conditions, particularly in fog. How quickly and reliably a vehicle can detect objects, obstacles, pedestrians, or traffic signs, especially in poor visibility, determines how the vehicle will behave. In this paper, a new experimental set-up is presented for analyzing the effect of fog when laser and LIDAR (Light Detection And Ranging) radiation are used for visibility distance estimation on public roads. Using this set-up in the laboratory, the information offered by these measurement systems (laser and LIDAR) is evaluated and compared with the results offered by human observers under the same fog conditions. The goal is to validate and uniformly apply the visibility distance results, based on information arriving from the different systems able to estimate this parameter in foggy weather, and finally to notify drivers of unexpected situations. The solution combines stationary and moving systems: the stationary system would be installed on highways or express roads in areas prone to fog, while the moving systems are, or can be, installed directly on vehicles (autonomous as well as non-autonomous).
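The link between what a laser or LIDAR measures in fog (the atmospheric extinction coefficient) and the meteorological visibility distance reported to drivers is commonly given by Koschmieder's law. The sketch below is a generic illustration of that mapping, not the authors' exact processing chain; the 5% contrast threshold is the conventional assumption.

```python
import math

def visibility_from_extinction(beta, contrast_threshold=0.05):
    """Meteorological visibility distance (m) from Koschmieder's law.

    beta: atmospheric extinction coefficient in 1/m, e.g. estimated
    from the attenuation of a laser/LIDAR return in fog.
    With the conventional 5% contrast threshold this gives
    V = -ln(0.05) / beta, i.e. roughly 3.0 / beta.
    """
    if beta <= 0:
        raise ValueError("extinction coefficient must be positive")
    return -math.log(contrast_threshold) / beta

# Dense fog, beta = 0.06 1/m  ->  visibility of about 50 m
v = visibility_from_extinction(0.06)
```

In practice the extinction coefficient would itself be estimated from the decay of the backscattered laser/LIDAR signal; the formula only covers the final conversion step.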

Author(s):  
Faouzi Kamoun ◽  
Hazar Chaabani ◽  
Fatma Outay ◽  
Ansar-Ul-Haque Yasar

The immaturity of fog abatement technologies for highway usage has led to growing interest in developing intelligent transportation systems capable of estimating meteorological visibility distance under foggy weather conditions. This capability is crucial to support next-generation cooperative situational awareness and collision avoidance systems as well as onboard driver assistance systems. This chapter presents a survey and a comprehensive taxonomy of daytime visibility distance estimation approaches based on a review and synthesis of the literature. The proposed taxonomy is both comprehensive (i.e., it captures a wide spectrum of earlier contributions) and effective (i.e., it enables easy comparison among previously proposed approaches). The authors also highlight some open research issues that warrant further investigation.


Author(s):  
Ursula Kälin ◽  
Louis Staffa ◽  
David Eugen Grimm ◽  
Axel Wendt

To validate the accuracy and reliability of onboard sensors for object detection and localization in driver assistance, as well as in autonomous driving applications under realistic conditions (indoors and outdoors), a novel tracking system is presented. This tracking system is developed to determine the position and orientation of a slow-moving vehicle (e.g., a car during parking maneuvers), independently of the onboard sensors, during test maneuvers within a reference environment. One requirement is a 6 degree of freedom (DoF) pose with a position uncertainty below 5 mm (3σ) and an orientation uncertainty below 0.3° (3σ), at a frequency higher than 20 Hz and with a latency smaller than 500 ms. To compare the results from the reference system with the vehicle's onboard system, synchronization via the Precision Time Protocol (PTP) and interoperability with the Robot Operating System (ROS) are implemented. The developed system combines motion capture cameras, mounted in a 360° panorama-view set-up on the vehicle, with robotic total stations. A point cloud of the test site serves as a digital twin of the environment, in which the movement of the vehicle is simulated. Results have shown that the fused measurements of these sensors complement each other, so that the accuracy requirements for the 6 DoF pose can be met while allowing flexible installation in different environments.


2021 ◽  
Vol 11 (8) ◽  
pp. 3604
Author(s):  
Alessandro Severino ◽  
Salvatore Curto ◽  
Salvatore Barberi ◽  
Fabio Arena ◽  
Giovanni Pau

Autonomous driving is a technological innovation that involves the use of Artificial Intelligence (AI) in the automotive area, representing the future of transport; its applications will influence the concept of driving and many other features of modern society. Indeed, the introduction of Autonomous Vehicles (AVs) on the market, along with the development of related technologies, will have a potential impact not only on the automotive industry but also on urban transport systems. New mobility-related businesses will emerge, whereas existing ones will have to adapt to changes. Various aspects affect urban transport systems: in this work, we highlight how road markings, intersections, and pavement design upgrades play a key role in AV operation. This work aims to describe how contemporary society will be influenced by the spread of Autonomous Vehicles with regard to urban transport systems. A comprehensive analysis of the expected developments within urban transport systems is hereby presented, and some crucial issues concerning benefits and drawbacks are also discussed. From our studies, it emerges that the detection performed by vehicles is mainly affected by road marking characteristics, especially at intersections. Indeed, the need for a new cross-section type arises, since vehicle wandering phenomena will be reduced by AVs' position-keeping systems.


1984 ◽  
Vol 1 (19) ◽  
pp. 123 ◽  
Author(s):  
H. Derks ◽  
M.J.F. Stive

Field campaigns were conducted in 1981 and 1982/83 on the Dutch coast near Egmond. Measurements were made of surface elevations, water velocities and sediment concentrations in 3 to 8 surf zone locations and 2 to 5 offshore locations simultaneously. A total of 50 measurement series was obtained under a variety of weather conditions, resulting in offshore wave heights of 0.2 to 4.6 m. A description is given of the field set-up, the instruments and measurements, and the collected data. The quality of the various measurement systems and the data produced has been investigated extensively by intercomparison of instruments and devices in the field. The results are reported here.


PLoS ONE ◽  
2021 ◽  
Vol 16 (11) ◽  
pp. e0259438
Author(s):  
Sushank Chaudhary ◽  
Lunchakorn Wuttisittikulkij ◽  
Muhammad Saadi ◽  
Abhishek Sharma ◽  
Sattam Al Otaibi ◽  
...  

Autonomous vehicles are regarded as future transport mechanisms that drive without the need for a human driver. Photonic-based radar technology is a promising candidate for delivering attractive applications to autonomous vehicles, such as self-parking assistance, navigation, and recognition of the traffic environment. Microwave radars, by contrast, are not able to meet the demands of next-generation autonomous vehicles due to their limited bandwidth availability. Moreover, the performance of microwave radars is limited by atmospheric fluctuation, which causes severe attenuation at higher frequencies. In this work, we have developed a coherent frequency-modulated photonic radar to detect targets at longer distances. Furthermore, the performance of the proposed photonic radar is investigated under the impact of various atmospheric weather conditions, particularly fog and rain. The reported results show that the proposed photonic radar achieves a significant signal-to-noise ratio (SNR) and received power for echoes reflected from the target under the influence of bad weather conditions. Moreover, a conventional radar is designed with comparable parameters, such as frequency and sweep time, to establish the effectiveness of the proposed photonic radar.
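In a frequency-modulated (FMCW) radar of the kind described, target range follows directly from the measured beat frequency between the transmitted chirp and the received echo. The sketch below shows the generic relation R = c·f_b·T/(2B); the 150 MHz / 1 ms chirp parameters are chosen for illustration and are not taken from the paper.

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_range(beat_freq_hz, sweep_bandwidth_hz, sweep_time_s):
    """Target range (m) for a linear FMCW (sawtooth) chirp.

    The chirp slope is B/T; a target at range R delays the echo by
    2R/c, producing a beat frequency f_b = (B/T) * (2R/c), hence
    R = c * f_b * T / (2 * B).
    """
    return C * beat_freq_hz * sweep_time_s / (2.0 * sweep_bandwidth_hz)

# 150 kHz beat with a 150 MHz sweep over 1 ms -> roughly 150 m
r = fmcw_range(150e3, 150e6, 1e-3)
```

The same geometry applies whether the chirp is generated electronically or photonically; what changes in the photonic case is the achievable sweep bandwidth B, and hence the range resolution c/(2B).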


Frequenz ◽  
2019 ◽  
Vol 73 (11-12) ◽  
pp. 399-408
Author(s):  
Stefan Preussler ◽  
Fabian Schwartau ◽  
Joerg Schoebel ◽  
Thomas Schneider

Abstract Fully autonomous driving, even under bad weather conditions, requires the use of multiple sensor systems, including radar imaging. Microwave photonics, especially the optical generation and distribution of radar signals, can overcome many of the disadvantages of purely electronic approaches. This article gives an overview of several photonic components and how they could be incorporated into a photonically synchronized radar system in which all the complexity is shifted to a central station. A first proof-of-concept radar experiment with off-the-shelf telecommunication equipment shows an angular resolution of 1.1°. Furthermore, an overview of possible photonic-electronic integration is given, leading to compact, low-complexity transmitter and receiver chips.
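A reported angular resolution like 1.1° can be sanity-checked against the usual diffraction-limit estimate Δθ ≈ λ/D for an aperture or array of extent D. The sketch below is a generic back-of-the-envelope check; the 77 GHz carrier and ~20 cm aperture are assumptions for illustration, not values from the article.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def angular_resolution_deg(freq_hz, aperture_m):
    """Diffraction-limit estimate of radar angular resolution:
    delta_theta ~ lambda / D (radians), returned in degrees."""
    wavelength = C / freq_hz
    return math.degrees(wavelength / aperture_m)

# 77 GHz over a ~20 cm aperture -> on the order of 1.1 degrees
res = angular_resolution_deg(77e9, 0.20)
```

The estimate shows why distributed, photonically synchronized antennas are attractive: enlarging the effective aperture D is the direct lever on angular resolution.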


2016 ◽  
Vol 36 (1) ◽  
pp. 3-15 ◽  
Author(s):  
Will Maddern ◽  
Geoffrey Pascoe ◽  
Chris Linegar ◽  
Paul Newman

We present a challenging new dataset for autonomous driving: the Oxford RobotCar Dataset. Over the period of May 2014 to December 2015 we traversed a route through central Oxford twice a week on average using the Oxford RobotCar platform, an autonomous Nissan LEAF. This resulted in over 1000 km of recorded driving with almost 20 million images collected from 6 cameras mounted to the vehicle, along with LIDAR, GPS and INS ground truth. Data was collected in all weather conditions, including heavy rain, night, direct sunlight and snow. Road and building works over the period of a year significantly changed sections of the route from the beginning to the end of data collection. By frequently traversing the same route over the period of a year we enable research investigating long-term localization and mapping for autonomous vehicles in real-world, dynamic urban environments. The full dataset is available for download at: http://robotcar-dataset.robots.ox.ac.uk


Symmetry ◽  
2020 ◽  
Vol 12 (2) ◽  
pp. 324 ◽  
Author(s):  
G Ajay Kumar ◽  
Jin Hee Lee ◽  
Jongrak Hwang ◽  
Jaehyeong Park ◽  
Sung Hoon Youn ◽  
...  

The real-time fusion of light detection and ranging (LiDAR) and camera data is known to be a crucial process in many applications, such as autonomous driving, industrial automation, and robotics. Especially in the case of autonomous vehicles, the efficient fusion of data from these two sensor types is important for estimating the depth of objects as well as for detecting objects at short and long distances. As both sensors capture different attributes of the environment simultaneously, integrating those attributes with an efficient fusion approach greatly benefits reliable and consistent perception of the environment. This paper presents a method to estimate the distance (depth) between a self-driving car and other vehicles, objects, and signboards on its path using an accurate fusion approach. Based on geometrical transformation and projection, low-level sensor fusion was performed between a camera and LiDAR using a 3D marker. Further, the fusion information is utilized to estimate the distance of objects detected by the RefineDet detector. Finally, the accuracy and performance of the sensor fusion and distance estimation approach were evaluated quantitatively and qualitatively, considering real-road and simulation scenarios. The proposed low-level sensor fusion, based on computational geometric transformation and projection for object distance estimation, thus proves to be a promising solution for enabling reliable and consistent environment perception for autonomous vehicles.
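The low-level geometric fusion step described above, i.e. rigid extrinsics (such as a 3D-marker calibration would yield) followed by perspective projection through the camera intrinsics, can be sketched as follows. Function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def project_lidar_to_image(points_lidar, T_cam_lidar, K):
    """Project N x 3 LiDAR points into camera pixel coordinates.

    points_lidar: (N, 3) XYZ points in the LiDAR frame.
    T_cam_lidar:  (4, 4) rigid transform from LiDAR to camera frame
                  (the extrinsics obtained from calibration).
    K:            (3, 3) camera intrinsic matrix.
    Returns (N, 2) pixel coordinates and (N,) depths; points behind
    the image plane are marked NaN.
    """
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]   # LiDAR frame -> camera frame
    depth = pts_cam[:, 2]                        # depth along the optical axis
    uvw = (K @ pts_cam.T).T                      # perspective projection
    with np.errstate(divide="ignore", invalid="ignore"):
        uv = uvw[:, :2] / uvw[:, 2:3]            # normalize by depth
    uv[depth <= 0] = np.nan                      # behind the camera
    return uv, depth
```

Once each LiDAR point has a pixel location, the depth of a detected object can be read off as a statistic (e.g. the median) of the depths of the points that fall inside its bounding box.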


2020 ◽  
Vol 2020 (1) ◽  
pp. 78-81
Author(s):  
Simone Zini ◽  
Simone Bianco ◽  
Raimondo Schettini

Rain removal from pictures taken under bad weather conditions is a challenging task that aims to improve the overall quality and visibility of a scene. The enhanced images usually constitute the input for subsequent Computer Vision tasks such as detection and classification. In this paper, we present a Convolutional Neural Network, based on the Pix2Pix model, for removing rain streaks from images, with specific interest in evaluating the results of the processing with respect to the Optical Character Recognition (OCR) task. In particular, we present a way to generate a rainy version of the Street View Text Dataset (R-SVTD) for "text detection and recognition" evaluation in bad weather conditions. Experimental results on this dataset show that our model is able to outperform the state of the art in terms of two commonly used image quality metrics, and that it is capable of improving the performance of an OCR model in detecting and recognising text in the wild.


Author(s):  
Jiayuan Dong ◽  
Emily Lawson ◽  
Jack Olsen ◽  
Myounghoon Jeon

Driving agents can provide an effective solution to improve drivers' trust in, and to manage interactions with, autonomous vehicles. Research has focused on voice agents, while few studies have explored robot agents or a comparison between the two. The present study tested two variables, voice gender and agent embodiment, using conversational scripts. Twenty participants experienced autonomous driving in a simulator under four agent conditions and filled out subjective questionnaires on their perception of each agent. Results showed that participants perceived the voice-only female agent as more likeable, more comfortable, and more competent than the other conditions; their final preference ranking also favored this agent over the others. Interestingly, eye-tracking data showed that the embodied agents did not add more visual distraction than the voice-only agents. The results are discussed in relation to traditional gender stereotypes, the uncanny valley, and participants' gender. This study can contribute to the design of in-vehicle agents for autonomous vehicles, and future studies are planned to further identify the underlying mechanisms of user perception of different agents.

