Vision-based Visibility Estimation: From Fog Detection to Complete Visual Range

2021 ◽  
Author(s):  
Harald Ganster ◽  
Jürgen Lang

<p>In air traffic management (ATM) and the monitoring of critical infrastructure, an exact description of the near-surface atmospheric state, and thus of the visibility, is an indispensable basis for situation awareness and any further weather forecast.</p><p>To overcome the drawbacks of the currently subjective reports from human observers, we present an innovative solution that automatically derives visibility measures from standard cameras using a vision-based approach.</p><p>The certified state of the art for automated visibility measurement is represented by visibility sensors, such as those used for RVR (Runway Visual Range) measurements. These sensors only allow a very local measurement, whereas camera-based methods enable a representative measurement of visibility in the entire environment of the camera location. A variety of camera-based approaches use physically based models to derive a measure of visibility (e.g. the Koschmieder model, contrast measurements, or models of light attenuation). The Dutch weather service (KNMI) uses visibility detectors and methods similar to those in our system, called “visIvis®” (e.g. feature-based methods or de-hazing methods). In addition to being restricted to a single specific method, these approaches often impose special requirements (e.g. the measurement object or landmark must lie on a straight line with two cameras), which complicates their use for a representative measurement of the entire scene.</p><p>It will be shown how the visIvis® system automatically detects the most suitable areas for visibility estimation within the camera-covered range based on a variety of detection algorithms, automatically tunes its detection parameters, and automatically delineates fog-covered areas. 
Furthermore, by coupling visIvis® with georeferenced data, a pixel-precise depth map is deduced from digital surface and terrain models, and user-oriented visibility classes can be defined (customized or according to meteorologically relevant thresholds). Based on this mapping, visIvis® is able to derive representative visibility measures over the complete visual range, which can be reported in customized or standard formats (e.g. METAR).</p><p>The presentation will give insight into a recent visibility measurement study for synoptic meteorological applications, conducted in cooperation with Deutscher Wetterdienst (DWD), the German National Meteorological Service. Special focus was placed on night scenarios, which pose challenges for a camera-based measurement system, e.g. the light sensitivity of the sensor or the availability of representative landmarks. In addition, we will show how to generate added value by extending the concept of vision-based visibility measurement to other weather-related parameters. The present study investigated which steps are required, following transfer-learning principles, to adapt the system to other camera-based observations. Results will be presented from evaluations in different challenging application scenarios.</p>
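The Koschmieder model cited in the abstract relates meteorological visibility to the atmospheric extinction coefficient. As a minimal illustrative sketch (the contrast threshold and the sample extinction value are generic assumptions, not parameters of the visIvis® system):

```python
import math

def koschmieder_visibility(extinction_coeff, contrast_threshold=0.05):
    """Meteorological optical range V = -ln(eps) / beta (Koschmieder relation).

    extinction_coeff:   atmospheric extinction coefficient beta, in 1/m
    contrast_threshold: eps, the minimum perceivable contrast; the WMO
                        convention uses 0.05, while the classic value 0.02
                        yields the well-known V = 3.912 / beta.
    """
    return -math.log(contrast_threshold) / extinction_coeff

# Example: beta = 0.003 m^-1 corresponds to roughly 1 km visibility
v = koschmieder_visibility(0.003)  # ~998.6 m
```

Camera-based methods estimate beta indirectly, e.g. from the contrast decay of landmarks at known distances, which is why a pixel-precise depth map is so valuable.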

2021 ◽  
Vol 8 ◽  
Author(s):  
Marta de Alfonso ◽  
Jue Lin-Ye ◽  
José M. García-Valdecasas ◽  
Susana Pérez-Rubio ◽  
M. Yolanda Luna ◽  
...  

Storm Gloria, generated on January 17th, 2020 in the Eastern North Atlantic, crossed the Iberian Peninsula and impacted the Western Mediterranean during the following days. The event caused significant damage to the coast and infrastructure of the Catalan-Balearic Sea, due to extraordinary wind and wave fields concomitant with anomalously intense rain and ocean currents. Puertos del Estado (the Spanish state ports agency) has developed and operates a complex monitoring and forecasting system (the PORTUS System) in collaboration with the Spanish Met Office (AEMET). The present work shows how Gloria was correctly forecast by this system, how alerts were properly issued (with special focus on the ports), and how the buoys were able to monitor the sea state during the event, measuring several new records of significant wave height and exceptionally high mean wave periods. The paper describes in detail the dynamic evolution of the atmospheric conditions and of the sea state during the storm, by means of a study of both in situ and modeled PORTUS data in combination with the AEMET weather forecast system results. The analysis also serves to place this storm in a historical context, showing the exceptional nature of the event, and to identify the specific reasons why its impact was particularly severe. The work also demonstrates the relevance of the PORTUS System for warning the main Spanish ports in advance, preventing accidents that could result in fatal casualties. To do so, the wave forecast warning performance is analyzed, with special focus on the skill score for the different horizons. Furthermore, it is demonstrated how a storm of this nature creates the need for changes to the extreme wave analysis for the area, which impacts all sorts of design activities at the coastline. The paper studies both how this storm fits into existing extreme analyses and how these should be modified in light of this particular event. 
This work is the first of a series of papers to be published on this event; the others analyze in detail further aspects, including the evolution of sea level and a description of the coastal damage.
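Warning-performance verification of the kind described above is typically based on a 2x2 contingency table of forecast versus observed threshold exceedances. A hedged sketch of one common summary measure (the choice of score and all counts are illustrative, not taken from the PORTUS verification):

```python
def peirce_skill_score(hits, misses, false_alarms, correct_negatives):
    """Peirce skill score (PSS) = hit rate - false alarm rate.

    Computed from a 2x2 contingency table of warning events:
    hits / misses for observed exceedances, false alarms /
    correct negatives for observed non-exceedances.
    """
    hit_rate = hits / (hits + misses)
    false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
    return hit_rate - false_alarm_rate

# Example: 18 of 20 wave-height exceedances warned, 5 false alarms in 100 calm cases
pss = peirce_skill_score(hits=18, misses=2, false_alarms=5, correct_negatives=95)
# 0.9 - 0.05 = 0.85
```

Evaluating such a score separately for each forecast horizon shows how warning skill degrades with lead time.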


2020 ◽  
Author(s):  
Paolo Ruggieri ◽  
Stefano Materia ◽  
Angel G. Muñoz ◽  
M.Carmen Alvarez Castro ◽  
Simon J. Mason ◽  
...  

<p>Producing probabilistic subseasonal forecasts of extreme events up to six weeks in advance is crucial for many economic sectors. In agribusiness, this time scale is particularly critical because it allows mitigation strategies to be adopted to counteract weather hazards and take advantage of opportunities.<br>For example, spring frosts are detrimental to many nut trees, resulting in dramatic losses at harvest time. To explore subseasonal forecast quality in boreal spring, identified as one of the most sensitive times of the year by agribusiness end-users, we build a multi-system ensemble using four models involved in the Subseasonal-to-Seasonal (S2S) Prediction Project. Two-meter temperature forecasts are used to analyze cold spell predictions in the coastal Black Sea region, an area that is a global leader in the production of hazelnuts. When analyzed at the global scale, the multi-system ensemble probabilistic forecasts for near-surface temperature are better than climatological values for several regions, especially the Tropics, even many weeks in advance; in the coastal Black Sea region, however, skill is low after the second forecast week. When cold spells are predicted instead of near-surface temperatures, skill improves for the region, and the forecasts prove to contain potentially useful information for stakeholders willing to put mitigation plans into effect. Using a cost-loss model approach for the first time in this context, we show that there is added value in having such a forecast system instead of a business-as-usual strategy, not only for predictions released one to two weeks ahead of the extreme event, but also at longer lead times.</p>
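The cost-loss model mentioned in the abstract prescribes protective action whenever the forecast probability of the adverse event exceeds the cost/loss ratio C/L. A minimal sketch of this classic decision rule (all numbers are illustrative, not values from the study):

```python
def expected_expense(p_event, cost, loss, protect):
    """Expected expense per decision: pay `cost` if protecting,
    otherwise incur `loss` with probability `p_event`."""
    return cost if protect else p_event * loss

def should_protect(p_event, cost, loss):
    """Optimal cost-loss decision: protect iff p_event > C/L."""
    return p_event > cost / loss

# Example: frost protection costs 10 per hectare, a frost destroys crop worth 100;
# protecting pays off whenever the cold-spell probability exceeds C/L = 0.1
assert should_protect(0.25, cost=10, loss=100)      # protect
assert not should_protect(0.05, cost=10, loss=100)  # do not protect
```

The economic value of a forecast system is then measured by how much it reduces the expected expense relative to a business-as-usual strategy (always protect, or never protect).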


2020 ◽  
Vol 3 (2) ◽  
pp. 191-201
Author(s):  
Martine G. de Vos ◽  
Wilco Hazeleger ◽  
Driss Bari ◽  
Jörg Behrens ◽  
Sofiane Bendoukha ◽  
...  

Abstract. The need for open science has been recognized by the communities of meteorology and climate science. While these domains are mature in terms of applying digital technologies, the implementation of open science methodologies is less advanced. In a session on “Weather and Climate Science in the Digital Era” at the 14th IEEE International eScience Conference domain specialists and data and computer scientists discussed the road towards open weather and climate science. Roughly 80 % of the studies presented in the conference session showed the added value of open data and software. These studies included open datasets from disparate sources in their analyses or developed tools and approaches that were made openly available to the research community. Furthermore, shared software is a prerequisite for the studies which presented systems like a model coupling framework or digital collaboration platform. Although these studies showed that sharing code and data is important, the consensus among the participants was that this is not sufficient to achieve open weather and climate science and that there are important issues to address. At the level of technology, the application of the findable, accessible, interoperable, and reusable (FAIR) principles to many datasets used in weather and climate science remains a challenge. This may be due to scalability (in the case of high-resolution climate model data, for example), legal barriers such as those encountered in using weather forecast data, or issues with heterogeneity (for example, when trying to make use of citizen data). In addition, the complexity of current software platforms often limits collaboration between researchers and the optimal use of open science tools and methods. The main challenges we observed, however, were non-technical and impact the practice of science as a whole. There is a need for new roles and responsibilities in the scientific process. 
People working at the interface of science and digital technology – e.g., data stewards and research software engineers – should collaborate with domain researchers to ensure the optimal use of open science tools and methods. In order to remove legal boundaries on sharing data, non-academic parties such as meteorological institutes should be allowed to act as trusted agents. Besides the creation of these new roles, novel policies regarding open weather and climate science should be developed in an inclusive way in order to engage all stakeholders. Although there is an ongoing debate on open science in the community, the individual aspects are usually discussed in isolation. Our approach in this paper takes the discourse further by focusing on “open science in weather and climate research” as a whole. We consider all aspects of open science and discuss the challenges and opportunities of recent open science developments in data, software, and hardware. We have compiled these into a list of concrete recommendations that could bring us closer to open weather and climate science. We acknowledge that the development of open weather and climate science requires effort to change, but the benefits are large. We have observed these benefits directly in the studies presented in the conference and believe that it leads to much faster progress in understanding our complex world.


2020 ◽  
Author(s):  
Jutta Thielen-del Pozo ◽  
Lise Autogena ◽  
Joshua Portway ◽  
Florian Pappenberger

<p>The European Union funds research through so-called framework programmes (FPs), its financial and strategic tools to stimulate excellence, innovation, economic growth and job creation across Europe. The allocated research budgets have increased considerably, from less than 4 billion Euro for FP1 (4 years) to 100 billion for Horizon Europe (FP9, 7 years), demonstrating the strategic importance attributed to research and development for a strong and competitive Europe. The upcoming framework programme Horizon Europe will add a new level of ambition for the scientific, economic and societal impact of EU funding and address global challenges that affect the quality of our daily lives.</p><p>However, if societal issues that affect our everyday lives are to be addressed effectively in research and to drive the necessary innovation process towards a better future, then the third component at the science-policy interface must be “society”. Robust data, facts and evidence represent an important input to policy making, in addition to other inputs and considerations. Scientists and policy makers must therefore not only network among their communities and experts but also interact with the public and engage in dialogue with citizens, first to understand what the concerns and issues are and later to explain the solutions.</p><p>The Joint Research Centre has engaged in an Art, Science and Society programme to fill this gap. Artists are invited to the JRC to co-develop projects with the scientists under a specific theme: in 2015 the topic was “Food”, in 2017 “Fairness” and in 2019 “Big Data, Digital Transformation and Artificial Intelligence”. 
The final works are exhibited during the so-called Resonances Festival.</p><p>This presentation illustrates the added value of this approach using the example of the Resonances III installation “Weather Prediction by Numerical Process - a forecast for Europe” by the artists Lise Autogena and Joshua Portway, created in collaboration with the co-authors. The installation is a performance inspired by the work of L.F. Richardson (1881–1953), a truly multi-disciplinary scientist who contributed to finite-difference solutions of partial differential equations, turbulent flow and diffusion, fractals, and the causes and evolution of conflicts. He was particularly visionary in his work on designing a numerical scheme for weather forecasting. While serving as an ambulance driver during WWI, he performed the calculation for a weather forecast for Europe “by hand”. Even though his years of calculation produced a wrong forecast, because the numerical solution was not stable, the methodology for numerical weather forecasting was born, and today’s weather forecasts follow largely the same method, just with vastly more computing power. Richardson estimated that 64,000 people, working together in one big orchestrated calculation, would be needed to compute the weather in real time.</p><p>The chosen format for the art installation is a performance, ritualistically re-enacting a small part of this epic calculation and drawing the audience into a multi-faceted discussion on the relevance of Richardson’s legacy today, in the times of supercomputing and climate change.</p>


Bosnia and Herzegovina (B&H) is a relatively small and poor country that faces numerous issues, such as the consequences of war, poverty, the emigration of qualified people and, especially, useless and barren political conflicts. The country is in a very difficult economic situation: it is enough to say that in 2015, B&H had a per capita GDP of EUR 3,200, while Greece, which is in the focus of Europe and the world because of its economic crisis, had a per capita GDP of EUR 16,000. B&H was also ranked low in the Global Competitiveness Report 2015-2016, at 111th out of a total of 140 countries. At this moment, B&H is mainly a loser in the process of globalization, with an excess labor force that is fighting for survival. Data on the structure of exports confirm that the inclusion of B&H in the international division of labor is based on the extraction of limited natural resources and on production based on cheap labor. This paper analyzes the most important elements for the development of the economy in B&H: the private sector, scientific and technological institutions (universities, faculties, institutes, etc.), and educational and government institutions for economic development. The challenge ahead of B&H in the next 5-10 years is to build the conditions for a transition from the current economic model, characterized by the use of natural resources and low-educated labor, to one driven by the new engines of development and export competitiveness: new technologies and knowledge. The special focus is on the change from an environment in which the majority of the population lacks the skills and knowledge to create competitive products and services for domestic, regional, European and global markets to one in which most people possess them. 
Basically, the authors analyze the possibilities of a transition from the present-day economic model, characterized by the use of a semi-skilled labour force and the manufacture of products with low added value, to a knowledge-based development model. In simple words: from ignorance to knowledge.


2017 ◽  
Author(s):  
Christian R. Steger ◽  
Carleen H. Reijmer ◽  
Michiel R. van den Broeke

Abstract. Recent studies indicate that the surface mass balance will dominate the Greenland Ice Sheet's (GrIS) contribution to 21st century sea level rise. Consequently, it is crucial to understand the liquid water balance (LWB) of the ice sheet and its response to increasing surface melt. We therefore analyse a firn simulation conducted with SNOWPACK for the GrIS over the period 1960–2014, with a special focus on the LWB and refreezing. An indirect evaluation of the simulated refreezing climate with GRACE and firn temperature observations indicates good model performance. Results of the LWB analysis reveal a spatially uniform increase in surface melt during 1990–2014. In response, refreezing and runoff also show positive trends for this period, with refreezing increasing at only half the rate of runoff, which implies that the majority of the additional liquid input runs off the ice sheet. This pattern is spatially variable, however: in the southeastern part of the GrIS, for example, most of the additional liquid input is buffered in the firn layer due to relatively high snowfall rates. The increase in modelled refreezing leads to a general decrease in firn air content and to a substantial increase in near-surface firn temperature in some regions. On the western side of the ice sheet, modelled firn temperature increases are highest in the lower accumulation zone and are primarily caused by the exceptional melt season of 2012. On the eastern side, simulated firn temperatures increase more gradually, with an associated upward migration of firn aquifers.
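The liquid water balance discussed above partitions the liquid input (melt plus rain) into refreezing, retention, and runoff. A toy bucket sketch of this partitioning (the two capacity terms are simplified stand-ins for the firn cold content and pore space; this is not the SNOWPACK scheme):

```python
def firn_liquid_balance(liquid_input, refreeze_capacity, pore_capacity):
    """Split liquid input (mm w.e.) into refrozen, retained and runoff parts.

    refreeze_capacity: water the firn cold content can refreeze (mm w.e.)
    pore_capacity:     irreducible water the pore space can retain (mm w.e.)
    Liquid first refreezes, the remainder is retained up to the pore
    capacity, and any surplus runs off.
    """
    refrozen = min(liquid_input, refreeze_capacity)
    remaining = liquid_input - refrozen
    retained = min(remaining, pore_capacity)
    runoff = remaining - retained
    return refrozen, retained, runoff

# Example: 120 mm of melt; the firn can refreeze 50 mm and retain 30 mm
r, w, q = firn_liquid_balance(120.0, 50.0, 30.0)  # (50.0, 30.0, 40.0)
```

In this simplified picture, the southeastern GrIS corresponds to a regime of large capacities (high snowfall keeps replenishing cold, porous firn), so most of the extra input is buffered rather than running off.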


Author(s):  
Eszter Katics

This study investigates the presence of European identity, with a particular focus on youth in EU member and candidate states. It introduces the most important theoretical works and some of the recent empirical works on the subject, and offers a statistical analysis based on data from the Eurobarometer surveys between 2011 and 2019. This period covers the migration crisis and the end of the financial and economic crisis, which adds value to the research. The empirical findings touch upon the relationship between national identity and European identity in the countries in question, with a special focus on EU citizenship.


Processes ◽  
2020 ◽  
Vol 8 (3) ◽  
pp. 366 ◽  
Author(s):  
Alberto Mannu ◽  
Sebastiano Garroni ◽  
Jesus Ibanez Porras ◽  
Andrea Mele

Recently, interest in converting waste cooking oils (WCOs) into raw materials has grown exponentially. The driving force behind this trend is mainly the increasing number of WCO applications, combined with the definition, in many countries, of new regulations on waste management. From an industrial perspective, the simple chemical composition of WCOs makes them suitable as valuable chemical building blocks in fuel, material, and lubricant production. The sustainability of such applications is strictly related to proper recycling procedures. In this context, the development of new recycling processes, as well as the optimization of existing ones, represents a priority for applied chemistry, chemical engineering, and materials science. With the aim of providing useful updates to the scientific community involved in vegetable oil processing, the currently available technologies for WCO recycling are herein reported, described, and discussed. In detail, two main types of WCO treatment will be considered: chemical transformations, which exploit the functional groups present in the waste for the synthesis of added-value products, and physical treatments such as extraction, filtration, and distillation procedures. The first part, regarding chemical synthesis, will be connected mostly to the production of fuels. The second part, concerning physical treatments, will focus on bio-lubricant production. Moreover, in the description of filtering procedures, special focus will be given to the development and applicability of new materials and technologies for WCO treatment.


Energies ◽  
2020 ◽  
Vol 13 (21) ◽  
pp. 5750
Author(s):  
Cristina Moliner ◽  
Dario Bove ◽  
Elisabetta Arato

Agricultural activities produce an estimated 32.7 Mtoe/year of residues in EU countries. These are mostly disposed of in landfills, incinerated without any control, or abandoned in fields, causing severe impacts on human health and the environment. Rice is one of the most consumed crops worldwide, with an annual production of 782 million tons according to the Food and Agriculture Organization of the United Nations database. In this context, the EU-funded project LIFE LIBERNITRATE promotes the use of renewable residual sources (i.e., rice straw) to obtain new materials with added value. The methodology is based on the incineration of rice straw in a custom-designed and constructed valorization system. Rice straw/wood pellets are burned under optimized conditions to produce a maximized quantity of ash with high silica content. These materials will then be used to treat water polluted with nitrates, representing an optimal example of a circular economy strategy. In this work, the custom-designed valorization unit is described, with special focus on its main constituent elements. A theoretical study of the co-incineration of rice straw and wood pellets identified the optimized combustion conditions. Experimental tests using the theoretical inputs confirmed the most adequate operational conditions (10 g rice straw pellets/min + 10 g wood pellets/min, 6–7 Nm3/h of air, T = 500 °C) and helped define improvements to the experimental plant.


2020 ◽  
Vol 35 (1) ◽  
pp. 51-66 ◽  
Author(s):  
L. Cucurull ◽  
M. J. Mueller

Abstract Observing system simulation experiments (OSSEs) were conducted to evaluate the potential impact of the six Global Navigation Satellite System (GNSS) radio occultation (RO) receiver satellites in equatorial orbit from the initially proposed Constellation Observing System for Meteorology, Ionosphere, and Climate-2 (COSMIC-2) mission, known as COSMIC-2A. Furthermore, the added value of the high-inclination component of the proposed mission was investigated by considering a few alternative architecture designs, including the originally proposed polar constellation of six satellites (COSMIC-2B), a constellation with a reduced number of RO receiving satellites, and a constellation of six satellites but with fewer observations in the lower troposphere. The 2015 version of the operational three-dimensional ensemble–variational data assimilation system of the National Centers for Environmental Prediction (NCEP) was used to run the OSSEs. Observations were simulated and assimilated using the same methodology, and their errors were assumed to be uncorrelated. The largest benefit from the assimilation of COSMIC-2A, with its denser equatorial coverage, was an improvement in tropical winds; its impact was found to be overall neutral in the extratropics. When soundings from the high-inclination orbit were assimilated in addition to COSMIC-2A, positive benefits were found globally, confirming that a high-inclination constellation of RO receiving satellites is necessary to improve weather forecast skill globally. The largest impact of reducing COSMIC-2B from six to four satellites was a slight degradation of weather forecast skill in the Northern Hemisphere extratropics. The impact of degrading COSMIC-2B to the COSMIC level of accuracy, in terms of penetration into the lower troposphere, was mostly neutral.

