Snow Cover Evolution in the Gran Paradiso National Park, Italian Alps, Using the Earth Observation Data Cube

Data ◽  
2019 ◽  
Vol 4 (4) ◽  
pp. 138 ◽  
Author(s):  
Charlotte Poussin ◽  
Yaniss Guigoz ◽  
Elisa Palazzi ◽  
Silvia Terzago ◽  
Bruno Chatenoux ◽  
...  

Mountainous regions are particularly vulnerable to climate change, and the impacts are already extensive and observable, with implications that extend far beyond mountain boundaries and the environmental sector. Monitoring and understanding climate and environmental changes in mountain regions is therefore needed. One of the key variables to study is snow cover, since it represents an essential driver of many ecological, hydrological and socioeconomic processes in mountains. As remotely sensed data can contribute to filling the gap of sparse in-situ stations in high-altitude environments, a methodology for snow cover detection through time series analyses using Landsat satellite observations stored in an Open Data Cube is described in this paper and applied to a case study on the Gran Paradiso National Park, in the western Italian Alps. In particular, this study presents a proof of concept of the preliminary version of the snow observation from space algorithm applied to Landsat data stored in the Swiss Data Cube. Implemented in an Earth Observation Data Cube environment, the algorithm can process large amounts of analysis-ready remote sensing data and can compile all Landsat series since 1984 into one single multi-sensor dataset. The temporal filtering methodology and multi-sensor analysis considerably reduce the uncertainty in the estimation of snow cover area using high-resolution sensors. The study highlights that, despite this methodology, the lack of available cloud-free images still represents a major limitation for snow cover mapping from satellite data. Although accurate mapping of snow extent below cloud cover with optical sensors remains a challenge, spatial and temporal filtering techniques and radar imagery should help reduce the cloud cover problem in future time series analyses.
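The snow detection described above rests on the normalized-difference snow index (NDSI), which exploits the fact that snow is bright in the green band and dark in the shortwave infrared. A minimal, hypothetical sketch of such a per-pixel test (the 0.4 threshold and the reflectance values are illustrative assumptions, not the paper's calibrated parameters):

```python
# Hedged sketch of NDSI-based snow/no-snow classification on per-pixel
# reflectances. Threshold and band values are illustrative assumptions,
# not the study's calibrated parameters.

def ndsi(green: float, swir: float) -> float:
    """Normalized-Difference Snow Index: (green - SWIR) / (green + SWIR)."""
    denom = green + swir
    if denom == 0:
        return 0.0
    return (green - swir) / denom

def is_snow(green: float, swir: float, threshold: float = 0.4) -> bool:
    """Classify a pixel as snow when its NDSI exceeds the threshold.
    Snow is bright in green and dark in SWIR, so its NDSI is high."""
    return ndsi(green, swir) > threshold

print(is_snow(0.7, 0.1))   # bright green, dark SWIR: snow-like
print(is_snow(0.3, 0.25))  # similar green and SWIR: bare ground-like
```

A full pipeline would apply this test to every pixel of every cloud-free acquisition in the data cube and then filter the resulting time series.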

2021 ◽  
Vol 13 (10) ◽  
pp. 1957
Author(s):  
Chiara Richiardi ◽  
Palma Blonda ◽  
Fabio Michele Rana ◽  
Mattia Santoro ◽  
Cristina Tarantino ◽  
...  

Snow cover plays an important role in biotic and abiotic environmental processes, as well as human activities, on both regional and global scales. Due to the difficulty of in situ data collection in vast and inaccessible areas, the use of optical satellite imagery represents a useful support for snow cover mapping. At present, several operational snow cover algorithms and products are available. Even though most of them offer an up-to-daily time scale, they do not provide sufficient spatial resolution for studies requiring high spatial detail. By contrast, the Let-It-Snow (LIS) algorithm can produce high-resolution snow cover maps, based on the use of both the normalized-difference snow index (NDSI) and a digital elevation model. The latter is introduced to define a threshold on altitude, below which the presence of snow is excluded. In this study, we revised the LIS algorithm by introducing a new parameter, based on a threshold in the shortwave infrared (SWIR) band, and by modifying the overall algorithm workflow so that the cloud mask selection can be used as an input. The revised algorithm has been applied to a case study in Gran Paradiso National Park. Unlike previous studies, we also compared the performance of both the original and the modified algorithms in the presence of cloud cover, in order to evaluate their effectiveness in discriminating between snow and clouds. Ground data collected by meteorological stations equipped with both snow gauges and solarimeters were used for validation purposes. The changes introduced in the revised algorithm improved the overall classification accuracy obtained by the original LIS algorithm (from 80.88% up to 89.17%). The producer’s and user’s accuracy values obtained by the modified algorithm (89.12% and 95.03%, respectively) were larger than those obtained by the original algorithm (76.68% and 93.67%, respectively), thus providing a more accurate snow cover map.
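The decision logic described above (elevation exclusion, NDSI test, and the added SWIR check that helps separate snow from bright clouds) can be sketched as follows. All threshold values here are placeholder assumptions for illustration, not the calibrated values from the study:

```python
# Illustrative sketch of the revised decision logic: a pixel is snow
# only if (1) it lies above a minimum elevation (the DEM rule),
# (2) its NDSI exceeds a threshold, and (3) its SWIR reflectance is
# below a threshold that helps separate snow from bright clouds.
# All thresholds are placeholder assumptions.

def classify_pixel(green: float, swir: float, elevation_m: float,
                   min_elevation_m: float = 1000.0,
                   ndsi_threshold: float = 0.4,
                   swir_max: float = 0.2) -> str:
    if elevation_m < min_elevation_m:
        return "no_snow"          # DEM rule: snow excluded below this altitude
    denom = green + swir
    ndsi = (green - swir) / denom if denom else 0.0
    if ndsi > ndsi_threshold and swir < swir_max:
        return "snow"             # high NDSI and dark SWIR: snow
    if ndsi > ndsi_threshold:
        return "cloud_suspect"    # high NDSI but bright SWIR: likely cloud
    return "no_snow"

print(classify_pixel(0.7, 0.1, elevation_m=2500.0))  # snow
print(classify_pixel(0.7, 0.1, elevation_m=500.0))   # no_snow (too low)
print(classify_pixel(0.8, 0.3, elevation_m=2500.0))  # cloud_suspect
```

The added SWIR condition is what distinguishes this sketch from a plain NDSI test: clouds can produce a high NDSI, but they remain relatively bright in the SWIR band, whereas snow does not.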


2007 ◽  
Vol 158 (11) ◽  
pp. 349-352
Author(s):  
Grégory Amos ◽  
Ambroise Marchand ◽  
Anja Schneiter ◽  
Annina Sorg

The last Alpine ibex (Capra ibex ibex) in the Alps survived the nineteenth century in the Aosta Valley thanks to the royal hunting reserve (today the Gran Paradiso National Park). Ibex from this reserve were successfully reintroduced in Switzerland after its own population had disappeared. Switzerland currently hosts 13,200 ibex, of which 1,000 are hunted every year to prevent large population fluctuations, overaging and pasture damage. In the Gran Paradiso National Park, by contrast, the population has regulated itself naturally for over eighty years. Its large fluctuations (2,600–5,000 animals) are most likely due to climate, snow depth, population density and the interactions of these factors. The long-term surveys in the Gran Paradiso National Park and the investigations of the carrying capacity of this area provide a valuable example for the optimal management of ibex in Switzerland.


1999 ◽  
Vol 23 (2) ◽  
pp. 205-227 ◽  
Author(s):  
R. I. Ferguson

Models that predict meltwater runoff at a daily timescale are important in water resource management, flood hazard assessment and climate-change impact studies. This article identifies four basic components of such models: meteorological extrapolation, snowmelt estimation at a point, snow-cover depletion and runoff routing. Alternative ways of handling these are discussed, with emphasis on the contrasting treatments in two widely used models: HBV and SRM. Many of the issues in meltwater modelling reflect wider debates in hydrological and environmental modelling, including problems of complexity vs. simplicity, the appropriate level of spatial disaggregation, parameter identification and calibration, and internal validation. In reviewing current trends, emphasis is placed on the potential and limitations of fully distributed models, problems in using energy-balance rather than temperature-index melt models at basin scale, ways to deal with spatial variability in snow cover, and the value and limitations of earth observation data.
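In its simplest form, the temperature-index approach contrasted above with energy-balance models reduces to a degree-day formula: melt is proportional to the excess of air temperature over a critical threshold. A hedged sketch (the degree-day factor and temperatures are illustrative values, not parameters from HBV or SRM):

```python
# Minimal temperature-index (degree-day) snowmelt sketch, the approach
# used by models such as HBV and SRM: M = DDF * max(T - T_crit, 0),
# where DDF is a degree-day factor in mm w.e. per degC per day.
# The DDF value here is purely illustrative.

def daily_melt(temp_c: float, ddf: float = 4.0, t_crit: float = 0.0) -> float:
    """Daily melt depth (mm water equivalent) from mean air temperature."""
    return ddf * max(temp_c - t_crit, 0.0)

def season_melt(daily_temps, ddf: float = 4.0) -> float:
    """Accumulate melt over a series of daily mean temperatures."""
    return sum(daily_melt(t, ddf) for t in daily_temps)

# Five days of mean temperatures: only days above 0 degC contribute.
print(season_melt([-3.0, 1.0, 2.5, 0.0, 4.0]))  # 4 * (1.0 + 2.5 + 4.0) = 30.0
```

The appeal of this formulation, as the article notes, is that it needs only air temperature as forcing; the trade-off is that the degree-day factor lumps together all the energy-balance terms and must be calibrated per basin.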


2021 ◽  
Author(s):  
Edzer Pebesma ◽  
Patrick Griffiths ◽  
Christian Briese ◽  
Alexander Jacob ◽  
Anze Skerlevaj ◽  
...  

<p>The openEO API allows the analysis of large amounts of Earth Observation data using a high-level abstraction of data and processes. Rather than focusing on the management of virtual machines and millions of imagery files, it allows users to create jobs that take a spatio-temporal section of an image collection (such as Sentinel-2 L2A) and treat it as a data cube. Processes iterate or aggregate over pixels, spatial areas, spectral bands, or time series, while working at arbitrary spatial resolution. This pattern, pioneered by Google Earth Engine™ (GEE), lets the user focus on the science rather than on data management.</p><p>The openEO H2020 project (2017–2020) developed the API as well as an ecosystem of software around it, including clients (JavaScript, Python, R, QGIS, browser-based), back-ends that translate API calls into existing image analysis or GIS software or services (for Sentinel Hub, WCPS, Open Data Cube, GRASS GIS, GeoTrellis/GeoPySpark, and GEE), as well as a hub that allows querying and searching openEO providers for their capabilities and datasets. The project demonstrated this software in a number of use cases, in which identical processing instructions were sent to different implementations, allowing comparison of the returned results.</p><p>A follow-up, ESA-funded project, “openEO Platform”, realizes the API and progresses the software ecosystem into operational services and applications that are accessible to everyone, that involve federated deployment (using the clouds managed by EODC, Terrascope, CreoDIAS and EuroDataCube), that will provide payment models (“pay per compute job”) conceived and implemented following the user community’s needs, and that will use the EOSC (European Open Science Cloud) marketplace for dissemination and authentication. A wide range of large-scale case studies will demonstrate the ability of the openEO Platform to scale to large data volumes. The case studies to be addressed include on-demand ARD generation for SAR and multi-spectral data, agricultural demonstrators such as crop type and condition monitoring, forestry services such as near-real-time forest damage assessment and canopy cover mapping, environmental hazard monitoring of floods and air pollution, and security applications in terms of vessel detection in the Mediterranean Sea.</p><p>While the landscape of cloud-based EO platforms and services has matured and diversified over the past decade, we believe there are strong advantages for scientists and government agencies in adopting the openEO approach. Beyond the absence of vendor/platform lock-in or EULAs, we note the abilities to (i) run arbitrary user code (e.g. written in R or Python) close to the data, (ii) carry out scientific computations on an entirely open source software stack, (iii) integrate different platforms (e.g., different cloud providers offering different datasets), and (iv) help create and extend this software ecosystem. openEO uses the OpenAPI standard, aligns with modern OGC API standards, and uses STAC (the SpatioTemporal Asset Catalog) to describe image collections and image tiles.</p>
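The core abstraction the abstract describes, treating an image collection as a cube and reducing it along a chosen dimension, can be illustrated independently of any openEO back-end. The toy cube and reducer below are assumptions for illustration only; real back-ends evaluate such process graphs lazily and at scale:

```python
# Sketch of the data-cube processing pattern that openEO exposes:
# a spatio-temporal cube (here shaped [time][y][x] as nested lists)
# reduced along the time dimension with a user-supplied reducer.
# This toy version only illustrates the abstraction.

def reduce_time(cube, reducer):
    """Apply `reducer` to each pixel's time series, collapsing the
    time dimension of a cube shaped [time][y][x] to a [y][x] grid."""
    n_t = len(cube)
    n_y = len(cube[0])
    n_x = len(cube[0][0])
    return [[reducer([cube[t][y][x] for t in range(n_t)])
             for x in range(n_x)]
            for y in range(n_y)]

def mean(values):
    return sum(values) / len(values)

cube = [
    [[1.0, 2.0], [3.0, 4.0]],   # t = 0
    [[3.0, 2.0], [5.0, 0.0]],   # t = 1
]
print(reduce_time(cube, mean))  # [[2.0, 2.0], [4.0, 2.0]]
```

Swapping `mean` for `max`, `min`, or a fitted trend is what makes the pattern composable: the user supplies the per-pixel computation and the platform handles iteration, tiling, and data access.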


Data ◽  
2019 ◽  
Vol 4 (3) ◽  
pp. 94 ◽  
Author(s):  
Steve Kopp ◽  
Peter Becker ◽  
Abhijit Doshi ◽  
Dawn J. Wright ◽  
Kaixi Zhang ◽  
...  

Earth observation imagery has traditionally been expensive and difficult to find and access, and has required specialized skills and software to transform it into actionable information. This has limited adoption by the broader science community. Changes in the cost of imagery and in computing technology over the last decade have enabled a new approach to organizing, analyzing, and sharing Earth observation imagery, broadly referred to as a data cube. The vision and promise of image data cubes is to lower these hurdles and expand the user community by making analysis-ready data readily accessible and by providing modern approaches to more easily analyze and visualize the data, empowering a larger community of users to improve their knowledge of place and make better-informed decisions. Image data cubes are large collections of temporal, multivariate datasets, typically consisting of analysis-ready multispectral Earth observation data. Several flavors and variations of data cubes have emerged. To simplify access for end users, we developed a flexible approach that supports multiple data cube styles, referencing images in their existing structure and storage location and enabling fast access, visualization, and analysis from a wide variety of web and desktop applications. We provide here an overview of that approach and three case studies.


Author(s):  
Gregory Giuliani ◽  
Bruno Chatenoux ◽  
Thomas Piller ◽  
Frédéric Moser ◽  
Pierre Lacroix

2003 ◽  
Vol 48 (3) ◽  
pp. 411-423 ◽  
Author(s):  
Francesca Parrini ◽  
Stefano Grignolio ◽  
Siriano Luccarini ◽  
Bruno Bassano ◽  
Marco Apollonio
