Measuring gravity on ice: An example from Wanapitei Lake, Ontario, Canada

Geophysics ◽  
2006 ◽  
Vol 71 (3) ◽  
pp. J23-J29 ◽  
Author(s):  
Hernan A. Ugalde ◽  
Elizabeth L’Heureux ◽  
Richard Lachapelle ◽  
Bernd Milkereit

Large lakes have always posed a problem for regional gravity databases: the difficulty of access leaves gaps or coarse station spacing in the sampling. Satellite, airborne, and shipborne gravity techniques are options, but the resolution and/or cost of these systems makes them impractical or insufficiently accurate for exploration or environmental studies, where the required resolution is [Formula: see text]. In this study, we assess the feasibility of a ground gravity survey over a frozen lake whose ice moves under windy conditions. Lake Wanapitei, widely accepted as the result of a meteorite impact 37 million years ago, is one such case, in which the need to expand coverage over poorly sampled regions arose from a significant gap between surface and airborne geophysical maps. Two gravity surveys were completed on the ice of Lake Wanapitei in the winters of 2003 and 2004. To study the structure, long time series of gravity field measurements were recorded at 98 stations, allowing improved control over the noise sources in the data. Final processing, integration with an existing regional data set in the area, and the application of terrain corrections reduced the amplitude of the circular anomaly from 15 to [Formula: see text] and its diameter from 11 to [Formula: see text]. We conclude that for large-scale studies such as this one, the quality of gravity data acquired on ice, even under noisy conditions, is acceptable; more detailed mapping, however, requires calm wind conditions and long time series.
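
The noise-suppression idea behind recording long time series at each station can be sketched as follows. This is an illustrative stdlib-only sketch, not the authors' processing code; the function name and the mGal sample values are invented.

```python
import statistics

def station_gravity(samples_mgal):
    """Reduce a long time series of gravimeter readings at one ice station
    to a single value. Averaging suppresses quasi-random noise from
    wind-driven ice motion; the standard error of the mean shrinks as
    1/sqrt(n), which is why longer records help under noisy conditions.
    Returns (mean reading, standard error of the mean), both in mGal."""
    n = len(samples_mgal)
    mean = statistics.fmean(samples_mgal)
    sem = statistics.stdev(samples_mgal) / n ** 0.5
    return mean, sem
```

Doubling the record length under the same noise level reduces the reported standard error by a factor of about 1.4, which is the trade-off the abstract describes between survey time and data quality.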

2021 ◽  
Vol 56 (1) ◽  
pp. 112-130 ◽  
Author(s):  
Haifeng Huang

Abstract. For a long time after China’s opening to the outside world in the late 1970s, admiration for foreign socioeconomic prosperity and quality of life characterized much of Chinese society, contributing to dissatisfaction with the country’s development and government and to a large-scale exodus of students and emigrants to foreign countries. More recently, however, overestimating China’s standing and popularity in the world has become a more conspicuous feature of Chinese public opinion and the social backdrop of the country’s overreach in global affairs in the last few years. This essay discusses the effects of these misperceptions about the world, their potential sources, and the outcomes of correcting them. It concludes that while the world should get China right and not misinterpret China’s intentions and actions, China should also get the world right and form a more balanced understanding of its relationship with the world.


2021 ◽  
Vol 10 (7) ◽  
pp. 436
Author(s):  
Amerah Alghanim ◽  
Musfira Jilani ◽  
Michela Bertolotto ◽  
Gavin McArdle

Volunteered Geographic Information (VGI) is often collected by non-expert users. This raises concerns about the quality and veracity of such data. There has been much effort to understand and quantify the quality of VGI. Extrinsic measures, which compare VGI to authoritative data sources such as National Mapping Agencies, are common, but the cost and slow update frequency of such data hinder the task. On the other hand, intrinsic measures, which compare the data to heuristics or models built from the VGI data itself, are becoming increasingly popular. Supervised machine learning techniques are particularly suitable for intrinsic measures of quality, as they can infer and predict the properties of spatial data. In this article we are interested in assessing the quality of semantic information, such as the road type, associated with data in OpenStreetMap (OSM). We have developed a machine learning approach which utilises new intrinsic input features collected from the VGI dataset. Using our proposed approach we obtained an average classification accuracy of 84.12%, which outperforms existing techniques on the same semantic inference task. The trustworthiness of the data used for developing and training machine learning models is also important. To address this issue, we have developed a new trust measure based on direct and indirect characteristics of OSM data, such as its edit history, along with an assessment of the users who contributed the data. An evaluation of the impact of data determined to be trustworthy shows that trusted data selected with the new approach improves the prediction accuracy of our machine learning technique. Specifically, our results demonstrate that the classification accuracy of our model is 87.75% when applied to a trusted dataset and 57.98% when applied to an untrusted dataset.
Consequently, such results can be used to assess the quality of OSM and suggest improvements to the data set.
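
The supervised inference of semantic tags from intrinsic features can be illustrated with a toy nearest-centroid classifier. This is a minimal sketch only: the paper's actual model and feature set are not given in the abstract, and the features here (lane count, speed limit) and the training data are invented.

```python
import math
from collections import defaultdict

def train_centroids(samples):
    """samples: (feature_vector, road_type) pairs extracted from OSM ways.
    Returns the mean feature vector (centroid) per road type."""
    by_label = defaultdict(list)
    for feats, label in samples:
        by_label[label].append(feats)
    return {label: tuple(sum(col) / len(col) for col in zip(*vecs))
            for label, vecs in by_label.items()}

def predict(centroids, feats):
    """Assign the road type whose centroid is nearest in feature space."""
    return min(centroids, key=lambda label: math.dist(feats, centroids[label]))
```

A real pipeline would use richer intrinsic features and a stronger classifier, but the structure is the same: features derived from the VGI data itself, with no authoritative reference map required.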


2015 ◽  
Vol 8 (1) ◽  
pp. 421-434 ◽  
Author(s):  
M. P. Jensen ◽  
T. Toto ◽  
D. Troyan ◽  
P. E. Ciesielski ◽  
D. Holdridge ◽  
...  

Abstract. The Midlatitude Continental Convective Clouds Experiment (MC3E) took place during the spring of 2011, centered in north-central Oklahoma, USA. The main goal of this field campaign was to capture the dynamical and microphysical characteristics of precipitating convective systems in the US Central Plains. A major component of the campaign was a six-site radiosonde array designed to capture the large-scale variability of the atmospheric state, with the intent of deriving model forcing data sets. Over the course of the 46-day MC3E campaign, a total of 1362 radiosondes were launched from the enhanced sonde network. This manuscript provides details on the instrumentation used as part of the sounding array, the data processing activities, including quality checks and humidity bias corrections, and an analysis of the impact of bias corrections and algorithm assumptions on the determination of convective levels and indices. It is found that corrections for known radiosonde humidity biases, and assumptions regarding the characteristics of the surface convective parcel, result in significant differences in the derived values of convective levels and indices in many soundings. In addition, the impact of including the humidity corrections and quality controls on the thermodynamic profiles used in the derivation of a large-scale model forcing data set is investigated. The results show a significant impact on the derived large-scale vertical velocity field, illustrating the importance of addressing these humidity biases.
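
The sensitivity of derived convective levels to humidity corrections can be illustrated with Espy's classic approximation for the lifting condensation level. This textbook formula is used here only as a stand-in, not the campaign's actual parcel algorithm, and the bias magnitude is invented.

```python
def lcl_height_m(t_c, td_c):
    """Espy's approximation: the lifting condensation level sits roughly
    125 m above the surface for every degree Celsius of dewpoint
    depression (T - Td)."""
    return 125.0 * (t_c - td_c)

# A radiosonde dry bias understates the dewpoint Td; correcting it
# raises Td and therefore lowers the derived LCL for the surface parcel.
uncorrected = lcl_height_m(30.0, 19.0)  # with a hypothetical 2 C dry bias
corrected = lcl_height_m(30.0, 21.0)    # after the humidity correction
```

Even this crude formula shows why a degree-scale humidity bias shifts a convective level by hundreds of meters, which is the kind of impact the manuscript quantifies with full thermodynamic profiles.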


2017 ◽  
Vol 10 (5) ◽  
pp. 2031-2055 ◽  
Author(s):  
Thomas Schwitalla ◽  
Hans-Stefan Bauer ◽  
Volker Wulfmeyer ◽  
Kirsten Warrach-Sagi

Abstract. Increasing computational resources and the demands of impact modelers, stakeholders, and society call for seasonal and climate simulations at convection-permitting resolution. So far, such resolution has only been achieved with limited-area models, whose results are affected by the zonal and meridional boundaries. Here, we present the setup of a latitude-belt domain that reduces disturbances originating from the western and eastern boundaries and therefore allows the impact of model resolution and physical parameterization to be studied. The Weather Research and Forecasting (WRF) model, coupled to the NOAH land-surface model, was run for July and August 2013 at two horizontal resolutions, namely 0.03° (HIRES) and 0.12° (LOWRES). Both simulations were forced by the European Centre for Medium-Range Weather Forecasts (ECMWF) operational analysis data at the northern and southern domain boundaries, and by the high-resolution Operational Sea Surface Temperature and Sea Ice Analysis (OSTIA) data at the sea surface. The simulations are compared to the operational ECMWF analysis for the representation of large-scale features. To analyze the simulated precipitation, the operational ECMWF forecast, the CPC MORPHing technique (CMORPH), and the ENSEMBLES gridded observational precipitation data set (E-OBS) were used as references. Analyzing pressure, geopotential height, wind, and temperature fields as well as precipitation revealed (1) a benefit of the higher resolution in reduced monthly biases and root mean square errors and an improved Pearson skill score, and (2) deficiencies in the physical parameterizations leading to notable biases in distinct regions, such as the polar Atlantic for the LOWRES simulation and the North Pacific and Inner Mongolia for both resolutions. In summary, applying a latitude-belt domain at convection-permitting resolution shows promising results that are beneficial for future seasonal forecasting.
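
The verification scores named in the abstract (monthly bias, root mean square error, Pearson correlation) can be computed as below. This is a generic stdlib-only sketch with invented sample series, not the study's verification code.

```python
import math
import statistics

def verification_scores(model, obs):
    """Bias, RMSE, and Pearson correlation between a model series and an
    observation series of equal length (e.g. monthly means at grid points)."""
    n = len(model)
    bias = sum(m - o for m, o in zip(model, obs)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / n)
    mm, mo = statistics.fmean(model), statistics.fmean(obs)
    cov = sum((m - mm) * (o - mo) for m, o in zip(model, obs))
    r = cov / math.sqrt(sum((m - mm) ** 2 for m in model)
                        * sum((o - mo) ** 2 for o in obs))
    return bias, rmse, r
```

A resolution benefit of the kind reported would show up as a smaller |bias| and RMSE, and a higher r, for the HIRES series against the same reference.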


Author(s):  
Dariusz AMPUŁA

A statistical analysis of multiyear laboratory test results for No. 8 artillery tracers is presented in this article. The analysis was aimed at testing the impact of the natural ageing process on quality indicators during the long-term storage of these tracers. The influence of storage time on the diagnostic decision made about lot quality after laboratory tests, and on the different classes of inconsistencies that occurred during these tests, was analysed. A detailed analysis of the impact of storage time on the diagnostic shooting decisions taken was also presented. The statistical analysis suggests that the evaluation module in the existing test methodology can be changed. Modifying this evaluation module will not negatively affect the quality of further diagnostic tests, nor the correct evaluation of the prediction process for tested ammunition elements such as artillery tracers. The statistical analysis carried out in this article may therefore have a significant impact on the modification of the test methodology for artillery tracers.


2009 ◽  
Vol 2 (1) ◽  
pp. 87-98 ◽  
Author(s):  
C. Lerot ◽  
M. Van Roozendael ◽  
J. van Geffen ◽  
J. van Gent ◽  
C. Fayt ◽  
...  

Abstract. Total O3 columns have been retrieved from six years of SCIAMACHY nadir UV radiance measurements using SDOAS, an adaptation of the GDOAS algorithm previously developed at BIRA-IASB for the GOME instrument. GDOAS and SDOAS have been implemented by the German Aerospace Center (DLR) in version 4 of the GOME Data Processor (GDP) and in version 3 of the SCIAMACHY Ground Processor (SGP), respectively. The processors are run at the DLR processing centre on behalf of the European Space Agency (ESA). We first describe the SDOAS algorithm, with particular attention to the impact of uncertainties in the reference O3 absorption cross-sections. Second, the resulting SCIAMACHY total ozone data set is evaluated globally through large-scale comparisons with results from GOME and OMI as well as with ground-based correlative measurements. The various total ozone data sets are found to agree within 2% on average. However, a negative trend of 0.2–0.4% per year has been identified in the SCIAMACHY O3 columns; it probably originates from instrumental degradation effects that have not yet been fully characterized.
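
A drift figure like the 0.2–0.4% per year quoted above is typically an ordinary least-squares slope expressed relative to the mean column. The sketch below illustrates that diagnostic with invented numbers; it is not the study's trend analysis.

```python
import statistics

def trend_percent_per_year(years, columns):
    """Ordinary least-squares slope of a total-column time series,
    expressed as percent of the mean column per year."""
    my, mc = statistics.fmean(years), statistics.fmean(columns)
    slope = (sum((y - my) * (c - mc) for y, c in zip(years, columns))
             / sum((y - my) ** 2 for y in years))
    return 100.0 * slope / mc
```

Applied to annual-mean columns, a persistent negative value of this statistic that is absent from the correlative data sets points to instrumental degradation rather than a geophysical change.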


2021 ◽  
Vol 10 (9) ◽  
pp. 144-147
Author(s):  
Huiling LI ◽  
Xuan SU ◽  
Shuaipeng ZHANG

Massive amounts of business process event logs are collected and stored by modern information systems. Process discovery aims to derive a process model from such event logs; however, most existing approaches still suffer from low efficiency when facing large-scale logs. Event log sampling techniques provide an effective way to improve the efficiency of process discovery, but existing techniques still cannot guarantee the quality of the mined model. Therefore, a sampling approach based on a set-coverage algorithm, named the set coverage sampling approach, is proposed. The approach has been implemented in the open-source process mining toolkit ProM. Furthermore, experiments on a real event log, evaluated through conformance checking and time performance analysis, show that the proposed sampling approach can greatly improve the efficiency of process discovery while preserving the quality of the mined model.
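
The set-coverage idea can be sketched with the standard greedy set-cover algorithm over traces. The abstract does not spell out the coverage criterion, so coverage of activities is assumed here purely for illustration.

```python
def set_coverage_sample(traces):
    """Greedy set cover over an event log: repeatedly keep the trace that
    covers the most not-yet-covered activities, until every activity in
    the log appears in the sample. The (usually much smaller) sample is
    then handed to the discovery algorithm instead of the full log."""
    universe = set(a for t in traces for a in t)
    covered, sample = set(), []
    while covered != universe:
        best = max(traces, key=lambda t: len(set(t) - covered))
        sample.append(best)
        covered |= set(best)
    return sample
```

Greedy set cover gives the classical logarithmic approximation guarantee, which is why a small sample can still represent every behaviour class chosen as the coverage target.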


2014 ◽  
Vol 7 (4) ◽  
pp. 5087-5139 ◽  
Author(s):  
R. Pommrich ◽  
R. Müller ◽  
J.-U. Grooß ◽  
P. Konopka ◽  
F. Ploeger ◽  
...  

Abstract. Variations in the mixing ratios of trace gases of tropospheric origin entering the stratosphere in the tropics are of interest for assessing both troposphere-to-stratosphere transport fluxes in the tropics and the impact of these fluxes on the composition of the tropical lower stratosphere. Anomaly patterns of carbon monoxide (CO) and long-lived tracers in the lower tropical stratosphere allow conclusions to be drawn about the rate and variability of tropical upwelling. Here, we present a simplified chemistry scheme for the Chemical Lagrangian Model of the Stratosphere (CLaMS) for simulating, at comparatively low numerical cost, CO, ozone, and long-lived trace substances (CH4, N2O, CCl3F (CFC-11), CCl2F2 (CFC-12), and CO2) in the lower tropical stratosphere. For the long-lived trace substances, the boundary conditions in the lowest model level are prescribed based on ground-based measurements. The boundary condition for CO in the free troposphere is deduced from MOPITT measurements (at ≈ 700–200 hPa). Because this model version lacks a specific representation of mixing and convective uplift in the troposphere, enhanced CO values, in particular those resulting from convective outflow, are underestimated. However, in the tropical tropopause layer and the lower tropical stratosphere, simulated CO agrees relatively well with in-situ measurements (with the exception of the TROCCINOX campaign, where the simulated CO is biased low by ≈ 10–20 ppbv). Further, the model results are of sufficient quality to describe large-scale anomaly patterns of CO in the lower stratosphere. In particular, the zonally averaged tropical CO anomaly patterns (the so-called "tape recorder" patterns) simulated by this version of CLaMS are in good agreement with observations.
The simulations show too-rapid upwelling compared to observations, a consequence of the overestimated vertical velocities in the ERA-Interim reanalysis data set. Moreover, the simulated tropical anomaly patterns of N2O are in good agreement with observations. In the simulations, the anomaly patterns of CH4 and CFC-11 are consistent with those of N2O; for all long-lived tracers, positive anomalies are simulated because of the enhanced tropical upwelling during the easterly phase of the quasi-biennial oscillation.
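
The anomaly patterns compared above are deviations of the zonally averaged tracer series from its time mean at each level. A minimal sketch of that reduction, with invented numbers:

```python
import statistics

def anomaly_patterns(series_by_level):
    """For each pressure level, turn a time series of zonally averaged
    mixing ratios into anomalies about that level's time mean -- the
    quantity plotted against time and height in 'tape recorder' diagrams."""
    out = {}
    for level, values in series_by_level.items():
        mean = statistics.fmean(values)
        out[level] = [v - mean for v in values]
    return out
```

Stacking these per-level anomaly series vertically makes the upward-propagating signal visible, and the slope of the propagating anomalies gives the upwelling rate the abstract compares against observations.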


2018 ◽  
Vol 64 (247) ◽  
pp. 811-821 ◽  
Author(s):  
STEFAN LIPPL ◽  
SAURABH VIJAY ◽  
MATTHIAS BRAUN

ABSTRACT. Despite their importance for mass-balance estimates and the progress of techniques based on optical and thermal satellite imagery, mapping the boundaries of debris-covered glaciers remains a challenging task, and the need for manual corrections hampers regular updates. In this study, we present an automatic approach to delineating glacier outlines using interferometrically derived synthetic aperture radar (InSAR) coherence, slope and morphological operations. InSAR coherence detects the temporally decorrelated surface (e.g. the glacial extent) irrespective of its surface type and separates it from the highly coherent surrounding areas. We tested the impact of different processing settings, for example resolution, coherence window size and topographic phase removal, on the quality of the generated outlines. We found minor influence of the topographic phase, but a combination of strong multi-looking during interferogram generation and additional averaging during coherence estimation strongly deteriorated the coherence at the glacier edges. We also analysed the performance of X-, C- and L-band radar data. The C-band Sentinel-1 data outlined the glacier boundary with the fewest misclassifications and a type II error of 0.47% compared with Global Land Ice Measurements from Space inventory data. Our study shows the potential of the Sentinel-1 mission, together with our automatic processing chain, to provide regular updates for land-terminating glaciers on a large scale.
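
The combination of thresholding and morphological clean-up described above can be sketched on a small raster. This is an illustrative stand-in for the paper's processing chain: the thresholds, the 4-neighbourhood structuring element, and the single opening step are all assumptions, not values from the study.

```python
def glacier_mask(coherence, slope, coh_max=0.4, slope_max=25.0):
    """Classify pixels as glacier where InSAR coherence is low (the ice
    surface decorrelates between acquisitions) and terrain slope is
    moderate, then apply one morphological opening (erosion followed by
    dilation, 4-neighbourhood) to remove isolated misclassified pixels."""
    rows, cols = len(coherence), len(coherence[0])
    raw = [[coherence[r][c] < coh_max and slope[r][c] < slope_max
            for c in range(cols)] for r in range(rows)]

    def morph(grid, combine):
        # combine=all -> erosion; combine=any -> dilation
        out = [[False] * cols for _ in range(rows)]
        for r in range(rows):
            for c in range(cols):
                neigh = [grid[r][c]]
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    neigh.append(grid[rr][cc]
                                 if 0 <= rr < rows and 0 <= cc < cols
                                 else False)
                out[r][c] = combine(neigh)
        return out

    return morph(morph(raw, all), any)
```

Note that an opening with this structuring element also erodes patch edges, which is the kind of boundary degradation the study traces to over-aggressive averaging during coherence estimation.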


2021 ◽  
Vol 263 (5) ◽  
pp. 1186-1193
Author(s):  
Yoshiharu Soeta ◽  
Ei Onogawa

Air conditioners are widely used in buildings to maintain thermal comfort over long periods. They produce sound during operation and are regarded as one of the main noise sources in buildings. Most sounds produced by air conditioners do not fluctuate over time, and the sound quality of these steady sounds has been evaluated previously. However, air conditioners sometimes produce low-level, impulsive sounds, and customers complain that such sounds are annoying when they are sleeping or spending quiet time in the living room. The aim of this study was to determine the factors that significantly influence psycho-physiological responses to the low-level impulsive sounds produced by air conditioners. We assessed the A-weighted equivalent continuous sound pressure level (LAeq) and factors extracted from the autocorrelation function (ACF). Subjective loudness, sharpness and annoyance, as well as electroencephalography (EEG) responses, were evaluated. Multiple regression analyses were performed using a linear combination of LAeq, the ACF factors, and their standard deviations. The results indicated that LAeq, the delay time of the first maximum ACF peak, the width of the first decay of the ACF, and the magnitude and width of the interaural cross-correlation function (IACF) could predict psycho-physiological responses to air conditioner sounds.
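
Two of the predictors named above, the delay and magnitude of the first maximum ACF peak, can be extracted as sketched below. This is a generic stdlib-only illustration of the ACF factors, not the study's analysis code; the test signal is an invented sinusoid.

```python
import math

def first_acf_peak(signal):
    """Normalized autocorrelation of a mean-removed signal, returning the
    delay (in samples) and magnitude of its first local maximum after
    lag zero -- counterparts of the ACF factors used as regression
    predictors. Returns (None, None) if no peak is found."""
    n = len(signal)
    mean = sum(signal) / n
    x = [v - mean for v in signal]
    energy = sum(v * v for v in x)
    acf = [sum(x[i] * x[i + lag] for i in range(n - lag)) / energy
           for lag in range(n // 2)]
    for lag in range(1, len(acf) - 1):
        if acf[lag - 1] < acf[lag] >= acf[lag + 1]:
            return lag, acf[lag]
    return None, None
```

For a tonal component, the peak delay corresponds to the dominant period (and hence perceived pitch), while the peak magnitude indicates how tonal versus noise-like the sound is, which is why both correlate with subjective responses.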

