Measuring Change Using Quantitative Differencing of Repeat Structure-From-Motion Photogrammetry: The Effect of Storms on Coastal Boulder Deposits

2019 ◽  
Vol 12 (1) ◽  
pp. 42 ◽  
Author(s):  
Timothy Nagle-McNaughton ◽  
Rónadh Cox

Repeat photogrammetry is increasingly the go-to tool for long-term geomorphic monitoring, but quantifying the differences between structure-from-motion (SfM) models is a developing field. Volumetric differencing software (such as the open-source package CloudCompare) provides an efficient mechanism for quantifying change in landscapes. In this case study, we apply this methodology to coastal boulder deposits on Inishmore, Ireland. Storm waves are known to move these rocks, but boulder transportation and the evolution of the deposits are not well documented. We used two disparate SfM data sets for this analysis: the first model was built from imagery captured in 2015 using a GoPro Hero 3+ camera (fisheye lens), and the second used 2017 imagery from a DJI FC300X camera (standard digital single-lens reflex (DSLR) camera). We used CloudCompare to measure the differences between them. This study produced two noteworthy findings. First, volumetric differencing reveals that short-term changes in boulder deposits can be larger than expected, and that frequent monitoring can reveal not only the scale but also the complexities of boulder transport in this setting. This is a valuable addition to our growing understanding of coastal boulder deposits. Second, SfM models generated by different imaging hardware can be successfully compared at sub-decimeter resolution, even when one of the camera systems has substantial lens distortion. This means that older image sets, which might not otherwise be considered of appropriate quality for co-analysis with more recent data, should not be ignored as data sources in long-term monitoring studies.
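The core of the volumetric-differencing step can be illustrated with a minimal cloud-to-cloud distance computation. The sketch below uses synthetic point clouds and SciPy's k-d tree rather than CloudCompare itself; the function and variable names are illustrative, not taken from the study:

```python
import numpy as np
from scipy.spatial import cKDTree

def cloud_to_cloud_distance(reference, compared):
    """For each point in `compared`, return the distance to its
    nearest neighbour in `reference`."""
    tree = cKDTree(reference)
    distances, _ = tree.query(compared)
    return distances

# Synthetic example: a flat "boulder surface" vs. the same surface
# raised uniformly by 0.05 m (standing in for two survey epochs)
rng = np.random.default_rng(0)
xy = rng.uniform(0, 10, size=(1000, 2))
cloud_2015 = np.column_stack([xy, np.zeros(len(xy))])
cloud_2017 = np.column_stack([xy, np.full(len(xy), 0.05)])

d = cloud_to_cloud_distance(cloud_2015, cloud_2017)
# Every point moved 0.05 m vertically, so all distances are ~0.05
```

CloudCompare's own distance tools (e.g. its cloud-to-cloud comparison and the M3C2 plugin) are considerably more sophisticated, handling surface orientation and registration error, but the nearest-neighbour distance above is the basic quantity they build on.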

2012 ◽  
Vol 4 (1) ◽  
pp. 91-100 ◽  
Author(s):  
K. Vaníček ◽  
L. Metelka ◽  
P. Skřivánková ◽  
M. Staněk

Abstract. Homogenized data series of total ozone measurements taken by the regularly and well-calibrated Dobson and Brewer spectrophotometers at Hradec Králové (Czech Republic) and data from the ERA-40 and ERA-Interim re-analyses were merged and compared to investigate differences between the particular data sets originating in Central Europe, in the Northern Hemisphere (NH) mid-latitudes. The Dobson-to-Brewer transfer function and the algorithm for approximating the re-analysis data were developed, tested, and applied to create an instrumentally consistent and complete total ozone data series covering the 50-year observation period 1961–2010. This correction reduced the well-known seasonal differences between Dobson and Brewer data to below the 1% calibration limit of the spectrophotometers. Incorporation of ERA-40 and ERA-Interim total ozone data on days with missing measurements significantly improved the completeness and reliability of the data series, mainly in the first two decades of the period concerned. Consistent behaviour of the original and corrected/merged data sets was found in the pre-ozone-hole period (1961–1985). In the post-Pinatubo era (1994–2010), the data series show seasonal differences that can introduce uncertainty into estimates of ozone recovery, mainly in the winter–spring season, when the effect of the Montreal Protocol and its Amendments is expected. All the data sets also confirm substantial ozone depletion in the summer months, which raises the question of its origin. The merged and completed total ozone data series will be further analyzed to quantify chemical ozone losses and the contribution of natural atmospheric processes to ozone depletion over the region. This case study points out the importance of selecting and evaluating the quality and consistency of the input data sets used in estimating long-term ozone changes, including recovery of the ozone layer over the selected areas.
Data are available from the PANGAEA database at doi:10.1594/PANGAEA.779819.
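The gap-filling step described above, in which re-analysis values stand in for days without spectrophotometer measurements, can be sketched in a few lines of pandas. The numbers and series names below are hypothetical, not the Hradec Králové data:

```python
import numpy as np
import pandas as pd

# Hypothetical daily total ozone (Dobson units); NaN marks days
# without a spectrophotometer measurement
dates = pd.date_range("1961-01-01", periods=10, freq="D")
dobson = pd.Series(
    [310, np.nan, 305, np.nan, 298, 300, np.nan, 295, 297, 302.0],
    index=dates,
)
era40 = pd.Series(300.0, index=dates)  # stand-in for re-analysis values

# Keep observations where available; fill measurement gaps from re-analysis
merged = dobson.fillna(era40)
```

In the actual study the re-analysis values would first be passed through the approximation algorithm (and the Dobson data through the transfer function) before merging, so that all values are instrumentally consistent.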


2012 ◽  
Vol 5 (1) ◽  
pp. 445-473
Author(s):  
K. Vaníček ◽  
L. Metelka ◽  
P. Skřivánková ◽  
M. Staněk

Abstract. Homogenized data series of total ozone measurements taken by the regularly and well-calibrated Dobson and Brewer spectrophotometers at Hradec Králové (Czech Republic) and data from the ERA-40 and ERA-Interim re-analyses were assimilated and combined to investigate differences between the particular data sets over Central Europe, in the NH mid-latitudes. The Dobson-to-Brewer transfer function and the algorithm for approximating the re-analysis data were developed, tested, and applied to create an instrumentally consistent and complete total ozone data series covering the 50-year observation period 1961–2010. The assimilation reduced the well-known seasonal differences between Dobson and Brewer data to below the 1% calibration limit of the spectrophotometers. Incorporation of ERA-40 and ERA-Interim total ozone data on days with missing measurements significantly improved the completeness and reliability of the data series, mainly in the first two decades of the period concerned. Consistent behaviour of the original and assimilated data sets was found in the pre-ozone-hole period (1961–1985). In the post-Pinatubo era (1994–2010), the data series show seasonal differences that can introduce uncertainty into estimates of ozone recovery, mainly in the winter–spring season, when the effect of the Montreal Protocol and its Amendments is expected. All the data sets also confirm substantial ozone depletion in the summer months, which raises the question of its origin. The assimilated and completed total ozone data series will be further analyzed to quantify chemical ozone losses and the contribution of natural atmospheric processes to ozone depletion over the region. This case study points out the importance of selecting and evaluating the quality and consistency of the input data sets used in estimating long-term ozone changes, including recovery of the ozone layer over the selected areas.
Data are available from the PANGAEA database at http://dx.doi.org/10.1594/PANGAEA.779819.


2015 ◽  
Vol 15 (10) ◽  
pp. 2209-2225 ◽  
Author(s):  
M. P. Wadey ◽  
J. M. Brown ◽  
I. D. Haigh ◽  
T. Dolphin ◽  
P. Wisse

Abstract. The extreme sea levels and waves experienced around the UK's coast during the 2013/14 winter caused extensive coastal flooding and damage. Coastal managers seek to place such extremes in relation to the anticipated standards of flood protection and the long-term recovery of the natural system. In this context, return periods are often used as a form of guidance. This paper provides these levels for the winter storms and discusses their application to the given data sets for two UK case study sites: Sefton, northwest England, and Suffolk, east England. Tide gauge records and wave buoy data were used to compare the 2013/14 storms with return periods from a national data set, and joint probabilities of sea level and wave height, incorporating the recent events, were also generated. The 2013/14 high waters and waves were extreme due to the number of events, as well as the extremity of the 5 December 2013 "Xaver" storm, which had a high return period at both case study sites. The national-scale impact of this event was due to its coincidence with spring high tide at multiple locations. Given that this event is such an outlier in the joint probability analyses of these observed data sets, and that the season saw several events in close succession, coastal defences appear to have provided a good level of protection. This type of assessment could in the future be recorded alongside defence performance and upgrades. Ideally, other variables (e.g. river levels at estuarine locations) would also be included, with appropriate offsetting for local trends (e.g. mean sea-level rise), so that the storm-driven component of coastal flood events can be determined. This could allow long-term comparison of storm severity and an assessment of how sea-level rise influences return levels over time, which is important for the consideration of coastal resilience in strategic management plans.
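As a rough illustration of how return periods relate observed extremes to their expected recurrence, the sketch below applies the Weibull plotting position to a hypothetical set of annual-maximum sea levels. This is illustrative only; the paper's analysis uses national data sets and joint-probability methods rather than this simple empirical estimate:

```python
import numpy as np

def empirical_return_periods(annual_maxima):
    """Weibull plotting-position return periods (in years) for a
    series of annual maxima: T = (n + 1) / rank, rank 1 = largest."""
    x = np.sort(np.asarray(annual_maxima))[::-1]  # largest first
    ranks = np.arange(1, len(x) + 1)
    T = (len(x) + 1) / ranks
    return x, T

# Hypothetical annual-maximum sea levels (m above local datum)
levels = [2.1, 2.4, 2.0, 2.9, 2.3, 2.2, 2.6, 2.5, 2.35, 2.15]
x, T = empirical_return_periods(levels)
# With 10 years of data, the largest event (2.9 m) is assigned an
# empirical return period of (10 + 1) / 1 = 11 years
```

Estimating return periods well beyond the record length (e.g. a 1-in-200-year level from a few decades of data) requires fitting an extreme-value distribution to the maxima, which is why outliers such as the "Xaver" storm are so consequential for the fitted curves.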


2001 ◽  
Vol 58 (11) ◽  
pp. 2284-2297 ◽  
Author(s):  
E Rivot ◽  
E Prévost ◽  
E Parent

We present a Bayesian approach to Ricker stock-recruitment (S/R) analysis that accounts for measurement errors in S/R data. We assess the sensitivity of posterior inferences to (i) the choice of Ricker model parameterization, with special regard to management-related ones, and (ii) prior parameter distributions. Closed forms for the Ricker parameter posterior distributions exist when the S/R data are known without error. We use this property to develop a procedure based on the Rao–Blackwell formula. This procedure integrates over measurement errors by averaging these closed forms over possible S/R data sets sampled from distributions derived from a stochastic model relating field data to the S and R variables. High-quality Bayesian estimates are obtained, and the analysis of the influence of different parameterizations and priors is made easier. We illustrate our methodological approach with a case study of Atlantic salmon (Salmo salar). Posterior distributions for S and R are computed from a mark–recapture stochastic model. Ignoring measurement errors underestimates parameter uncertainty and overestimates both stock productivity and density dependence. We warn against using management-related parameterizations because they carry the strong prior assumption of long-term stock sustainability. Posterior inferences are sensitive to the choice of prior; the use of informative priors as a remedy is discussed.
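The averaging-over-possible-data-sets idea can be sketched in miniature: draw plausible "true" S/R data sets from an assumed measurement-error model around the observations, compute a conditional estimate for each draw, and average. The toy version below substitutes an ordinary least-squares fit of the linearized Ricker model, log(R/S) = a − bS, for the closed-form Bayesian posterior, and all numbers (parameters, error level, data) are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "true" Ricker parameters and noise-free S/R observations
a_true, b_true = 2.0, 0.001
S_obs = np.array([500.0, 1000.0, 1500.0, 2000.0, 2500.0])
R_obs = S_obs * np.exp(a_true - b_true * S_obs)

cv = 0.1       # assumed lognormal measurement-error scale on S and R
n_rep = 2000   # number of sampled data sets to average over
a_draws = np.empty(n_rep)
for i in range(n_rep):
    # Sample one plausible "true" data set around the observations
    S = S_obs * rng.lognormal(0.0, cv, S_obs.size)
    R = R_obs * rng.lognormal(0.0, cv, R_obs.size)
    # Conditional point estimate given this data set:
    # OLS fit of log(R/S) = a - b*S (slope estimates -b)
    slope, a_hat = np.polyfit(S, np.log(R / S), 1)
    a_draws[i] = a_hat

a_est = a_draws.mean()  # average over sampled data sets, close to a_true
```

In the paper's procedure, the quantity averaged per sampled data set is the closed-form posterior itself (the Rao–Blackwell step), not a point estimate, and the data sets are drawn from the mark–recapture observation model; the sketch only conveys the structure of the computation.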


Author(s):  
Eliana Trinaistic

In Canada, the non-profit organization (NPO) and settlement sectors are increasingly re-examining their responsibility for service delivery and service design. With growing interest in understanding how to include design principles and an "innovation" mindset in addressing the long-term outcomes of social services, new instruments are being introduced as a way to experiment with different modes of engagement among the various stakeholders. The aim of community hackathons or civic hacks (a derivative of tech gatherings customized to fit public engagement) is to collaboratively rethink, redesign, and resolve a range of social and policy issues that communities face, from settlement and the environment to health and legal services. Although hackathons and civic hacks aspire to be democratic, relationship-driven instruments aligned with non-profit principles of inclusion and diversity, they are also risky propositions from the perspective of non-profit organizational culture in Canada, in that they tend to lack solid structure, clear rules, and fixed outcomes. Despite the challenges, the promise of innovation is too attractive to be disregarded, and some non-profits are embarking (with or without the government's help) on incorporating hackathons into their toolkits. This case study presents a practitioner's perspective on the outcomes of two community hackathons, one exploring migration data sets and the other on language policy innovation, co-developed between 2016 and 2019 by MCIS Language Solutions, a Toronto-based not-for-profit social enterprise, with various partners. The case study examines how the hackathon as an instrument can help settlement sectors and governments foster non-profit innovation and rethink the trajectory of taking solutions to scale.


2015 ◽  
Vol 3 (4) ◽  
pp. 2665-2708 ◽  
Author(s):  
M. P. Wadey ◽  
J. M. Brown ◽  
I. D. Haigh ◽  
T. Dolphin ◽  
P. Wisse

Abstract. The extreme sea levels and waves experienced around the UK's coast during the 2013/2014 winter caused extensive coastal flooding and damage. In such circumstances, coastal managers seek to place such extremes in relation to the anticipated standards of flood protection and the long-term recovery of the natural system. In this context, return periods are often used as a form of guidance. We therefore provide these levels for the winter storms and discuss their application to the given data sets and case studies (two UK case study sites: Sefton, northwest England, and Suffolk, east England). We use tide gauge records and wave buoy data to compare the 2013/2014 storms with return periods from a national data set, and also generate joint probabilities of sea level and waves, incorporating the recent events. The UK was hit at a national scale by the 2013/2014 storms, although the return periods differ with location. We also note that the 2013/2014 high waters and waves were extreme due to the number of events, as well as the extremity of the 5 December 2013 "Xaver" storm, which had a very high return period at both case study sites. Our return period analysis shows that the national-scale impact of this event is due to its coincidence with spring high tide at multiple locations as the tide and storm propagated across the continental shelf. Given that this event is such an outlier in the joint probability analyses of these observed data sets, and that the season saw several events in close succession, coastal defences appear to have provided a good level of protection. This type of assessment should be recorded alongside details of defence performance and upgrades, with other variables (e.g. river levels at estuarine locations) included and appropriate offsetting for linear trends (e.g. mean sea-level rise) so that the storm-driven component of coastal flood events can be determined.
Local offsetting of the mean trends in sea level allows long-term comparison of storm severity and also enables an assessment of how sea level rise is influencing return levels over time, which is important when considering long-term coastal resilience in strategic management plans.


2018 ◽  
Author(s):  
Ricardo Wurmus ◽  
Bora Uyar ◽  
Brendan Osberg ◽  
Vedran Franke ◽  
Alexander Gosdschan ◽  
...  

Abstract. In bioinformatics, as well as other computationally intensive research fields, there is a need for workflows that can reliably produce consistent output, independent of the software environment or the configuration settings of the machine on which they are executed. Indeed, this is essential for controlled comparison between different observations and for the wider dissemination of workflows. Providing this type of reproducibility, however, is often complicated by the need to accommodate the myriad dependencies included in a larger body of software, each of which generally comes in various versions. Moreover, in many fields (bioinformatics being a prime example), these versions are subject to continual change due to rapidly evolving technologies, further complicating problems related to reproducibility. Here, we propose a principled approach to building analysis pipelines and managing their dependencies. As a case study to demonstrate the utility of our approach, we present a set of highly reproducible pipelines for the analysis of RNA-seq, ChIP-seq, Bisulfite-seq, and single-cell RNA-seq data. All pipelines process raw experimental data and generate reports containing publication-ready plots and figures, with interactive report elements and standard observables. Users may install these highly reproducible packages and apply them to their own data sets without any special computational expertise beyond use of the command line. We hope such a toolkit will provide immediate benefit to laboratory workers wishing to process their own data sets and to bioinformaticians seeking to automate all, or parts of, their analyses. In the long term, we hope our approach to reproducibility will serve as a blueprint for reproducible workflows in other areas. Our pipelines, along with their corresponding documentation and sample reports, are available at http://bioinformatics.mdc-berlin.de/pigx


2016 ◽  
Vol 5 (2) ◽  
pp. 70
Author(s):  
Thiago Souza Araujo ◽  
Silvio Dagoberto Orsatto ◽  
Aires Jose Rover ◽  
Gertrudes A. Dandolini

This paper aims to present improvements in public management in the Brazilian judiciary. A qualitative theoretical-empirical study, based on data sets and interviews provided by the Brazilian judiciary, shows how specialization and multidisciplinary teams can affect efficiency and efficacy in resolving lawsuits. Observed innovations include practices such as a systemic view and the adoption of multidisciplinary teams (co-production) applied to the public service. Long-term planning and preventive practices that tend to reduce demand for public services are also shown. Better results emerged when specialization and multidisciplinary work were applied together. In this period, the organizational unit observed reduced conviction amounts to a quarter of those of similarly sized units, on average. This means that the State saved millions of dollars (reported in local currency) in judicial convictions, funds that may now be spent in other areas such as education or infrastructure. The average lawsuit resolution time fell from 729 to 60 days, and the median from 665 to 11 days. These results suggest that a lawsuit handled by a specialized, multidisciplinary team is more likely to show improved quantitative and qualitative results.


2020 ◽  
Vol 29 (4) ◽  
pp. 2049-2067
Author(s):  
Karmen L. Porter ◽  
Janna B. Oetting ◽  
Loretta Pecchioni

Purpose: This study examined caregiver perceptions of their child's language and literacy disorder as influenced by communications with their speech-language pathologist. Method: The participants were 12 caregivers of 10 school-aged children with language and literacy disorders. Employing qualitative methods, a collective case study approach was utilized in which the caregiver(s) of each child represented one case. The data came from semistructured interviews, codes emerged directly from the caregivers' responses during the interviews, and multiple coding passes using ATLAS.ti software were made until themes were evident. These themes were then further validated by conducting clinical file reviews and follow-up interviews with the caregivers. Results: Caregivers' comments focused on the types of information received or not received, as well as the clarity of the information. This included information regarding their child's diagnosis, the long-term consequences of their child's disorder, and the connection between language and reading. Although caregivers were adept at describing their child's difficulties and therapy goals/objectives, their comments indicated that they struggled to understand their child's disorder in a way that was meaningful to them and their child. Conclusions: The findings showed the value caregivers place on receiving clear and timely diagnostic information, as well as the complexity associated with caregivers' understanding of language and literacy disorders. The findings are discussed in terms of changes that could be made in clinical practice to better support children with language and literacy disorders and their families.

