Past Bank Erosion as a Guide for Bank Erosion Prediction at Pipeline Crossings

Author(s):  
Laurent Roberge ◽  
Gerald Ferris ◽  
Hamish Weatherly

This paper presents a methodology that uses past bank erosion behaviour as a predictor of future performance. The bank erosion study consists of the following main steps: identifying a reach to examine, classifying the watercourse, estimating key hydrotechnical properties, obtaining historical air photographs of the reach, georeferencing or orthorectifying the air photos, mapping the position of the channel edge, obtaining historical records from nearby gauges to estimate the return periods of floods that occurred between successive pairs of air photographs, and finally combining the results to correlate rates of bank erosion with the rarity of the floods that occurred. More than 70 bank erosion studies have been completed over the past two years at a variety of watercourses. This paper provides three case histories that illustrate the methodology and then presents tentative relationships that could be used both to focus future bank erosion studies on the most active sites and to provide a preliminary estimate of the bank erosion to be expected, in design settings as well as in integrity evaluations of existing pipelines. In this study, wandering rivers were more laterally active than other channel pattern types. Although the smallest floods did not cause large-scale changes to the banks, significant bank erosion was caused by both moderate (20-year) and extreme (100-year) events, with a rough trend toward larger bank erosion in larger floods. No significant correlation was found between the time elapsed between successive air photos and the magnitude of erosion, suggesting that bank erosion is an event-driven rather than time-dependent process.
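As an illustration of the final step of the methodology, the minimal Python sketch below pairs each interval between successive air photos with the largest flood return period observed in it and fits a simple trend between flood rarity and measured bank erosion. The field names and numbers are invented placeholders, not data from the paper.

```python
import numpy as np
from scipy import stats

# Hypothetical records: for each pair of successive air photos at a site,
# the largest flood return period (years) in the interval and the bank
# erosion measured from the mapped channel edges (metres). Values invented.
intervals = [
    {"return_period_yr": 5,   "erosion_m": 2.0},
    {"return_period_yr": 20,  "erosion_m": 11.0},
    {"return_period_yr": 50,  "erosion_m": 18.0},
    {"return_period_yr": 100, "erosion_m": 24.0},
]

# Correlate erosion with flood rarity (log of return period), mirroring the
# rough trend toward larger bank erosion in larger floods noted in the paper.
x = np.log10([r["return_period_yr"] for r in intervals])
y = [r["erosion_m"] for r in intervals]
slope, intercept, r_value, p_value, stderr = stats.linregress(x, y)
print(f"erosion ≈ {slope:.1f} * log10(T) + {intercept:.1f}  (r = {r_value:.2f})")
```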

Heritage ◽  
2021 ◽  
Vol 4 (3) ◽  
pp. 1660-1680
Author(s):  
Feras Hammami

This article explores the role that heritage might play in the representation of ‘difference’ within the context of neoliberal cities. The case is a large-scale urban change in the former working-class neighborhood of Gamlestaden, Sweden. Interviews and on-site observations revealed how authorized heritage practices can enable the celebration of particular social and cultural values while naturalizing the erasure of others. People’s cultural diversity, and diverging interpretations of the past, have been guided by the power of heritage into a process of subjectification in which only ‘unthreatening’ forms of cultural diversity were celebrated and deemed legitimate. The ‘fetishized’ difference and particular historical records have served to conceal the political interests at stake in its production and maintenance, and have led to a politicised representation of cultural diversity through what Annie Coombes terms a ‘scopic feast’. All this was made possible through BID, the first neoliberal business improvement district model in Sweden, and its investment in a deeply rooted process of heritageisation. Uncritical engagement with difference in the context of heritage management and neoliberal urban development makes it appear almost natural to erase the cultural values that fall outside the authorized narrative of value.


2021 ◽  
Vol 10 ◽  
pp. 5-8
Author(s):  
Lionel Kesztenbaum

Historical demography is inherently associated with constructing large-scale databases from historical records. Although there have been tremendous changes in the way they are constructed, many of the challenges remain. Throughout his career, Kees Mandemakers has been instrumental in facing some of these challenges, particularly those related to the conservation, standardization, and dissemination of databases. This short contribution discusses the evolution of large historical databases in historical demography.


2020 ◽  
Author(s):  
Lungwani Muungo

The purpose of this review is to evaluate progress in molecular epidemiology over the past 24 years in cancer etiology and prevention to draw lessons for future research incorporating the new generation of biomarkers. Molecular epidemiology was introduced in the study of cancer in the early 1980s, with the expectation that it would help overcome some major limitations of epidemiology and facilitate cancer prevention. The expectation was that biomarkers would improve exposure assessment, document early changes preceding disease, and identify subgroups in the population with greater susceptibility to cancer, thereby increasing the ability of epidemiologic studies to identify causes and elucidate mechanisms in carcinogenesis. The first generation of biomarkers has indeed contributed to our understanding of risk and susceptibility related largely to genotoxic carcinogens. Consequently, interventions and policy changes have been mounted to reduce risk from several important environmental carcinogens. Several new and promising biomarkers are now becoming available for epidemiologic studies, thanks to the development of high-throughput technologies and theoretical advances in biology. These include toxicogenomics, alterations in gene methylation and gene expression, proteomics, and metabonomics, which allow large-scale studies, including discovery-oriented as well as hypothesis-testing investigations. However, most of these newer biomarkers have not been adequately validated, and their role in the causal paradigm is not clear. There is a need for their systematic validation using principles and criteria established over the past several decades in molecular cancer epidemiology.


1987 ◽  
Vol 19 (5-6) ◽  
pp. 701-710 ◽  
Author(s):  
B. L. Reidy ◽  
G. W. Samson

A low-cost wastewater disposal system was commissioned in 1959 to treat domestic and industrial wastewaters generated in the Latrobe River valley in the province of Gippsland, within the State of Victoria, Australia (Figure 1). The Latrobe Valley is the centre for large-scale generation of electricity and for the production of pulp and paper. In addition, other industries have utilized the brown coal resource of the region, e.g. gasification and char production. Consequently, industrial wastewaters have dominated the disposal system for the past twenty-five years. The mixed industrial-domestic wastewaters were to be transported some eighty kilometres to be treated and disposed of by irrigation to land. Several important lessons have been learnt during twenty-five years of operating this system. Firstly, the composition of the mixed waste stream has varied significantly with the passage of time and the development of the industrial base in the Valley, so that what was appropriate treatment in 1959 is not necessarily acceptable in 1985. Secondly, the magnitude of the adverse environmental impacts engendered by this low-cost disposal procedure was not imagined when the proposal was implemented. As a consequence, clean-up procedures that could remedy the adverse effects of twenty-five years of impact are likely to be costly. The question may then be asked: when the total costs, including rehabilitation, are considered, is there really a low-cost solution for the environmentally safe disposal of complex wastewater streams?


2019 ◽  
Vol 19 (1) ◽  
pp. 4-16 ◽  
Author(s):  
Qihui Wu ◽  
Hanzhong Ke ◽  
Dongli Li ◽  
Qi Wang ◽  
Jiansong Fang ◽  
...  

Over the past decades, peptides as therapeutic candidates have received increasing attention in drug discovery, especially antimicrobial peptides (AMPs), anticancer peptides (ACPs) and anti-inflammatory peptides (AIPs). It is thought that peptides can regulate various complex diseases that were previously untreatable. In recent years, the critical problem of antimicrobial resistance has driven the pharmaceutical industry to look for new therapeutic agents. Compared to small-molecule drugs, peptide-based therapy exhibits high specificity and minimal toxicity. Thus, peptides are widely employed in the design and discovery of new potent drugs. Currently, large-scale screening of peptide activity with traditional approaches is costly, time-consuming and labor-intensive. Hence, in silico methods, mainly machine learning approaches, have been introduced to predict peptide activity on account of their accuracy and effectiveness. In this review, we document recent progress in machine learning-based prediction of peptide activity, which will be of great benefit to the discovery of potentially active AMPs, ACPs and AIPs.
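To make the in silico approach concrete, here is a minimal sketch of one common machine learning pipeline for peptide activity prediction: featurize each sequence by its amino-acid composition and train a generic classifier. The sequences, labels, and model choice are illustrative assumptions, not methods endorsed by the review.

```python
from collections import Counter
from sklearn.ensemble import RandomForestClassifier

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def aa_composition(seq: str) -> list[float]:
    """Fraction of each of the 20 standard amino acids in the sequence."""
    counts = Counter(seq)
    return [counts.get(aa, 0) / len(seq) for aa in AMINO_ACIDS]

# Toy training set: invented peptide sequences, 1 = active AMP, 0 = inactive.
peptides = ["KWKLFKKIEK", "GIGKFLHSAK", "AAAAGGGSSS", "PPPGGGAAAS"]
labels   = [1, 1, 0, 0]

X = [aa_composition(p) for p in peptides]
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)

# Score a new candidate sequence (also invented).
print(model.predict_proba([aa_composition("KWKKLLKKWL")]))
```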


Author(s):  
Jeasik Cho

This book provides the qualitative research community with some insight into how to evaluate the quality of qualitative research. This topic has gained little attention during the past few decades. We, qualitative researchers, read journal articles, serve on master’s and doctoral committees, and also make decisions on whether conference proposals, manuscripts, or large-scale grant proposals should be accepted or rejected. It is assumed that various perspectives or criteria, depending on various paradigms, theories, or fields of discipline, have been used in assessing the quality of qualitative research. Nonetheless, until now, no textbook has been specifically devoted to exploring the theories, practices, and reflections associated with the evaluation of qualitative research. This book constructs a typology for evaluating qualitative research, examines actual information from websites and qualitative journal editors, and reflects on some challenges currently encountered by the qualitative research community. Many different kinds of journals’ review guidelines and available assessment tools are collected and analyzed. Consequently, the core criteria that stand out among these evaluation tools are presented. Readers are invited to join the author to confidently proclaim: “Fortunately, there are commonly agreed, bold standards for evaluating the goodness of qualitative research in the academic research community. These standards are a part of what is generally called ‘scientific research.’ ”


2021 ◽  
Vol 4 (1) ◽  
Author(s):  
Mateusz Taszarek ◽  
John T. Allen ◽  
Mattia Marchio ◽  
Harold E. Brooks

Globally, thunderstorms are responsible for a significant fraction of rainfall, and in the mid-latitudes they often produce extreme weather, including large hail, tornadoes and damaging winds. Despite this importance, how the global frequency of thunderstorms and their accompanying hazards has changed over the past four decades remains unclear. Large-scale diagnostics applied to global climate models have suggested that the frequency and intensity of thunderstorms are likely to increase in the future. Here, we show that according to ERA5, convective available potential energy (CAPE) and convective precipitation (CP) have decreased over the tropics and subtropics, with simultaneous increases in 0–6 km wind shear (BS06). Conversely, rawinsonde observations paint a different picture across the mid-latitudes, with increasing CAPE and significant decreases in BS06. Differing trends and disagreement between ERA5 and rawinsondes over some regions suggest that the results should be interpreted with caution, especially for CAPE and CP across the tropics, where uncertainty is highest and reliable long-term rawinsonde observations are missing.
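For readers unfamiliar with the BS06 diagnostic, the sketch below computes 0–6 km bulk wind shear from a single sounding, following the standard definition (magnitude of the vector wind difference between 6 km AGL and the surface); the profile values are invented for illustration.

```python
import numpy as np

def bulk_shear_06km(u, v, z):
    """0-6 km bulk wind shear (m/s): magnitude of the vector wind
    difference between 6 km AGL and the surface, interpolated from
    a sounding profile. u, v in m/s; z in metres AGL, increasing."""
    u6 = np.interp(6000.0, z, u)
    v6 = np.interp(6000.0, z, v)
    return np.hypot(u6 - u[0], v6 - v[0])

# Illustrative sounding (values invented): heights and wind components.
z = np.array([0, 1000, 3000, 6000, 9000])   # m AGL
u = np.array([2.0, 6.0, 12.0, 20.0, 25.0])  # m/s
v = np.array([1.0, 3.0, 5.0, 8.0, 10.0])    # m/s
print(f"BS06 = {bulk_shear_06km(u, v, z):.1f} m/s")
```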


2021 ◽  
Vol 7 ◽  
pp. 237802312110201
Author(s):  
Thomas A. DiPrete ◽  
Brittany N. Fox-Williams

Social inequality is a central topic of research in the social sciences. Decades of research have deepened our understanding of the characteristics and causes of social inequality. At the same time, social inequality has markedly increased during the past 40 years, and progress on reducing poverty and improving the life chances of Americans in the bottom half of the distribution has been frustratingly slow. How useful has sociological research been to the task of reducing inequality? The authors analyze the stance taken by sociological research on the subject of reducing inequality. They identify an imbalance in the literature between the discipline’s continual efforts to motivate the plausibility of large-scale change and its lesser efforts to identify feasible strategies of change either through social policy or by enhancing individual and local agency with the potential to cumulate into meaningful progress on inequality reduction.


2021 ◽  
pp. 875529302199636
Author(s):  
Mertcan Geyin ◽  
Brett W Maurer ◽  
Brendon A Bradley ◽  
Russell A Green ◽  
Sjoerd van Ballegooy

Earthquakes occurring over the past decade in the Canterbury region of New Zealand have resulted in liquefaction case-history data of unprecedented quantity. This provides the profession with a unique opportunity to advance the prediction of liquefaction occurrence and consequences. Toward that end, this article presents a curated dataset containing ∼15,000 cone-penetration-test-based liquefaction case histories compiled from three earthquakes in Canterbury. The compiled, post-processed data are presented in a dense array structure, allowing researchers to easily access and analyze a wealth of information pertinent to free-field liquefaction response (i.e. triggering and surface manifestation). Research opportunities using these data include, but are not limited to, the training or testing of new and existing liquefaction-prediction models. The many methods used to obtain and process the case-history data are detailed herein, as is the structure of the compiled digital file. Finally, recommendations for analyzing the data are outlined, including nuances and limitations that users should carefully consider.
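As a sketch of the kind of model testing such a dataset enables, the snippet below evaluates a simple triggering rule against hypothetical case-history records; the field names and values are placeholders and do not reflect the published file's schema.

```python
import numpy as np

# Hypothetical structure: one row per case history, with a CPT-based
# demand/capacity summary (cyclic stress ratio CSR, cyclic resistance
# ratio CRR) and an observed manifestation label. Values invented.
cases = np.array(
    [(0.25, 0.30, 1), (0.15, 0.35, 0), (0.40, 0.28, 1), (0.10, 0.45, 0)],
    dtype=[("CSR", "f8"), ("CRR", "f8"), ("liquefied", "i4")],
)

# Evaluate a simple triggering rule (predict liquefaction when CSR > CRR)
# against the observed outcomes.
predicted = cases["CSR"] > cases["CRR"]
accuracy = np.mean(predicted == cases["liquefied"].astype(bool))
print(f"Triggering-rule accuracy on {len(cases)} cases: {accuracy:.2f}")
```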

