Spatial structure of grassland patches in Poland: implications for nature conservation

2019 ◽  
Vol 88 (1) ◽  
Author(s):  
Tomasz Henryk Szymura ◽  
Magdalena Szymura

Grasslands provide a wide range of ecosystem services; however, their area and quality are still diminishing in Europe. Nowadays, they often form isolated patches within a “sea” of other habitats. We examined basic structural landscape metrics of grasslands in Poland using the CORINE land cover database. We calculated characteristics for all individual patches as well as average values for a 10 × 10-km grid covering Poland. We also assessed the percentage of grasslands within protected areas and ecological corridors. We found that rather small patches (0.3–1 km<sup>2</sup>) dominate in Poland, usually located 200–500 m from each other. The grasslands had a clumped distribution; thus, there are large areas in Poland where grassland patches are separated by kilometers. Almost all indices calculated for the 10 × 10-km grid cells were correlated, i.e., in regions with a high percentage of grasslands, the patches were large, more numerous, placed close to each other, and had more irregular shapes. Our results revealed that the percentage of grasslands within protected areas and ecological corridors did not differ from the average value for Poland. On the other hand, forests were significantly over-represented in protected areas and ecological corridors. These findings suggest that there is no planned scheme for grassland protection at the landscape scale in Poland. Developing such a scheme is urgent and requires high-quality data on the distribution of seminatural grassland patches. In practice, nature conservationists and managers should consider spatial processes in their plans in order to maintain grassland biodiversity.
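The patch-level metrics described above (e.g., mean patch area and the nearest-neighbour distance between patches) can be illustrated with a minimal sketch. The patch coordinates and areas below are made up for illustration; they are not taken from the CORINE data used in the study:

```python
import math

# Hypothetical grassland patches: (x_km, y_km) centroid and area in km^2.
patches = [
    (0.0, 0.0, 0.4),
    (0.3, 0.4, 0.8),
    (5.0, 5.0, 0.5),
    (5.2, 5.3, 1.0),
]

def mean_patch_area(patches):
    """Average patch area (km^2)."""
    return sum(a for _, _, a in patches) / len(patches)

def nearest_neighbour_distances(patches):
    """Euclidean distance (km) from each patch centroid to its nearest neighbour."""
    dists = []
    for i, (xi, yi, _) in enumerate(patches):
        d = min(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj, _) in enumerate(patches) if j != i)
        dists.append(d)
    return dists
```

With these toy patches, the two clusters around (0, 0) and (5, 5) reproduce the "clumped" pattern the study describes: short within-cluster distances but kilometre-scale gaps between clusters.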

2021 ◽  
pp. 1-62
Author(s):  
Rozenn Gazan ◽  
Florent Vieux ◽  
Ségolène Mora ◽  
Sabrina Havard ◽  
Carine Dubuisson

Abstract Objective: To describe existing online 24-hour dietary recall (24hDR) tools in terms of functionalities and ability to tackle challenges encountered during national dietary surveys, such as maximizing response rates and collecting high-quality data from a representative sample of the population, while minimizing cost and response burden. Design: A search (from 2000 to 2019) was conducted in the peer-reviewed and grey literature. For each tool, information on functionalities, validation and user usability studies, and potential adaptability for integration into a new context was collected. Setting: Not country-specific. Participants: General population. Results: Eighteen online 24hDR tools were identified. Most were developed in Europe, for children ≥10 years old and/or for adults. Eight followed the five multiple-pass steps, but used various methodologies and features. Almost all tools (except three) validated their nutrient intake estimates, but with high heterogeneity in methodologies. User usability was not always assessed, and rarely with real-time methods. For researchers, eight tools provided a web platform to manage the survey, and five appeared to be easily adaptable to a new context. Conclusions: Among the eighteen online 24hDR tools identified, the best candidates for use in national dietary surveys are those that were validated for their intake estimates, had confirmed user and researcher usability, and seemed sufficiently flexible to be adapted to new contexts. Regardless of the tool, adaptation to another context will still require time and funding, and this is probably the most challenging step.


2019 ◽  
Vol 3 (Supplement_1) ◽  
pp. S757-S757
Author(s):  
Charlotte L Eost-Telling ◽  
Paul Kingston ◽  
Louise Taylor ◽  
Jan Bailey

Abstract The Mass Observation Project, established in 1937, documents the lives of ordinary people living in the UK and explores a wide range of social issues. The Project distributes a set of written questions (“Directives”) to a panel of 500 members of the British public (“Observers”) three times each year; “Observers” respond in writing. From the initial commissioning of a “Directive” to data becoming available for analysis takes four to six months. This approach offers researchers an opportunity to capture in-depth qualitative data from individuals with a range of demographic backgrounds who live across the UK. As there are no word limits on “Observers’” responses and respondents remain anonymous, a “Directive” often yields rich, high-quality data. Additionally, compared with alternative methods of collecting large volumes of qualitative data from a heterogeneous population, commissioning a “Directive” is cost-effective in terms of time and resources.


2016 ◽  
Vol 6 (5) ◽  
pp. 1115-1118
Author(s):  
F. Mavromatakis ◽  
Y. Franghiadakis ◽  
F. Vignola

A robust and reliable model describing the power produced by a photovoltaic system is needed in order to detect module failures, inverter malfunction, shadowing effects and other factors that may result in energy losses. In addition, a reliable model enables an investor to make accurate estimates of the system's energy production, payback times, etc. The model utilizes the global irradiance reaching the plane of the photovoltaic modules, since in almost all photovoltaic (PV) facilities the beam and diffuse solar irradiances are not recorded. The airmass, the angle of incidence and the efficiency drop at low solar irradiance are taken into account. Currently, the model is validated using high-quality data available from the National Renewable Energy Laboratory (USA). The data were acquired with IV tracers while the meteorological conditions were also recorded. Several modules of different technologies were deployed, but here we present results from a single crystalline module. The performance of the model is acceptable at a level of 5% despite the assumptions made. The dependence of the residuals upon solar irradiance, temperature, airmass and angle of incidence is also explored, and future work is described.
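The abstract does not give the model's equations, but the simplest member of this class of PV performance models scales the rated power by plane-of-array irradiance and corrects for cell temperature (the full model described above adds airmass, angle-of-incidence and low-irradiance corrections). The rated power and temperature coefficient below are assumed values for illustration only:

```python
def pv_power(g_poa, t_cell, p_stc=300.0, gamma=-0.004):
    """Minimal PV power sketch: scale rated power by plane-of-array
    irradiance and apply a linear cell-temperature correction.

    g_poa:  global irradiance on the module plane (W/m^2)
    t_cell: cell temperature (deg C)
    p_stc:  assumed rated power at standard test conditions
            (1000 W/m^2, 25 deg C), in watts
    gamma:  assumed temperature coefficient of power (1/K)
    """
    return p_stc * (g_poa / 1000.0) * (1.0 + gamma * (t_cell - 25.0))
```

For example, at 500 W/m² and a cell temperature of 45 °C, this hypothetical 300 W module would deliver 300 × 0.5 × (1 − 0.004 × 20) = 138 W.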


Author(s):  
Arefa Shafique Shaikh

In the coming years, sensors will likely pervade every aspect of our lives. Several studies explain how the Internet of Things (IoT) will have an impact on almost all aspects of our lives and why security is at the top of the list of IoT challenges. Device-to-device (D2D) communication in IoT is forecast to grow, and a major concern in the use of IoT is ensuring device security, D2D connectivity and high-quality data. Therefore, a proper communication protocol is required to address these issues. To this end, we propose the use of the Message Queuing Telemetry Transport (MQTT) protocol to transfer data between devices, as it is more secure. MQTT is a publish/subscribe messaging protocol that works on top of the TCP/IP protocol. Its key features are its light weight, flexible authentication and bandwidth efficiency. The result of this study is the secure transfer of high-quality data using the MQTT protocol.
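One concrete piece of MQTT's publish/subscribe design is topic filtering: subscribers use filters with the `+` (single-level) and `#` (multi-level) wildcards defined by the MQTT specification to select which published messages they receive. The helper below is an illustrative re-implementation of that matching rule, not code from the study:

```python
def topic_matches(filter_, topic):
    """Check whether an MQTT topic matches a subscription filter.

    Per the MQTT spec: topic levels are separated by '/', '+' matches
    exactly one level, and '#' matches any number of remaining levels
    (and must be the last level of the filter).
    """
    f_levels = filter_.split('/')
    t_levels = topic.split('/')
    for i, f in enumerate(f_levels):
        if f == '#':
            return i == len(f_levels) - 1   # '#' is only valid as the last level
        if i >= len(t_levels):
            return False                     # filter is longer than the topic
        if f != '+' and f != t_levels[i]:
            return False                     # literal level mismatch
    return len(f_levels) == len(t_levels)
```

So a device publishing to `sensors/room1/temp` would reach subscribers of `sensors/+/temp` and `sensors/#`, but not of `sensors/+` (the `+` wildcard spans only one level).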


2021 ◽  
Vol 22 (1) ◽  
Author(s):  
Nicolas Scalzitti ◽  
Arnaud Kress ◽  
Romain Orhand ◽  
Thomas Weber ◽  
Luc Moulinier ◽  
...  

Abstract Background Ab initio prediction of splice sites is an essential step in eukaryotic genome annotation. Recent predictors have exploited Deep Learning algorithms and reliable gene structures from model organisms. However, Deep Learning methods for non-model organisms are lacking. Results We developed Spliceator to predict splice sites in a wide range of species, including model and non-model organisms. Spliceator uses a convolutional neural network and is trained on carefully validated data from over 100 organisms. We show that Spliceator achieves consistently high accuracy (89–92%) compared to existing methods on independent benchmarks from human, fish, fly, worm, plant and protist organisms. Conclusions Spliceator is a new Deep Learning method trained on high-quality data, which can be used to predict splice sites in diverse organisms, ranging from human to protists, with consistently high accuracy.
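Splice-site predictors like the convolutional network described above typically take fixed-length sequence windows around a candidate site, encoded as one-hot vectors over the four bases. The function below is a generic sketch of that input encoding, not Spliceator's actual preprocessing code:

```python
def one_hot_dna(seq):
    """One-hot encode a DNA sequence into a list of [A, C, G, T] rows,
    the typical input representation for a splice-site CNN.
    Unknown bases (e.g. 'N') become all-zero rows.
    """
    mapping = {'A': 0, 'C': 1, 'G': 2, 'T': 3}
    rows = []
    for base in seq.upper():
        row = [0.0] * 4
        if base in mapping:
            row[mapping[base]] = 1.0
        rows.append(row)
    return rows
```

A window of length L thus becomes an L × 4 matrix, which a 1-D convolutional layer can scan for donor/acceptor motifs.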


2019 ◽  
Vol 116 (14) ◽  
pp. 6531-6539 ◽  
Author(s):  
Morgan R. Frank ◽  
David Autor ◽  
James E. Bessen ◽  
Erik Brynjolfsson ◽  
Manuel Cebrian ◽  
...  

Rapid advances in artificial intelligence (AI) and automation technologies have the potential to significantly disrupt labor markets. While AI and automation can augment the productivity of some workers, they can replace the work done by others and will likely transform almost all occupations at least to some degree. Rising automation is happening in a period of growing economic inequality, raising fears of mass technological unemployment and a renewed call for policy efforts to address the consequences of technological change. In this paper we discuss the barriers that inhibit scientists from measuring the effects of AI and automation on the future of work. These barriers include the lack of high-quality data about the nature of work (e.g., the dynamic requirements of occupations), lack of empirically informed models of key microlevel processes (e.g., skill substitution and human–machine complementarity), and insufficient understanding of how cognitive technologies interact with broader economic dynamics and institutional mechanisms (e.g., urban migration and international trade policy). Overcoming these barriers requires improvements in the longitudinal and spatial resolution of data, as well as refinements to data on workplace skills. These improvements will enable multidisciplinary research to quantitatively monitor and predict the complex evolution of work in tandem with technological progress. Finally, given the fundamental uncertainty in predicting technological change, we recommend developing a decision framework that focuses on resilience to unexpected scenarios in addition to general equilibrium behavior.


2020 ◽  
Author(s):  
Kevin Jooss ◽  
John P. McGee ◽  
Rafael D. Melani ◽  
Neil L. Kelleher

Abstract Native mass spectrometry (nMS) is a rapidly growing method for the characterization of large proteins and protein complexes, preserving “native” non-covalent inter- and intramolecular interactions. Direct infusion of purified analytes into a mass spectrometer represents the standard approach for conducting nMS experiments. Alternatively, capillary zone electrophoresis (CZE) can be performed under native conditions, providing high separation performance while consuming trace amounts of sample material. Here, we provide standard operating procedures for acquiring high-quality data using CZE in native mode coupled online to various Orbitrap mass spectrometers via a commercial sheathless interface, covering a wide range of analytes from 30–800 kDa. Using a standard protein mix, the influence of various CZE method parameters was evaluated, such as BGE/conductive liquid composition and separation voltage. Additionally, a universal approach for optimizing fragmentation settings in the context of protein subunit and metalloenzyme characterization is discussed in detail for model analytes. A short section is dedicated to troubleshooting the nCZE-MS setup. This study aims to help standardize nCZE-MS practices, strengthen the CE community, and provide a resource for the production of reproducible, high-quality data.


2020 ◽  
Author(s):  
James McDonagh ◽  
William Swope ◽  
Richard L. Anderson ◽  
Michael Johnston ◽  
David J. Bray

Digitization offers significant opportunities for the formulated product industry to transform the way it works and to develop new ways of doing business. R&D is one area of operation where taking advantage of these technologies is challenging, due to its high level of domain specialisation and creativity, but the benefits could be significant. Recent developments in base-level technologies such as artificial intelligence (AI)/machine learning (ML), robotics and high performance computing (HPC), to name a few, present disruptive and transformative technologies which could offer new insights, discovery methods and enhanced chemical control when combined in a digital ecosystem of connectivity, distributed services and decentralisation. At the fundamental level, research on these technologies has shown that new physical and chemical insights can be gained, which in turn can augment experimental R&D approaches through physics-based chemical simulation, data-driven models and hybrid approaches. In all of these cases, high-quality data are required to build and validate models, in addition to the skills and expertise to exploit such methods. In this article we give an overview of some of the digital technology demonstrators we have developed for formulated product R&D, and we discuss the challenges in building and deploying these demonstrators.


2020 ◽  
Vol 4 (1) ◽  
pp. 14-28
Author(s):  
S. K. Gaikwad ◽  
N. D. Pathan ◽  
N. S. Bansode ◽  
S. P. Gaikwad ◽  
Y. P. Badhe ◽  
...  

To study the chemistry of major ions in groundwater from the Vel (Velu) River basin, sixty (60) samples from dug wells and bore wells were collected and analyzed using standard techniques given by APHA. The results show an order of dominance for cations of Na+ > Ca2+ > Mg2+ > K+ and for anions of HCO3- > Cl- > SO42-. The pH of the groundwater is slightly alkaline (range: pH 7.0–8.1), while the average electrical conductivity (EC) is about 2641 µS/cm, indicating high mineralization. In general, the cationic concentrations (Na+, K+, Ca2+ and Mg2+) of the groundwater increase downstream (from northwest to southeast), suggesting geological control on groundwater composition; the highest concentrations, in the lower part of the basin, are generally associated with high salinity. Among the major anions, bicarbonate (HCO3-) is highest due to rock-water interaction. The average chloride value is about 235 mg/L, attributable to discharge zones along with anthropogenic activities. The geochemical data plotted on a Piper trilinear diagram show a dominant hydrochemical facies of Ca2+ + Mg2+, Na+ + K+, Cl- + SO42- - HCO3- in 83.3% of samples, indicating that the alkaline earths exceed the alkalis and the strong acids exceed the weak acids. The pH, Total Hardness (TH) and magnesium (Mg2+) values show a notable proportion of samples above the desirable limit; otherwise, the groundwater quality is good for drinking. The irrigation indices SAR, KR and SSP were used to evaluate groundwater suitability for irrigation. By the SAR parameter, all samples are excellent to good for irrigation. By SSP, 33.3% of samples are within the permissible limit, while 66.6% are doubtful for irrigation. By KR, almost all samples (excluding four in the lower part of the basin) are suitable for irrigation. Thus, variations in climate and geology, together with anthropogenic activities, are modifying the groundwater geochemistry of the Vel River basin.
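The irrigation indices mentioned above have standard definitions, with all ion concentrations expressed in meq/L: Sodium Adsorption Ratio SAR = Na⁺ / √((Ca²⁺ + Mg²⁺)/2), Soluble Sodium Percentage SSP = 100 · (Na⁺ + K⁺) / (Na⁺ + K⁺ + Ca²⁺ + Mg²⁺), and Kelly's Ratio KR = Na⁺ / (Ca²⁺ + Mg²⁺). A minimal sketch of these formulas (any numeric inputs are hypothetical, not the study's samples):

```python
import math

def sar(na, ca, mg):
    """Sodium Adsorption Ratio; all inputs in meq/L."""
    return na / math.sqrt((ca + mg) / 2.0)

def ssp(na, k, ca, mg):
    """Soluble Sodium Percentage; all inputs in meq/L."""
    return 100.0 * (na + k) / (na + k + ca + mg)

def kr(na, ca, mg):
    """Kelly's Ratio; all inputs in meq/L."""
    return na / (ca + mg)
```

For example, a hypothetical sample with Na⁺ = 4, K⁺ = 1, Ca²⁺ = 3 and Mg²⁺ = 2 meq/L gives SSP = 50%, on the boundary of the permissible class discussed above.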


Author(s):  
Rami Obeid ◽  
Elias Wehbe ◽  
Mohamad Rima ◽  
Mohammad Kabara ◽  
Romeo Al Bersaoui ◽  
...  

Background: Tobacco mosaic virus (TMV) is the best-known virus in the plant mosaic virus family and is able to infect a wide range of crops, particularly tobacco, causing production losses. Objectives: Herein, and for the first time in Lebanon, we investigated the presence of TMV infection in crops by analyzing 88 samples of tobacco, tomato, cucumber and pepper collected from different regions in North Lebanon. Methods: Double-antibody sandwich enzyme-linked immunosorbent assay (DAS-ELISA) revealed a potential TMV infection in four tobacco samples out of the 88 crop samples collected; no tomato, cucumber or pepper samples were infected. The TMV+ tobacco samples were then extensively analyzed by RT-PCR to detect viral RNA, using different primers covering the whole viral genome. Results and Discussion: PCR results confirmed those of DAS-ELISA, showing TMV infection in four tobacco samples collected from three crop fields of North Lebanon. In only one of the four TMV+ samples were we able to amplify almost all regions of the viral genome, suggesting possible mutations in the virus genome or an infection with a new, not yet identified, TMV strain. Conclusion: Our study is the first in Lebanon to reveal TMV infection in crop fields, highlighting the danger that may affect the future of agriculture.

