Age of peat-based lupin and chickpea inoculants in relation to quality and efficacy

2005 ◽  
Vol 45 (3) ◽  
pp. 183 ◽  
Author(s):  
E. J. Hartley ◽  
L. G. Gemell ◽  
J. F. Slattery ◽  
J. G. Howieson ◽  
D. F. Herridge

Extension of the current 12-month expiry of rhizobial inoculants in Australia to 18 months would have commercial benefits for manufacturers and resellers. The dilemma, however, is that numbers of rhizobia in the inoculants decline over time and individual cells may lose efficacy. This study examined the effect of lupin and chickpea inoculant age (0, 6, 12, 15 and 18 months) on numbers of rhizobia, rhizobial cell characteristics and efficacy. Assessments of cell characteristics and efficacy included colony size on plates, survival on inoculated beads, and infectiveness and effectiveness in field experiments at 3 sites. Assessment of commercially produced inoculants at the Australian Legume Inoculants Research Unit (ALIRU) laboratory indicated that, on average, chickpea and lupin inoculants had counts of about log10 9.6 when fresh, delivering >log10 6 rhizobia/seed. At 12 months, the average counts had fallen to log10 9.4, delivering slightly less than log10 6 rhizobia/seed. By 18 months, average counts were log10 9.3, delivering log10 5.9 rhizobia/seed. The lines of best fit indicated decline rates of 0.0005 log10 units/day. We found no evidence that the rhizobia in the older inoculants (i.e. >12 months old) had lost any ability to grow on nutrient agar, survive on inoculated beads, or nodulate and fix nitrogen with the host plant. While the chickpea and lupin inoculants produced currently in Australia are as efficacious after 18 months of storage at 4°C as when fresh, the efficacy of other inoculant types may fall below acceptable levels at <12 months.
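The reported linear decline rate can be cross-checked against the quoted counts; a minimal sketch using only the figures given in the abstract:

```python
def log10_count(age_days, fresh=9.6, decline_per_day=0.0005):
    """Predicted log10 rhizobia count of a peat inoculant, assuming the
    linear decline rate of 0.0005 log10 units/day reported above."""
    return fresh - decline_per_day * age_days

# 12 months (~365 days) and 18 months (~548 days) of storage
print(round(log10_count(365), 1))  # 9.4, matching the 12-month average
print(round(log10_count(548), 1))  # 9.3, matching the 18-month average
```

The predictions reproduce the 12- and 18-month averages, confirming the internal consistency of the reported decline rate.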

1992 ◽  
Vol 23 (1) ◽  
pp. 13-26 ◽  
Author(s):  
W. H. Hendershot ◽  
L. Mendes ◽  
H. Lalande ◽  
F. Courchesne ◽  
S. Savoie

To determine how water flowpath controls stream chemistry, we studied both soil and stream water during the 1985 spring snowmelt. Soil solution concentrations of base cations were relatively constant over time, indicating that cation exchange was controlling cation concentrations. Similarly, SO4 adsorption-desorption or precipitation-dissolution reactions with the matrix were controlling SO4 concentrations. On the other hand, NO3 appeared to be controlled by uptake by plants or microorganisms or by denitrification, since its concentration in the soil fell abruptly as snowmelt proceeded. Dissolved Al and pH varied vertically in the soil profile, and their pattern in the stream clearly indicated the importance of water flowpath to stream chemistry. Although Al increased as pH decreased, the relationship does not appear to be controlled by gibbsite. The best fit of calculated dissolved inorganic Al was obtained using AlOHSO4 with a solubility less than that of pure crystalline jurbanite.
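The gibbsite test mentioned above compares measured Al with the activity predicted from pH via the mineral's solubility equilibrium. A sketch of that prediction, using a commonly cited gibbsite solubility constant (log *Ks ≈ 8.1 is an assumed literature value, not taken from this paper):

```python
def log_al_gibbsite(pH, log_ks=8.1):
    """Al(OH)3(s) + 3H+ <=> Al3+ + 3H2O, so at equilibrium
    log[Al3+] = log*Ks - 3*pH.
    log*Ks ~ 8.1 is an assumed literature value for gibbsite."""
    return log_ks - 3 * pH

# A one-unit pH drop raises the predicted Al3+ activity 1000-fold;
# this steep pH dependence is what gets tested against stream data.
print(round(log_al_gibbsite(5.0), 2))  # -6.9
print(round(log_al_gibbsite(4.0), 2))  # -3.9
```

When measured Al falls off this line, a different solid phase (here, an AlOHSO4 phase akin to jurbanite) fits better.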


BMC Biology ◽  
2021 ◽  
Vol 19 (1) ◽  
Author(s):  
Elise J. Gay ◽  
Jessica L. Soyer ◽  
Nicolas Lapalu ◽  
Juliette Linglin ◽  
Isabelle Fudal ◽  
...  

Abstract Background The fungus Leptosphaeria maculans has an exceptionally long and complex relationship with its host plant, Brassica napus, during which it switches between different lifestyles, including asymptomatic, biotrophic, necrotrophic, and saprotrophic stages. The fungus is also exemplary of “two-speed” genome organisms, whose genomes alternate between gene-rich and repeat-rich regions. Except for a few stages of plant infection under controlled conditions, nothing is known about the genes mobilized by the fungus throughout its life cycle, which may last several years in the field. Results We performed RNA-seq on samples corresponding to all stages of the interaction of L. maculans with its host plant, either alive or dead (stem residues after harvest), in controlled conditions or in field experiments under natural inoculum pressure, over periods ranging from a few days to months or years. A total of 102 biological samples corresponding to 37 sets of conditions were analyzed. We show here that about 9% of the genes of this fungus are highly expressed during its interactions with its host plant. These genes are distributed into eight well-defined expression clusters, corresponding to specific infection lifestyles or to tissue-specific genes. All expression clusters are enriched in effector genes, and one cluster is specific to the saprophytic lifestyle on plant residues. One cluster, including genes known to be involved in the first phase of asymptomatic fungal growth in leaves, is re-used at each asymptomatic growth stage, regardless of the type of organ infected. The expression of the genes of this cluster is repeatedly turned on and off during infection. Whatever their expression profile, the genes of these clusters are enriched in heterochromatin regions associated with H3K9me3 or H3K27me3 repressive marks. 
These findings support the hypothesis that some of the fungal genes involved in niche adaptation are located in heterochromatic regions of the genome, conferring extreme plasticity of expression. Conclusion This work opens up new avenues for plant disease control, by identifying stage-specific effectors that could be used as targets for the identification of novel durable disease resistance genes, and for the in-depth analysis of chromatin remodeling during plant infection, which could be manipulated to interfere with the global expression of effector genes at crucial stages of plant infection.
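Grouping genes into expression clusters of this kind is typically done by clustering normalized expression profiles across conditions. A toy sketch with a minimal k-means (synthetic data and deterministic initialization, for illustration only; the paper's actual clustering method is not specified here):

```python
import numpy as np

# Toy matrix: 6 genes x 4 infection stages, with two obvious profiles
expr = np.array([
    [9, 1, 1, 1], [8, 2, 1, 1], [9, 1, 2, 1],   # "early-stage" genes
    [1, 1, 8, 9], [1, 2, 9, 8], [2, 1, 9, 9],   # "late-stage" genes
], dtype=float)

def kmeans(X, k, iters=20):
    """Minimal k-means; initial centers are evenly spaced rows
    (deterministic, adequate for this toy example)."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    for _ in range(iters):
        # Assign each gene to its nearest center, then update centers
        labels = ((X[:, None] - centers) ** 2).sum(-1).argmin(1)
        centers = np.array([X[labels == c].mean(0) for c in range(k)])
    return labels

print(kmeans(expr, k=2))  # genes with the same stage profile share a cluster
```

On real RNA-seq data the profiles would first be normalized (e.g. per-gene z-scores) and k chosen by a model-selection criterion.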


Author(s):  
Joel Hellewell ◽  
Timothy W. Russell ◽  
Rupert Beale ◽  
Gavin Kelly ◽  
Catherine Houlihan ◽  
...  

Abstract Background Routine asymptomatic testing using RT-PCR of people who interact with vulnerable populations, such as medical staff in hospitals or care workers in care homes, has been employed to help prevent outbreaks among vulnerable populations. Although the peak sensitivity of RT-PCR can be high, the probability of detecting an infection varies throughout the course of an infection. The effectiveness of routine asymptomatic testing will therefore depend on testing frequency and on how PCR detection varies over time. Methods We fitted a Bayesian statistical model to a dataset of twice-weekly PCR tests of UK healthcare workers performed by self-administered nasopharyngeal swab, regardless of symptoms. We jointly estimated times of infection and the probability of a positive PCR test over time following infection; we then compared asymptomatic testing strategies by calculating the probability that a symptomatic infection is detected before symptom onset and the probability that an asymptomatic infection is detected within 7 days of infection. Findings We estimated that the probability that the PCR test detected infection peaked at 77% (54-88%) 4 days after infection, decreasing to 50% (38-65%) by 10 days after infection. Our results suggest a substantially higher probability of detecting infections 1-3 days after infection than previously published estimates. We estimated that testing every other day would detect 57% (33-76%) of symptomatic cases prior to onset and 94% (75-99%) of asymptomatic cases within 7 days, if test results were returned within a day. Interpretation Our results suggest that routine asymptomatic testing can enable detection of a high proportion of infected individuals early in their infection, provided that testing is frequent and the time from testing to notification of results is sufficiently short. Funding Wellcome Trust, National Institute for Health Research (NIHR) Health Protection Research Unit, Medical Research Council (UKRI)
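How a testing strategy's yield depends on the sensitivity-over-time curve can be made concrete: if tests on days t after infection each have detection probability p_t, the chance that at least one is positive is 1 − Π(1 − p_t). A sketch with an illustrative curve (the peak of 77% at day 4 and ~50% at day 10 follow the reported estimates; the other daily values are invented for illustration and are not the paper's fitted curve):

```python
# Illustrative daily PCR sensitivity by day since infection
sensitivity = {1: 0.10, 2: 0.35, 3: 0.65, 4: 0.77, 5: 0.75,
               6: 0.70, 7: 0.64, 8: 0.59, 9: 0.54, 10: 0.50}

def p_detected(test_days):
    """P(at least one positive test) = 1 - prod over tests of (1 - p_t)."""
    miss = 1.0
    for day in test_days:
        miss *= 1.0 - sensitivity.get(day, 0.0)
    return 1.0 - miss

# Testing every other day vs. a single test, within 7 days of infection
print(round(p_detected([2, 4, 6]), 3))  # 0.955
print(round(p_detected([7]), 3))        # 0.64
```

Even with modest per-test sensitivity, frequent testing compounds into a high overall detection probability, which is the mechanism behind the reported 94% figure.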


2021 ◽  
Vol 42 (2) ◽  
pp. 95-98
Author(s):  
Anthony Olson

In this essay, I explain how I switched the lens of my sophomore research unit to one that focuses on rural issues. This essay follows the unit from beginning to end. I explain what I do to raise awareness through the use of daily articles, along with providing models for students' own research. The essay then details the writing portion and how it has changed over time. The essay ends with a reflection on my work and choices. 


2007 ◽  
Vol 38 (7) ◽  
pp. 1001-1011 ◽  
Author(s):  
K. S. Kendler ◽  
K. Jacobson ◽  
J. M. Myers ◽  
L. J. Eaves

Background Conduct disorder (CD) and peer deviance (PD) both powerfully predict future externalizing behaviors. Although levels of CD and PD are strongly correlated, the causal relationship between them has remained controversial and has not been examined by a genetically informative study. Method Levels of CD and PD were assessed in 746 adult male–male twin pairs at personal interview for ages 8–11, 12–14 and 15–17 years using a life history calendar. Model fitting was performed using the Mx program. Results The best-fit model indicated an active developmental relationship between CD and PD including forward transmission of both traits over time and strong causal relationships between CD and PD within time periods. The best-fit model indicated that the causal relationship for genetic risk factors was from CD to PD and was constant over time. For common environmental factors, the causal pathways ran from PD to CD and were stronger in earlier than later age periods. Conclusion A genetically informative model revealed causal pathways difficult to elucidate by other methods. Genes influence risk for CD, which, through social selection, impacts on the deviance of peers. Shared environment, through family and community processes, encourages or discourages adolescent deviant behavior, which, via social influence, alters risk for CD. Social influence is more important than social selection in childhood, but by late adolescence social selection becomes predominant. These findings have implications for prevention efforts for CD and associated externalizing disorders.
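The forward-and-cross transmission structure described, with each trait at one age period predicting both traits at the next, is a cross-lagged model. A minimal sketch of its deterministic part, with invented path coefficients (not the fitted Mx estimates):

```python
def step(cd, pd, a_cd=0.6, a_pd=0.5, cd_to_pd=0.3, pd_to_cd=0.2):
    """One age-period transition of a cross-lagged model:
    each trait persists (a_*) and also influences the other (*_to_*).
    All coefficients here are illustrative, not estimated values."""
    cd_next = a_cd * cd + pd_to_cd * pd
    pd_next = a_pd * pd + cd_to_pd * cd
    return cd_next, pd_next

cd, pd = 1.0, 0.0   # start: elevated CD, average peer deviance
for age in ("8-11", "12-14", "15-17"):
    cd, pd = step(cd, pd)
    print(age, round(cd, 3), round(pd, 3))
```

Even this toy version shows the key qualitative feature the twin design disentangles: an initial elevation in one trait propagates into the other across age periods through the cross paths.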


2011 ◽  
Vol 47 (3) ◽  
pp. 509-527 ◽  
Author(s):  
A. S. KARUNARATNE ◽  
S. N. AZAM-ALI ◽  
G. IZZI ◽  
P. STEDUTO

SUMMARY Simulation of yield response to water plays an increasingly important role in optimizing crop water productivity (WP), especially under the drought conditions prevalent in Africa. The present study focuses on a representative crop: bambara groundnut (Vigna subterranea), an ancient grain legume grown, cooked, processed and traded mainly by subsistence women farmers in sub-Saharan Africa. Over four years (2002, 2006–2008), glasshouse experiments were conducted at the Tropical Crops Research Unit, University of Nottingham, UK under controlled environments with different landraces, temperatures (23 ± 5 °C, 28 ± 5 °C, 33 ± 5 °C) and soil moisture regimes (irrigated, early drought, late drought). In parallel, field experiments were conducted in Swaziland (2002/2003) and Botswana (2007/2008). Crop measurements of canopy cover (CC), biomass (B) and pod yield (Y) from selected glasshouse (2006 and 2007) and field (Botswana) experiments were used to calibrate the FAO AquaCrop model. Subsequently, the model was validated against independent data sets from the glasshouse (2002 and 2008) and field (Swaziland) for different landraces. AquaCrop simulations of CC, B and Y for the different bambara groundnut landraces are in good agreement with observed data (R² = 0.88 for CC, 0.78 for B and 0.72 for Y), but with significant underestimation for some landraces.
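The goodness-of-fit statistic quoted is the coefficient of determination between simulated and observed series. A stdlib-only sketch on synthetic canopy-cover data (the values below are invented for illustration, not the study's measurements):

```python
def r_squared(observed, simulated):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - ss_res / ss_tot

# Synthetic canopy-cover (%) time series, observed vs. simulated
obs = [10, 30, 55, 80, 90, 85]
sim = [12, 28, 50, 78, 92, 80]
print(round(r_squared(obs, sim), 3))  # 0.988
```

A value near 1 means the simulation tracks the observed dynamics closely; systematic underestimation, as reported for some landraces, would lower R² and bias the residuals in one direction.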


2015 ◽  
Vol 105 (5) ◽  
pp. 452-456 ◽  
Author(s):  
Steven Blader ◽  
Claudine Gartenberg ◽  
Rebecca Henderson ◽  
Andrea Prat

Does the “soft side” of management matter? Many managers assert that “firm culture” is strongly correlated with productivity, but there are few robust tests of this assertion. In a set of field experiments, we study driver productivity within a large US logistics company that is arguably transitioning from one relational contract to another, while leaving formal practices and incentives unchanged. We find that sites under the new contract are associated with 1/8 percent higher productivity. Our findings suggest that relational contracts have a first-order effect on productivity and that they can be altered over time.


2019 ◽  
Author(s):  
Daniel Rasche ◽  
Christian Reinhardt-Imjela ◽  
Achim Schulte ◽  
Robert Wenzel

Abstract. Fifteen years after the introduction of the European Union's Water Framework Directive (WFD), most German surface water bodies are still far from the targeted good ecological status or potential. One reason is insufficient hydromorphological diversity, including riverbed structure and the absence of natural woody debris in the channels. The presence of large woody debris (LWD) in river channels can improve the hydromorphological and hydraulic characteristics of rivers and streams and therefore benefit a river’s ecology. On the other hand, floating LWD is a potential threat to anthropogenic goods and infrastructure during flood events. Given this tension between potential risks and positive ecological impacts, understanding the physical effects of large woody debris is highly important, for example to identify river sections in which large woody debris can remain or be reintroduced. Hydrodynamic models offer the possibility of investigating the hydraulic effects of anchored large woody debris. In such models, roughness coefficients are commonly used to represent LWD; however, given the complex shapes of LWD elements, this approach may be too simple to capture their diverse effects, especially on flood hydrographs. Against this background, a two-dimensional hydraulic model is set up for a mountain creek to simulate the hydraulic effects of LWD and to test different methods of LWD implementation. The study area comprises a 282 m long reach of the Ullersdorfer Teichbächel, a creek in the Ore Mountains (south-eastern Germany). In previous studies, field experiments with artificially generated flood events were performed with and without LWD in the channel. Discharge time series from these experiments allow a validation of the model outputs against field observations. 
Methodologically, in-channel roughness coefficients are changed iteratively to retrieve the best fit between the simulated and mean observed flood hydrographs, with and without LWD, at the downstream reach outlet. In addition, roughness values are modified at LWD positions only, and simplified discrete elements representing LWD are incorporated into the calculation mesh. In general, the model results reveal a good simulation of the observed flood hydrographs of the field experiments without in-channel large woody debris. This indicates the applicability of the model to the studied reach of a low-mountain-range creek. The best fit between simulation and mean observed hydrograph with in-channel LWD is obtained when increasing in-channel roughness by decreasing Strickler coefficients by 30 % in the entire reach, or by 55 % at LWD positions only. However, increasing roughness in the entire reach yields a better simulation of the observed hydrograph, indicating that LWD elements affect sections beyond their own dimensions, e.g. by forming downstream wake fields. The best fit in terms of the hydrograph's general shape is achieved by integrating discrete elements into the calculation mesh. The emerging temporal shift between simulation and observation can be attributed to mesh impermeability and element dimensions causing excessive water retention and flow alteration. The results illustrate that the mean observed hydrograph can be satisfactorily modelled using roughness coefficients. Nevertheless, discrete elements yield a better-fitting hydrograph shape. In conclusion, the time-consuming and work-intensive mesh manipulation is suitable for analysing detailed flow conditions with computational fluid dynamics (CFD) at small spatio-temporal scales; here, a close-to-nature design of the discrete LWD objects is essential for accurate results.
In contrast, the reach-wise adjustment of in-channel roughness coefficients is useful in larger-scale model applications such as 1D hydrodynamic or rainfall-runoff simulations at the catchment scale.
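The roughness adjustment acts through the Strickler relation v = kSt · R^(2/3) · S^(1/2) (kSt being the inverse of Manning's n), so lowering kSt lowers flow velocity proportionally. A sketch with illustrative channel values (not taken from the study):

```python
def strickler_velocity(k_st, hydraulic_radius_m, slope):
    """Mean flow velocity from the Strickler formula:
    v = kSt * R^(2/3) * S^(1/2), with kSt the inverse of Manning's n."""
    return k_st * hydraulic_radius_m ** (2 / 3) * slope ** 0.5

k_st = 30.0    # m^(1/3)/s, illustrative value for a rough mountain creek
v_base = strickler_velocity(k_st, 0.2, 0.02)
v_lwd = strickler_velocity(k_st * 0.70, 0.2, 0.02)  # 30 % kSt reduction
print(round(v_base, 3), round(v_lwd, 3), round(v_lwd / v_base, 2))
```

The 30 % reduction in kSt translates directly into a 30 % lower mean velocity for the same depth and slope, which delays and flattens the simulated hydrograph, the effect the calibration exploits.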


2021 ◽  
Vol 5 (Supplement_1) ◽  
pp. 881-881
Author(s):  
Tara Klinedinst ◽  
Lauren Terhorst ◽  
Juleen Rodakowski

Abstract Recent evidence shows that more complex clusters of chronic conditions are associated with poorer health outcomes. Less clear is the extent to which these clusters are associated with different types of disability (basic and instrumental activities of daily living (ADL, IADL) and functional mobility (FM)) over time. This was a longitudinal analysis using the National Health and Aging Trends Study (NHATS) (n = 6,179). Using latent class analysis, we determined the optimal clusters of chronic conditions, then assigned each person to a best-fit class. Next, we used mixed-effects models with repeated measures to examine the effects of group (best-fit class), time (years from baseline), and the group by time interaction on each of the outcomes in separate models over 4 years. We identified 5 chronic condition clusters: “multisystem morbidity” (13.9% of the sample), “diabetes” (39.5%), “osteoporosis” (24.9%), “cardio/stroke/cancer” (4.5%), and “minimal disease” (17.3%). The group by time interaction was not significant for any outcome. For the ADL outcome, only time was significant (F(3, 16249) = 224.72, p < .001). For IADL, both group (F(4, 5403) = 6.62, p < .001) and time (F(3, 22622) = 3.87, p = .009) were significant. For FM, both group (F(4, 5920) = 2.96, p = .02) and time (F(3, 16381) = 213.41, p < .001) were significant. We did not find evidence that any cluster experienced greater increases in disability over time, but all clusters containing multiple chronic conditions carried risk of IADL and FM disability. Increased screening for IADL and FM disability could identify early disability and prevent decline.
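The reported p-values can be sanity-checked without statistical tables: with a denominator df this large, dfn·F is approximately chi-square with dfn degrees of freedom, and the 3-df chi-square survival function has a closed form. A stdlib-only check of the reported IADL time effect:

```python
import math

def chi2_sf_3df(x):
    """Survival function of the chi-square distribution with 3 df,
    in closed form via the complementary error function:
    SF(x) = erfc(sqrt(x/2)) + sqrt(2x/pi) * exp(-x/2)."""
    return math.erfc(math.sqrt(x / 2)) + math.sqrt(2 * x / math.pi) * math.exp(-x / 2)

# Reported: IADL ~ time, F(3, 22622) = 3.87, p = .009.
# With dfd = 22622, dfn * F is approximately chi-square with 3 df.
F, dfn = 3.87, 3
print(round(chi2_sf_3df(dfn * F), 3))  # 0.009, matching the reported p-value
```

The same large-dfd approximation confirms the other F tests qualitatively (e.g. F(3, 16249) = 224.72 gives a vanishingly small p).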


2012 ◽  
Vol 36 (2) ◽  
pp. 401-409
Author(s):  
Felipe de Campos Carmona ◽  
Ibanor Anghinoni ◽  
Eduardo Giacomelli Cao

Rice in Rio Grande do Sul State is grown mostly under flooding, which induces a series of chemical, physical and biological changes in the root environment. These changes, combined with the presence of rice plants, affect the availability of exchangeable ammonium (NH4+) and the pH of the soil solution, and the dynamics of both variables can be influenced by soil salinity, a common problem in the coastal region. This study evaluated the dynamics of exchangeable NH4+ and pH, and the relation between them, in the solution of Albaqualf soils with different salinity levels under rice. Four field experiments were conducted in soils with exchangeable Na percentages (ESP) of 5.6, 9.0, 21.2, and 32.7 %. Prior to flooding, soil solution collectors were installed at depths of 5, 10 and 20 cm. The soil solution was collected weekly, from 7 to 91 days after flooding (DAF), for analysis of exchangeable NH4+ and pH. Plant tissue was sampled at 77 DAF to determine N uptake and estimate the contribution of other N forms to rice nutrition. The content of exchangeable NH4+ decreased over time at all sites and depths, with a more pronounced reduction in soils with lower salinity levels, reaching values close to zero. A possible contribution of non-exchangeable NH4+ forms and of N from soil organic matter to rice nutrition was observed. Soil pH decreased with time in the soils with ESP of 5.6 and 9.0 %, and was positively correlated with the decreasing NH4+ levels at these sites.
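A decline of exchangeable NH4+ toward zero over a fixed sampling window is often summarized with a first-order consumption model. A sketch with an invented initial concentration and rate constant (not the study's measurements):

```python
import math

def nh4(t_daf, n0=60.0, k=0.04):
    """First-order decline of exchangeable NH4+ in soil solution:
    N(t) = N0 * exp(-k * t), t in days after flooding (DAF).
    n0 (mg/L) and k (1/day) are illustrative values only."""
    return n0 * math.exp(-k * t_daf)

# Sampled window in the study: 7 to 91 DAF
for t in (7, 49, 91):
    print(t, round(nh4(t), 1))
```

With these illustrative parameters the concentration falls from ~45 mg/L at 7 DAF to under 2 mg/L by 91 DAF, i.e. "values close to zero" as described; a higher k would correspond to the faster depletion seen in the less saline soils.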

