Stubble Management Practices and the Survival of Fusarium graminearum Group 1 in Wheat Stubble Residues.

1988, Vol 17 (4), pp. 88. Author(s): BA Summerell, LW Burgess

1994, Vol 34 (5), pp. 655. Author(s): KE Nelson, LW Burgess

The incidence of infection by Fusarium graminearum Group 1 and the incidence of crown rot were compared for various cultivars of oats, wheat and barley in glasshouse and field experiments. In glasshouse studies, the incidence of infected plants was lower in oats than in wheat or barley at 6 weeks after sowing. Crown rot symptoms were not observed in any oat cultivar. The incidence and severity of crown rot in barley was similar to that in wheat cv. Banks. Between 17 and 29 genotypes of oats, wheat and barley were assessed in field trials over 3 years. Stem browning, a symptom of crown rot, was common in wheat and barley but was not observed in any cultivar of oats. Among plants of wheat and barley, the effect of cultivar on incidence of crown rot was significant (P = 0.05) in 3 of 4 trials. The results suggest that oats are a symptomless host; this should be considered when growing oats in rotation to reduce crown rot inoculum. All barley cultivars assessed developed moderate to severe crown rot symptoms and thus may incur yield limitations where crown rot is prevalent.


1991, Vol 42 (3), pp. 399. Author(s): TA Klein, LW Burgess, FW Ellison

The incidence and spatial patterns of wheat plants infected by Fusarium graminearum Group 1 were assessed in six fields in northern New South Wales, Australia, over a four-year period. The incidence of infected plants declined from 1978 to 1981 in fields where wheat was sown each year, where there was a bare fallow of 18 months and where sunflowers were sown in one season. The pattern of infected plants tended to be regular (uniform) where the incidence of infected plants was particularly high (> 96%). In all fields where a clustered (aggregated) pattern was detected, 12% to 64.4% of plants were infected. A random pattern was observed at a number of sites. There was a positive association between loss in potential yield and the incidence of infection, basal browning of plants and whiteheads. Losses of up to 89% were recorded.
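The regular, clustered, and random pattern classes described above are commonly distinguished using the variance-to-mean ratio (index of dispersion) of per-quadrat counts of infected plants. The abstract does not state which statistic the authors used, so the sketch below is only an illustrative reconstruction of the idea; the quadrat counts and classification thresholds are invented, not taken from the study.

```python
# Illustrative sketch: classifying a spatial pattern of infected plants
# from quadrat counts using the index of dispersion (variance/mean).
# A ratio well below 1 suggests a regular (uniform) pattern, well above 1
# a clustered (aggregated) one, and near 1 a random pattern.

def dispersion_index(counts):
    """Variance-to-mean ratio of quadrat counts (sample variance)."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    return var / mean

def classify(counts, low=0.7, high=1.3):
    """Classify the pattern; the 0.7/1.3 cutoffs are arbitrary examples."""
    i = dispersion_index(counts)
    if i < low:
        return "regular"
    if i > high:
        return "clustered"
    return "random"
```

In practice the ratio is usually tested formally (e.g. against a chi-square distribution) rather than compared with fixed cutoffs as in this sketch.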


1987, Vol 38 (3), pp. 473. Author(s): RL Dodman, GB Wildermuth

A range of inoculation methods for assessing resistance in wheat to crown rot caused by Fusarium graminearum Group 1 was evaluated in the glasshouse and in the field. When grain was colonized with the pathogen, ground and applied with the seed at planting or spread around young plants as an aqueous suspension, high levels of crown rot were produced, but resistance (usually measured as per cent diseased plants or tillers) was still expressed. Similar results were achieved with induced field inoculum obtained by inoculating an area of wheat to obtain a high incidence of disease and incorporating the stubble into the soil. Natural field inoculum and inoculation of seed with spores produced lower levels of disease, although differentiation of resistant and susceptible cultivars was still possible. Other methods, suitable only for plants in pots and often for more specific purposes (for example, for inoculation at different stages of plant growth), were also studied. Resistance was best expressed where inoculum was applied onto or into soil, rather than directly onto or into plants. Currently, the resistance of all potential cultivars for release in Queensland is assessed in the field by sowing seed dusted with benomyl into furrows along which ground, colonized grain is distributed. Crown rot severity is then determined at maturity.


1995, Vol 35 (6), pp. 765. Author(s): KE Nelson, LW Burgess

We investigated the incidence of Fusarium graminearum Group 1 (infection, stem colonisation) and crown rot in 3-year crop sequences of 1 or 2 years of barley, oats, or mown oats followed by wheat, compared with 3 years of wheat. Seed was sown into the stubble of the previous crop, and stubble production was estimated for each cereal treatment. Plants of each cereal were infected by the crown rot pathogen. Oats were susceptible to infection but did not express symptoms of crown rot in 2 years of the trial; oats can, therefore, be considered a symptomless host that may contribute to the maintenance of inoculum. The overall mean incidence of infected plants increased from 12% in 1987 to 81% in 1989. The various treatments did not significantly reduce the incidence of infected wheat plants in November of the final year. The incidence of crown rot of wheat in 1989 was greatest after 2 prior wheat crops and lowest after 1 or 2 years of mown oats. Unmown crops of the 3 species produced similar amounts of straw by weight, whereas mown oats produced significantly less. Oat straw decomposed more rapidly than that of the other cereals under controlled conditions.


1998, Vol 12 (3), pp. 522-526. Author(s): Theodore M. Webster, John Cardina, Mark M. Loux

The objectives of this study were to determine how the timing of weed management treatments in winter wheat stubble affects weed control the following season and to determine if spring herbicide rates in corn can be reduced with appropriately timed stubble management practices. Field studies were conducted at two sites in Ohio between 1993 and 1995. Wheat stubble treatments consisted of glyphosate (0.84 kg ae/ha) plus 2,4-D (0.48 kg ae/ha) applied in July, August, or September, or at all three timings, and a nontreated control. In the following season, spring herbicide treatments consisted of a full rate of atrazine (1.7 kg ai/ha) plus alachlor (2.8 kg ai/ha) preemergence, a half rate of these herbicides, or no spring herbicide treatment. Across all locations, a postharvest treatment of glyphosate plus 2,4-D followed by alachlor plus atrazine at half or full rates in the spring controlled all broadleaf weeds, except giant ragweed, at least 88%. Giant foxtail control at three locations was at least 83% when a postharvest glyphosate plus 2,4-D treatment was followed by spring applications of alachlor plus atrazine at half or full rates. Weed control in treatments without alachlor plus atrazine was variable, although broadleaf control from July and August glyphosate plus 2,4-D applications was greater than from September applications. Where alachlor and atrazine were not applied, August was generally the best timing of herbicide applications to wheat stubble for reducing weed populations the following season.
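The full and half rates above are straightforward per-hectare arithmetic. As a purely illustrative sketch (the rates come from the abstract, but the helper function and any field sizes are hypothetical), the amount of active ingredient needed for a given treated area can be computed as:

```python
# Rates in kg of active ingredient per hectare, as reported in the abstract.
RATES = {
    "atrazine": 1.7,
    "alachlor": 2.8,
}

def amount_needed(herbicide, hectares, rate_fraction=1.0):
    """kg of active ingredient for a full (1.0) or reduced (e.g. 0.5) rate
    applied over the given area in hectares."""
    return RATES[herbicide] * rate_fraction * hectares
```

For example, a half rate of atrazine over 10 ha requires 1.7 × 0.5 × 10 = 8.5 kg ai.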


2020, Vol 110 (4), pp. 916-926. Author(s): Martha M. Vaughan, Todd J. Ward, Susan P. McCormick, Nathane Orwig, William T. Hay, et al.

Fusarium graminearum is a causal agent of Fusarium head blight (FHB), a disease that reduces yield and quality of cereal crops and contaminates grain with mycotoxins that pose health risks to humans and livestock. Interpopulation antagonistic interactions between isolates that produce different trichothecene mycotoxins can reduce FHB in wheat, but it is not known whether interactions between isolates with a shared population identity that produce the same trichothecenes have a similar effect. Using isolates from the predominant F. graminearum populations in North America (NA1 and NA2), we examined intrapopulation interactions by comparing growth, disease progression, and toxin production of individual isolates with multi-isolate mixes. In vitro, mycelial growth was significantly greater when most NA1 and NA2 isolates were cultured individually than when cultured as a mixture of isolates from the same population. In the susceptible wheat cultivar Norm, FHB generally progressed faster in heads inoculated with an individual isolate than with a multi-isolate mixture, but the antagonistic effect of intrapopulation interactions was more pronounced for NA1 than for NA2 isolates. By contrast, in the moderately resistant cultivar Alsen, mixtures of isolates from either population caused marked reductions in FHB development. Mycotoxin contamination was not consistently affected by intrapopulation interactions and varied depending on the interacting isolates from either population. Our results indicate that antagonistic intrapopulation interactions can influence FHB under controlled environmental conditions. Understanding whether the regional composition of pathogen populations similarly influences FHB in the field could improve disease forecasting and management practices.


2013, Vol 64 (8), pp. 799. Author(s): N. R. Hulugalle, T. B. Weaver, L. A. Finlay, V. Heimoana

Long-term studies of soil organic carbon dynamics in two- and three-crop rotations in irrigated cotton (Gossypium hirsutum L.) based cropping systems under varying stubble management practices in Australian Vertosols are relatively few. Our objective was to quantify soil organic carbon dynamics during a 9-year period in four irrigated, cotton-based cropping systems sown on permanent beds in a Vertosol with restricted subsoil drainage near Narrabri in north-western New South Wales, Australia. The experimental treatments were: cotton–cotton (CC); cotton–vetch (Vicia villosa Roth. in 2002–06, Vicia benghalensis L. in 2007–11) (CV); cotton–wheat (Triticum aestivum L.), where wheat stubble was incorporated (CW); and cotton–wheat–vetch, where wheat stubble was retained as in-situ mulch (CWV). Vetch was terminated during or just before flowering by a combination of mowing and contact herbicides, and the residues were retained as in situ mulch. Estimates of carbon sequestered by above- and below-ground biomass inputs were in the order CWV >> CW = CV > CC. Carbon concentrations in the 0–1.2 m depth and carbon storage in the 0–0.3 and 0–1.2 m depths were similar among all cropping systems. Net carbon sequestration rates did not differ among cropping systems and did not change significantly with time in the 0–0.3 m depth, but net losses occurred in the 0–1.2 m depth. The discrepancy between measured and estimated values of sequestered carbon suggests that either the value of 5% used to estimate carbon sequestration from biomass inputs was an overestimate for this site, or post-sequestration losses may have been high. The latter has not been investigated in Australian Vertosols. Future research efforts should identify the cause and quantify the magnitude of these losses of organic carbon from soil.
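The estimation step the authors question can be made explicit. The 5% figure comes from the abstract, but the helper below and any biomass numbers are illustrative assumptions only: sequestered carbon is taken as a fixed fraction of cumulative above- and below-ground biomass carbon inputs.

```python
# The abstract's estimation logic, sketched: sequestered soil C is assumed
# to be a fixed fraction (the 5% value the authors suggest may be an
# overestimate for this site) of cumulative biomass carbon inputs.

SEQUESTRATION_FRACTION = 0.05  # 5% of biomass C inputs, per the abstract

def estimated_sequestration(annual_biomass_c, years):
    """Estimated C sequestered (t C/ha) from a constant annual
    above- plus below-ground biomass C input (t C/ha/year)."""
    return annual_biomass_c * years * SEQUESTRATION_FRACTION
```

Comparing such an estimate with measured soil carbon storage is what exposes the discrepancy the abstract discusses: if measured storage falls short, either the fraction is too high for the site or post-sequestration losses are substantial.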

