The control of cereal root diseases in Western Australia.

1974 ◽ Vol 3 (2) ◽ pp. 51 ◽ Author(s): CA Parker
2013 ◽ Vol 161 (11-12) ◽ pp. 828-840 ◽ Author(s): Ravjit K. Khangura, Gordon C. MacNish, William J. MacLeod, Vivien A. Vanstone, Colin D. Hanbury, et al.

2015 ◽ Vol 105 (8) ◽ pp. 1069-1079 ◽ Author(s): Grant J. Poole, Martin Harries, D. Hüberli, S. Miyan, W. J. MacLeod, et al.

Root diseases have long been prevalent in Australian grain-growing regions, and most management decisions to reduce the risk of yield loss must be implemented before the crop is sown. The levels of the pathogens that cause the major root diseases can be measured using DNA-based services such as PreDicta B. Although these pathogens are often studied individually, in the field they frequently occur as mixed populations, and their combined effect on crop production is likely to vary across diverse cropping environments. A 3-year survey covering most cropping regions in Western Australia used PreDicta B to determine soilborne pathogen levels, together with visual assessments of root health and of the incidence of individual root diseases caused by the major root pathogens, including Rhizoctonia solani (anastomosis group [AG]-8), Gaeumannomyces graminis var. tritici (take-all), Fusarium pseudograminearum, and Pratylenchus spp. (root-lesion nematodes), on wheat roots in 115, 50, and 94 fields during 2010, 2011, and 2012, respectively. A predictive model for root health was developed from autumn and summer rainfall and soil temperature parameters. The model showed that pathogen DNA explained 16, 5, and 2% of the variation in root health, whereas environmental parameters explained 22, 11, and 1% of the variation in 2010, 2011, and 2012, respectively. R. solani AG-8 soil pathogen DNA, soil temperature, and rainfall parameters explained most of the variation in root health. This research shows that interactions between environment and pre-seeding pathogen levels can be used in predictive models to improve assessment of root disease risk, assisting growers in planning more profitable cropping programs.
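The abstract reports the share of variation in root health explained separately by pathogen DNA and by environmental parameters. A minimal sketch of that kind of variance partitioning is shown below using ordinary least squares; all data, coefficients, and predictor names here are hypothetical illustrations, not values from the survey or the PreDicta B service.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100  # hypothetical number of surveyed fields

# Hypothetical predictors: pathogen DNA level (log scale) and
# pre-seeding environmental variables (rainfall in mm, soil temp in degC).
pathogen_dna = rng.normal(2.0, 0.8, n)
rainfall = rng.normal(60.0, 15.0, n)
soil_temp = rng.normal(18.0, 2.0, n)

# Hypothetical root-health score, driven by both predictor groups plus noise.
root_health = (3.0 - 0.4 * pathogen_dna + 0.01 * rainfall
               - 0.05 * soil_temp + rng.normal(0.0, 0.3, n))

def r_squared(X, y):
    """Fraction of variance in y explained by a least-squares fit on X."""
    X = np.column_stack([np.ones(len(y)), X])  # add intercept column
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return 1.0 - resid.var() / y.var()

# Fit each predictor group on its own, as in the reported comparison.
r2_pathogen = r_squared(pathogen_dna[:, None], root_health)
r2_environment = r_squared(np.column_stack([rainfall, soil_temp]), root_health)
print(f"pathogen DNA alone: R^2 = {r2_pathogen:.2f}")
print(f"environment alone:  R^2 = {r2_environment:.2f}")
```

Fitting each predictor group separately mirrors the paper's reporting style (percent of variation explained by pathogen DNA versus by environment); a combined model with both groups would give a single joint R².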


2011 ◽ Vol 131 ◽ pp. 39-48 ◽ Author(s): Xiangling Fang, Dennis Phillips, Hua Li, Krishnapillai Sivasithamparam, Martin J. Barbetti

2010 ◽ Vol 40 (2) ◽ pp. 109-119 ◽ Author(s): X. L. Fang, D. Phillips, Hua Li, K. Sivasithamparam, M. J. Barbetti

1993 ◽ Vol 33 (7) ◽ pp. 885 ◽ Author(s): M Incerti, PWG Sale, GJ O'Leary

Two experiments were conducted at the Mallee Research Station, Walpeup, between 1985 and 1989 to determine whether the increases in wheat yield that occur after long fallows result from improvements in the supply and use of additional soil water conserved during the fallow. Although long fallows increased the amount of water stored in the soil at sowing (average 22 mm) and the yield of wheat (by 0.26 to 1.37 t/ha) in the first experiment, the results suggest no causal relationship between these increases. Improvements in wheat yield were attributed to increased soil nitrogen availability and to control of cereal root diseases rather than to any increase in soil water conservation. This was confirmed in the second experiment, which was managed to ensure that nitrogen supply and cereal root diseases were not limiting crop production; there, increases in soil water content at sowing resulting from long fallows did not produce higher wheat yields. The study suggests that long fallows cannot be justified on the basis of increased soil water storage, as much of the additional water accumulated during the fallow period is stored in the lower part of the rootzone. Movement of this water below the rootzone during the growing season appears to be the main reason why the additional water stored at sowing after long fallows failed to increase wheat growth and yield.


2020 ◽ Vol 646 ◽ pp. 79-92 ◽ Author(s): RE Scheibling, R Black

Population dynamics and life history traits of the ‘giant’ limpet Scutellastra laticostata on intertidal limestone platforms at Rottnest Island, Western Australia, were recorded by interannual (January/February) monitoring of limpet density and size structure, and relocation of marked individuals, at 3 locations over periods of 13-16 yr between 1993 and 2020. Limpet densities ranged from 4 to 9 ind. m⁻² on wave-swept seaward margins of platforms at 2 locations and on a rocky notch at the landward margin of the platform at a third. Juvenile recruits (25-55 mm shell length) were present each year, usually at low densities (<1 m⁻²), but localized pulses of recruitment occurred in some years. Annual survival rates of marked limpets varied among sites and cohorts, ranging from 0.42 yr⁻¹ at the notch to 0.79 and 0.87 yr⁻¹ on the platforms. A mass mortality of limpets on the platforms occurred in 2003, likely mediated by thermal stress during daytime low tides, coincident with high air temperatures and calm seas. Juveniles grew rapidly to adult size within 2 yr. Asymptotic size (L∞, von Bertalanffy growth model) ranged from 89 to 97 mm, and maximum size from 100 to 113 mm, on platforms. Growth rate and maximum size were lower on the notch. Our empirical observations and simulation models suggest that these populations are relatively stable on a decadal time scale. The frequency and magnitude of recruitment pulses and high rate of adult survival provide considerable inertia, enabling persistence of these populations in the face of sporadic climatic extremes.
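The growth results above are fitted with the von Bertalanffy model, L(t) = L∞(1 − e^(−K(t − t₀))). A minimal sketch of that curve is shown below; L∞ is taken from the middle of the reported platform range (89-97 mm), while the growth coefficient K is an assumed illustrative value (not reported in the abstract), chosen so that limpets approach adult size within roughly 2 yr as described.

```python
import math

def von_bertalanffy(t, L_inf, K, t0=0.0):
    """Shell length (mm) at age t (yr) under the von Bertalanffy model:
    L(t) = L_inf * (1 - exp(-K * (t - t0)))."""
    return L_inf * (1.0 - math.exp(-K * (t - t0)))

# L_inf (mm) from the reported platform range; K (1/yr) is assumed
# for illustration only.
L_inf, K = 93.0, 1.2

for age in (0.5, 1, 2, 4):
    print(f"age {age:>3} yr: {von_bertalanffy(age, L_inf, K):5.1f} mm")
```

With these illustrative parameters, predicted length at age 2 yr is already close to L∞, consistent with the observation that juveniles reach adult size within about 2 yr.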

