The effect of inoculum distribution and sowing depth on Pleiochaeta root rot of lupins

1991, Vol 42 (1), pp. 121
Author(s): MW Sweetingham

In paddocks with a history of lupin cultivation, propagules of the fungus Pleiochaeta setosa are most concentrated in the top 2 cm of soil and decline rapidly to zero at the base of the tillage layer (10-14 cm). The severity of Pleiochaeta root rot is greatly reduced as sowing depth increases, because the emerging roots avoid the inoculum concentrated in the surface soil. Hypocotyls are not infected by P. setosa, enabling disease escape. In four field trials over three seasons, optimum establishment and grain yield occurred at sowing depths close to 5 cm, deeper than previously recommended for lupins in Western Australia.

2010, Vol 61 (3), pp. 241
Author(s): G. J. Thomas, W. J. MacLeod, M. W. Sweetingham

Three separate surveys were carried out in commercial lupin crops in the major lupin-growing region of Western Australia in 1986, 1990, and 2004–05. In total, 333 sites were sampled and plants assessed for the incidence and cause of root and hypocotyl rots. Plant density and sowing depth were measured at all sites. In all surveys, root rot was more common than hypocotyl rot. Root rot occurred in more than 95% of sites in each survey; however, a greater proportion of sites had high levels of root rot in the earlier surveys. The incidence of root rot within sites decreased from an average of 34.9% in 1986 to 10.2% in 2004–05. Hypocotyl rot incidence varied among surveys: both the proportion of infected paddocks and the within-paddock incidence were greatest in the 1990 survey and lowest in the 2004–05 survey. Rhizoctonia solani and Pleiochaeta setosa were commonly isolated from root lesions, and R. solani was the predominant pathogen isolated from hypocotyl lesions. Analysis of the R. solani isolates by pectic zymogram showed that the ZG3 strain was the most regularly isolated from roots and hypocotyls. This series of surveys indicates that the incidence of root rots in commercial lupin paddocks in Western Australia has decreased dramatically over the past 20 years; however, root rot still occurs in most paddocks regardless of soil type, location, crop rotation, and management system.


1987, Vol 27 (5), pp. 671
Author(s): GC MacNish, CS Fang

The effects of short chemical fallows after ryegrass pasture on rhizoctonia bare patch and root rot of wheat were studied in 2 experiments at the Esperance Downs Research Station, 35 km north of Esperance, W.A. In 1 experiment the subterranean clover-dominant pasture was sprayed with a paraquat-diquat mixture prior to resowing with annual ryegrass at densities ranging from 3 to about 400 plants m-2. The ryegrass was allowed to grow for either 42 or 63 days before treatment with a desiccant herbicide (paraquat-diquat), followed by a short chemical fallow of 26 or 5 days, respectively, before sowing with wheat using minimum tillage. Some treatments were cultivated twice to 10 cm. Neither the ryegrass density nor the length of chemical fallow had any effect (P = 0.05) on rhizoctonia bare patch score or on the incidence or severity of root rot. However, cultivation caused a 76% reduction in mean patch score and 38 and 68% reductions in mean rhizoctonia incidence and severity, respectively. Yield was negatively correlated with rhizoctonia incidence and severity: each 1% increase in incidence reduced grain yield by 17 kg ha-1. In the other experiment, chemical fallow periods of 66, 52, 24 or 1 day prior to sowing wheat had no effect (P = 0.05) on rhizoctonia root rot incidence.
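The reported yield-incidence relation is a simple linear loss function and can be sketched directly. Only the 17 kg ha-1 slope comes from the trial; the disease-free baseline yield used below is a hypothetical value for illustration:

```python
# Sketch of the reported linear relation between rhizoctonia incidence and
# wheat grain yield: a 17 kg/ha loss for each 1% increase in incidence.
LOSS_PER_PERCENT = 17  # kg/ha of grain yield lost per 1% incidence (from the trial)

def predicted_yield(disease_free_yield_kg_ha, incidence_pct):
    """Predicted grain yield (kg/ha) under the reported linear loss slope.

    disease_free_yield_kg_ha is an assumed baseline; the survey does not
    report an intercept.
    """
    return disease_free_yield_kg_ha - LOSS_PER_PERCENT * incidence_pct

# e.g. a hypothetical 2000 kg/ha disease-free crop at 38% incidence:
# 2000 - 17 * 38 = 1354 kg/ha
```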


1991, Vol 31 (2), pp. 259
Author(s): RF Brennan

The area of rhizoctonia bare patch and the incidence and severity of rhizoctonia root rot (caused by Rhizoctonia solani Kühn) were reduced by the application of ammonium nitrate fertiliser. Residual copper (Cu) from a Cu fertiliser treatment in 1967 had no effect on the area of rhizoctonia bare patch or on the incidence and severity of root rot. With no applied nitrogen (N), 17.6% (mean of residual Cu levels) of the plot was affected by patches, whereas the affected area declined to 4.2% where 92 kg N/ha had been applied. The incidence and severity of rhizoctonia root rot declined from 45.9 and 27.0% to 32.7 and 9.1%, respectively, with the application of N fertiliser. The grain yield of wheat supplied with adequate Cu increased even when the level of N fertiliser exceeded that considered adequate for plant nutrition; this response is explained by the control of rhizoctonia bare patch. The area of rhizoctonia patches and the incidence and severity of rhizoctonia root rot decreased with the application of N, and with adequate Cu fertiliser (2.2 kg Cu/ha) the grain yields increased. However, at marginal and deficient levels of applied Cu fertiliser, the application of N fertiliser induced Cu deficiency in wheat plants, and grain yields declined even though rhizoctonia patches were reduced.


1977, Vol 89 (1), pp. 161-167
Author(s): A. Hadjichristodoulou, Athena Della, J. Photiades

In field trials conducted over 8 years, the effects of sowing depth on plant establishment, tillering capacity, plant height, grain yield, top-growth weight and patterns of root development of wheat and barley were studied. Establishment, number of grain-bearing tillers per established plant, plant height at maturity, and grain yield and top-growth weight per plot and per plant were all reduced as sowing depth increased from 2 to 20 cm. Seedling emergence started earlier from large seeds and from shallow sowing. Establishment from large seeds of two varieties was better, especially for deep sowing and in clay soils. Several patterns of root and tiller development were observed at the various sowing depths. Varietal differences in stand establishment under field conditions were not related to plant height. It was concluded that sowing deeper than about 10 cm should be avoided because stand and plant vigour are adversely affected.


1990, Vol 41 (4), pp. 619
Author(s): JM Wilson, J Hamblin

The effects of soil fumigation (98% methyl bromide + 2% chloropicrin at 580 kg/ha) and N fertilizer (0, 12.5, 25, 50 or 100 kg N/ha) were examined in field trials on continuous wheat and on wheat in rotation with lupins on the Geraldton sandplain. Fumigation increased grain yields at N fertilizer levels of 25 kg/ha or more and was associated with a reduced incidence and severity of common root rot (Bipolaris sorokiniana; teleomorph Cochliobolus sativus). Grain yield was not significantly affected by rotation. Fumigation increased soil ammonium levels and decreased soil nitrate levels. Rotation of wheat and lupins increased mid-season growth at all levels of applied N but increased grain yield only where no N was applied.


1982, Vol 62 (4), pp. 885-891
Author(s): L. J. Duczek, L. J. Piening

The effects of seeding depth and seeding date of barley on the incidence of root rot, emergence and grain yield were investigated in field trials at Saskatoon and Scott, Saskatchewan. The effects of seeding depth on the intensity of root rot, grain yield and yield loss due to root rot, and of seed size on the incidence of root rot and yield, were also investigated in field trials at Lacombe, Alberta. Symptoms of common root rot, based on lesions on the subcrown internode, were not influenced by seed size or seeding date, but the disease increased with depth of seeding. Grain yield decreased with depth and with late seeding. Emergence was not affected by seeding date but decreased with depth of seeding. Common root rot was not associated with the reduced yields of later seeding dates, but was associated with the reduced yields of greater seeding depths. The increased emergence and reduced disease at shallow depths resulted in a greater number of clean plants, which probably accounts for some of the increased grain yield at shallow seeding depths.


1989, Vol 40 (3), pp. 457
Author(s): MW Perry, MF D'Antuono

Twenty-eight Australian wheat (Triticum aestivum L. em. Thell.) cultivars, representing a series from the 1860s to 1982, were grown in 20 field trials over four years in the wheatbelt of Western Australia. The cultivars included introductions and selections made before 1900, plus important cultivars bred or grown in Western Australia up to 1982. Five of the latter group were from crosses including semi-dwarf cultivars as parents. Grain yields were measured in all trials, and six trials were also sampled for biomass and yield components.

Based on the regression of mean grain yield on the number of years elapsed since 1884, yields increased from 1022 kg ha-1 in 1884 to 1588 kg ha-1 in 1982, a rate of increase of 5.8 kg ha-1 year-1 or 0.57% per year. Regression of cultivar yield on site mean yield gave values of b, the slope of the regression, from 0.66 to 1.24, and these were higher for modern than for old cultivars.

In the six trials sampled for yield components, above-ground biomass appeared to have increased slightly when comparing early selections and their derivatives with later cultivars, but over 80% of the overall increase in grain yield was due to increased harvest index. Grains per ear and grains m-2 were strongly and positively correlated with grain yield, but there were weak negative correlations between 1000-grain weight and yield, and between 1000-grain weight and years since 1884. Cultivars with a semi-dwarf background had equal biomass but higher yield, harvest index, ear number m-2 and grains per ear than modern tall cultivars. The results show that genetic improvement has substantially increased yield potential in this environment, and that this has been achieved through substantial increases in grain number m-2 associated with an improved harvest index.
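The quoted trend figures follow directly from the two endpoint yields and can be checked with a few lines of arithmetic:

```python
# Checking the reported yield-trend arithmetic: 1022 kg/ha in 1884 rising
# to 1588 kg/ha in 1982, a span of 98 years.
y_1884, y_1982 = 1022, 1588        # mean grain yields, kg/ha
span = 1982 - 1884                 # 98 years

rate = (y_1982 - y_1884) / span    # absolute rate of increase, kg/ha per year
pct = rate / y_1884 * 100          # relative rate, % of the 1884 yield per year

print(round(rate, 1), round(pct, 2))  # prints: 5.8 0.57
```

Both values match the figures reported in the abstract.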


2003, Vol 54 (7), pp. 649
Author(s): N. W. Galwey, K. Adhikari, M. Dracup, R. Thomson

The indeterminate growth habit of narrow-leafed lupin appears to cause a suboptimal pattern of grain filling in the Mediterranean-type environment of south-western Australia. Development of cultivars with genetically restricted branching (RB) has been proposed to overcome this problem. However, restriction of branching causes profound phenological and architectural changes, and it may be necessary to compensate for these by incorporating RB into a genetic background that confers high shoot mass. To make a robust assessment of the value of RB in a range of backgrounds, the trait was incorporated from 5 donor parents into the genetic background of 10 recurrent parents by 2 rounds of back-crossing followed by self-fertilisation of the progeny for 4 generations to produce BC2S4 lines. Thirty-two of these lines were obtained, with highly RB or mildly RB habit, flowering times ranging from 68 to 118 days after sowing, and 16–34 leaf nodes on the main stem. They were tested with their parents in replicated field trials at 3 sites in Western Australia at latitudes from 28°S to 33°S. The RB genotypes generally gave higher grain yield than the normal-branching genotypes at Esperance, the high-latitude, high-rainfall, long-season site where high shoot mass is produced, and the 2 types gave approximately equal yield at Mullewa, the low-latitude, low-rainfall, short-season, low-shoot-mass site. Only at the intermediate site of Wongan Hills did the normal-branching genotypes have a clear advantage. RB genotypes had a higher harvest index than the corresponding normal-branching genotypes, particularly at Esperance, and tended to produce more pods but slightly fewer seeds per pod and lighter seeds. There was no consistent difference in performance between highly and mildly RB genotypes, contrary to the expectation that the highly RB type would produce insufficient shoot mass.
There was a tentative indication that, within RB lines, a large number of leaf nodes on the main stem conferred more reliably high grain yield in the environments of Esperance and Wongan Hills. Overall, these results provide ample justification for the development and further evaluation of RB cultivars. However, this conclusion comes with 2 caveats: that a different background development pattern should be adopted from that used in normal-branching lupins, and that RB cultivars should be evaluated in the target environments where the character confers an advantage.


2019, Vol 20 (1), pp. 29-34
Author(s): Yuba R. Kandel, Leonor F. S. Leandro, Daren S. Mueller

Conservation tillage has become a common practice in soybean farming in the Midwestern United States owing to its benefits for soil and moisture conservation. Field trials were established in a field with a history of sudden death syndrome (SDS; caused by Fusarium virguliforme) in Iowa in 2011 and evaluated for five consecutive years to determine the impact of tillage on SDS and yield. The experiment was laid out in a split-split-plot design with four replicates. The main-plot factor was tillage (no-till for both crops; no-till corn and chisel-plow soybean; and disc corn and chisel-plow soybean), and each main plot was divided into subplots of corn or soybean (in a 2-year rotation). Each subplot was again divided into two sub-subplots, in which two soybean cultivars, one moderately susceptible (MS) and one moderately resistant (MR) to SDS, were planted each year. Root rot and SDS disease index (FDX) differed among years, because some years were more favorable for the disease than others. However, tillage did not affect any parameter, including yield, in any year (P > 0.05). The cultivar effect was only occasionally significant; when it was, the MR cultivar had lower root rot and FDX and greater yield than the MS cultivar. These data suggest that planting resistant cultivars can be an effective management tactic, but that tillage does not help in SDS management.


Soil Systems, 2021, Vol 5 (3), pp. 46
Author(s): Andrew W. Rate

Public recreation areas in cities may be constructed on land that has been contaminated by various processes over the history of urbanisation. Charles Veryard and Smith's Lake Reserves are adjacent parklands in Perth, Western Australia, with a history of horticulture, waste disposal and other potential sources of contamination. Surface soil and soil profiles in the Reserves were sampled systematically and analysed for multiple major and trace elements. Spatial analysis was performed using interpolation and Local Moran's I to define geochemical zones, which were confirmed by means comparison and principal components analyses. The degree of contamination of surface soil in the Reserves with As, Cr, Cu, Ni, Pb, and Zn was low. Greater concentrations of As, Cu, Pb, and Zn were present at depth in some soil profiles, probably related to historical waste disposal in the Reserves. The results show distinct advantages to using spatial statistics at the site-investigation scale, and to measuring multiple elements, not just potential contaminants.
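For readers unfamiliar with the statistic, Local Moran's I scores each sampling point by its own deviation from the mean multiplied by the weighted sum of its neighbours' deviations, so that clusters of similarly high (or low) values stand out. A minimal sketch follows; the concentrations and the binary contiguity matrix are hypothetical stand-ins, not the survey's data, and a real analysis would build the weights from site coordinates:

```python
# Minimal sketch of Local Moran's I, the statistic used to delineate
# geochemical zones. Values and neighbour weights below are hypothetical.
def local_morans_i(x, w):
    """I_i = (z_i / m2) * sum_j w_ij * z_j, where z is x centred on its
    mean and m2 is the mean squared deviation of x."""
    n = len(x)
    mean = sum(x) / n
    z = [xi - mean for xi in x]
    m2 = sum(zi * zi for zi in z) / n
    return [z[i] / m2 * sum(w[i][j] * z[j] for j in range(n))
            for i in range(n)]

# four sampling points along a transect; neighbours are the adjacent points
x = [10.0, 12.0, 50.0, 55.0]   # e.g. hypothetical soil Pb concentrations (mg/kg)
w = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
I = local_morans_i(x, w)
# a large positive I_i flags a point surrounded by similarly high (or low)
# values, i.e. a candidate for a distinct geochemical zone
```

Here the two end points, each sitting beside a similar value on the same side of the mean, receive the largest positive scores.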

