Seasonal Variation in Numbers of Chlamydospores in Victorian Forest Soils Infected With Phytophthora cinnamomi

1978 · Vol 26 (5) · pp. 657 · Author(s): G Weste, K Vithanage

Chlamydospore numbers were counted for 2 years on replicated soil samples from three different types of naturally infected Victorian native forest. Soil temperatures and soil water potentials were recorded concurrently. A highly significant seasonal variation in chlamydospore numbers was observed, with maxima from summer to autumn and minima from winter to spring. There was little variation either between replicates or between different forest soils in winter and spring counts, but there was highly significant variation between different forest sites during the large summer and autumn counts. At this period sandy soil contained five to 12 times the number of chlamydospores found in other soils. For example, in autumn 1977, 286 chlamydospores were recorded per 50 g sample from deep sandy soils compared with 31 for krasnozem and 17 for shallow duplex soils. At this period soil temperatures were similar but the soil water potential for the duplex soil was very low (-82 bars).

1979 · Vol 27 (6) · pp. 693 · Author(s): G Weste, K Vithanage

Sporangial production by Phytophthora cinnamomi was investigated during a 3-year period for three types of conducive forest soils. A pilot survey conducted during the first year included garden soil and demonstrated that all forest soils tested stimulated sporangial production, that the stimulus was reduced by soil sterilization, and that soils without P. cinnamomi provided a greater stimulus than soils from diseased sites. For the final 2 years of the investigation soil and root samples were collected from diseased plants at 3-monthly intervals, and soil filtrates were tested for their ability to stimulate sporangial production. Soil matric water potentials and soil temperatures were recorded concurrently. Three factors which influenced the soil and indirectly affected sporangial formation were soil temperature, soil matric water potential and the presence of a stimulus produced by certain living microorganisms. Sporangial production showed highly significant seasonal variation with maxima in spring. No sporangia were produced in summer. Differences in the numbers of sporangia produced in response to the different soil filtrates were also highly significant, the highest numbers being formed in response to sandy soils of Wilson's Promontory. These results, particularly the large release of zoospores during spring, are of great importance in the spread of disease, and should be considered in relation to roadmaking and forestry in diseased areas.


1977 · Vol 25 (5) · pp. 461 · Author(s): G Weste, P Ruppin

Population densities of Phytophthora cinnamomi, associated disease and environmental factors were studied concurrently during a 2-year period in three different forest ecosystems. Pathogen populations showed seasonal variation, low values being obtained for winter months associated with soil temperatures less than 10°C. Populations increased with warmer temperatures for spring and summer, but declined during dry periods in late summer or early autumn when the soil water potential was lower than -9 bars, although at that period soil temperatures were favourable. High populations were recorded in autumn, then declined with decrease in soil temperatures during winter. Correlation coefficients indicated a highly significant relationship between pathogen populations and soil temperatures from autumn to early summer, and between soil moisture and pathogen population for summer and autumn, in the Brisbane Ranges independently of site. The same pattern was evident in wetter forests at Narbethong and savannah woodlands at Wilson's Promontory, although results were not significant. Disease was evident wherever the pathogen occurred among susceptible hosts. The savannah woodland, the dry shrubby sclerophyll forest and the wetter sclerophyll forest all contained susceptible dominants; consequently disease was associated with changes in the forest community such as early death of the understorey, later die-back and death of the trees, and an increase in sedges and in bare ground. Symptoms and deaths increased with time from invasion. The severity of disease and its rate of extension, apart from spread by free water, were associated with environmental factors such as shallow soil, poor drainage and low soil water-holding capacity. These were characteristic of the Brisbane Ranges, where destruction of the forest community was severe and the rate of disease extension rapid. 
In the deep krasnozem at Narbethong and the deep sands of Wilson's Promontory, destruction was confined to the most susceptible hosts, disease extension was continuous but slow, and deaths occurred in a mosaic throughout the infected zone.


Weed Science · 1994 · Vol 42 (4) · pp. 561-567 · Author(s): Charles A. King, Lawrence R. Oliver

Experiments were conducted to evaluate the influence of temperature and water potential on water uptake, germination, and emergence of large crabgrass in order to predict emergence in the field. Water uptake of seed soaked in polyethylene glycol solutions of 0 to −1400 kPa underwent an initial imbibition phase followed by a lag phase and subsequent increase in water content when radicles emerged from the seed. Maximum germination at 15 C was 12% at 0 kPa and 60% at 25 C at 0 to −200 kPa osmotic potential. In the growth chamber, large crabgrass emergence from soil began 2 to 3 d after planting at 30 or 35 C and within 9 to 10 d at 15 C. Maximum emergence of 77% occurred at 25 C and at a soil water potential of −30 kPa. Emergence percentage decreased as water potential decreased or as temperature increased or decreased. A logistic equation described emergence of large crabgrass at each combination of temperature and soil water potential at which emergence occurred, and a predictive model was developed and validated by field data. In the field, there was little or no emergence at soil temperatures below 15 C or water potentials below −50 to −60 kPa. The model predicted the time of onset of large crabgrass emergence and the time to reach maximum emergence to within 2 to 4 d of that recorded in field experiments. The model also predicted the correct number of flushes of emergence occurring in the field in three of four experiments.
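The abstract describes fitting a logistic equation to cumulative emergence at each temperature × water-potential combination. The paper's fitted parameters are not given here, so the sketch below uses illustrative values only; the function names and the parameters e_max, k, and t_mid are placeholders, not the authors' notation.

```python
import math

def logistic_emergence(t_days, e_max, k, t_mid):
    """Cumulative emergence (%) at t_days after planting.

    e_max : asymptotic maximum emergence (%)
    k     : rate constant (1/day)
    t_mid : day at which half of e_max is reached
    """
    return e_max / (1.0 + math.exp(-k * (t_days - t_mid)))

# Illustrative parameters (not the paper's fitted values); 25 C at
# -30 kPa gave the reported maximum emergence of 77%.
for day in (2, 5, 10, 20):
    print(day, round(logistic_emergence(day, 77.0, 0.8, 5.0), 1))
```

At t_mid the curve passes through exactly half of e_max, which is the usual check that a logistic fit is parameterized correctly.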


1979 · Vol 59 (3) · pp. 259-264 · Author(s): R. DE JONG, K. F. BEST

Daily emergence counts were made on Canthatch wheat (Triticum aestivum L.) grown in five soil types, at four soil temperatures and three water potentials and planted at five different depths. Regardless of soil type, soil water potential or depth of planting, 50% emergence generally occurred within a week at 19.4 and 26.7 °C, and within 2 wk at 12.2 °C, but it took up to 6 wk at 5 °C. The heat sum required to attain 50% seedling emergence did not increase significantly with decreasing soil water potentials, but the minimum temperature for emergence dropped from 1.3 to 0.2 °C as the water potential decreased from −⅓ to −10 bar. It was suggested that the seedlings compensated for the increased water stress by lowering their minimum temperature requirements. Increasing the planting depth not only increased the heat requirement for emergence, but it also increased the variability of emergence, especially at low temperatures. Practical aspects concerning planting dates and depths were considered.


1994 · Vol 119 (2) · pp. 216-222 · Author(s): Ian A. Merwin, Warren C. Stiles, Harold M. van Es

This study was conducted to compare various orchard groundcover management systems (GMSs)—including a crownvetch “living mulch” (CNVCH), close-mowed (MWSOD) and chemically growth-regulated (GRSOD) sodgrasses, pre-emergence (NDPQT) and two widths of post-emergence (GLY1.5 and GLY2.5) herbicides, hay-straw mulch (STMCH), and monthly rototillage (tilled)—during the first 6 years in a newly established apple (Malus domestica Borkh.) planting. Mean soil water potential at 5 to 35 cm deep varied substantially among treatments each summer, and treatment × year interactions were observed. During most growing seasons from 1986 to 1991, soil water availability trends were STMCH > NDPQT > GLY2.5 > GLY1.5 > tilled > GRSOD > MWSOD > CNVCH. Soil organic matter content increased under STMCH, CNVCH, and MWSOD and decreased under NDPQT and tilled treatments. Water infiltration and saturated hydraulic conductivity after 4 years were lower under NDPQT and tilled, and soil under STMCH and GRSOD retained more water per unit volume at applied pressures approximating field water capacity. Mid-summer soil temperatures at 5 cm deep were highest (25 to 28C) in tilled and NDPQT plots, intermediate (22 to 24C) under GRSOD, and lowest (16 to 20C) under CNVCH and STMCH. These observations indicate that long-term soil fertility and orchard productivity may be diminished under pre-emergence herbicides and mechanical cultivation in comparison with certain other GMSs.


1998 · Vol 8 (2) · pp. 201-210 · Author(s): Frank Forcella

Computer software called WeedCast was developed to simulate weed seed dormancy, timing of seedling emergence, and seedling height growth in crop environments in real time, using actual or forecast weather data. Weather data include daily rainfall and minimum and maximum air temperatures. Air temperatures are converted to average daily soil temperature at 5-cm soil depth using a series of equations that are specific for soil type, tillage system and previous year's crop-residue type. Daily rainfall and soil temperature estimates are combined to determine soil water potential (in megapascals) at 5-cm depth. Daily estimated soil water potential or soil temperatures are matched to empirically derived threshold values that induce secondary dormancy in seeds of certain species. Soil growing degree days (GDD), calculated from soil temperatures, are used to project maximum emergence rates of weed seedlings. Emergence ceases on days when soil water potential falls below threshold values specific to each species. GDD based on air temperatures are used to estimate post-emergence seedling height growth. All three types of simulation provide information that allows users to answer important weed management questions in real time. These types of questions include but are not limited to the following: (1) Are soil-applied treatments necessary? (2) How late can pre-emergence herbicides be applied? (3) When should mechanical control be implemented? (4) When should field-scouting commence and end? (5) When should post-emergence herbicides be applied?
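The core loop the abstract describes, accumulating soil GDD toward an emergence target while halting accumulation on days too dry for emergence, can be sketched as below. The base temperature, the water-potential threshold, and the function name are illustrative assumptions, not WeedCast's calibrated species values or its actual equations.

```python
def accumulate_gdd(soil_temps_c, water_potentials_mpa,
                   base_temp_c=10.0, psi_threshold_mpa=-0.6):
    """Accumulate soil growing degree days (GDD), skipping days on
    which soil water potential falls below the species threshold.
    All parameter values here are illustrative, not WeedCast's."""
    gdd = 0.0
    daily = []
    for temp, psi in zip(soil_temps_c, water_potentials_mpa):
        if psi >= psi_threshold_mpa:      # moist enough for emergence
            gdd += max(0.0, temp - base_temp_c)
        daily.append(gdd)                 # cumulative GDD each day
    return daily

temps = [12, 18, 22, 25, 8]               # daily soil temperature, C
psis  = [-0.1, -0.2, -0.9, -0.3, -0.1]    # MPa; day 3 is too dry
print(accumulate_gdd(temps, psis))
# → [2.0, 10.0, 10.0, 25.0, 25.0]; the dry day contributes nothing
```

A species' projected emergence date would then be the first day on which the cumulative GDD crosses an empirically derived target.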


1979 · Vol 27 (1) · pp. 1 · Author(s): G Weste, K Vithanage

Chlamydospore survival was investigated for six soil types, collected from disease-free areas of native forest in Victoria, in 50-g packs of non-sterile, unamended soils and gravels at five different matric soil water potentials (ψ). No chlamydospores survived in gravel free from organic matter, and only one chlamydospore survived at ψ = −3000 kPa. In other packs the numbers of chlamydospores declined for 2 months then increased markedly at 4-6 months. Many chlamydospores remained viable for 8 months and some for 10 months despite the use of non-sterile soil and the absence of hosts. Maximum numbers survived in gravel from the Brisbane Ranges 6 and 8 months after inoculation at ψ = −500 kPa. Decreasing soil moisture appeared to stimulate chlamydospore formation, while a low organic matter content and small numbers of microorganisms increased survival.


2017 · Vol 47 (5) · pp. 492-496 · Author(s): Katalin Virág, Tibor András Nyári

Aims: Despite decreasing trends, Hungary is the leader in cancer mortality among European countries. We examined the seasonal variation of cancer mortality in Hungary between 1984 and 2013. Methods: Hungarian monthly cancer mortality and population data were used in the analysis. The Walter–Elwood method was used to determine seasonal variation in both mortality rates and proportionate mortality. Results: Significant winter-peak seasonality was found in all-cancer mortality. A similar seasonal trend with a peak from November to January was observed in death rates from colorectal, lung, female breast, prostate, bladder, brain, lymphoid and hematopoietic cancers. However, no cyclical variation was identified in the mortality rates from the other cancers examined. In addition, significant seasonal variation in proportionate mortality was shown for all cancer sites examined, with a peak in August or September. Conclusions: This study presents the seasonality pattern of different types of cancer mortality, which might be related to environmental factors (e.g. infections).


1977 · Vol 25 (4) · pp. 377 · Author(s): G Weste, K Vithanage

Microbial populations of three forest soils were assayed by a dilution plate procedure and compared with garden soil. The forest soils were selected from areas subjected to die-back disease caused by Phytophthora cinnamomi Rands, and were from sites for which pathogen populations, soil temperatures, rainfall and soil water potentials were concurrently recorded. Forest soils showed low microbial populations compared with garden soil. This was associated with low organic content, low nitrogen status and poor water-holding capacity. Areas with severe disease and rapid disease extension had a small soil microbial population, particularly of actinomycetes, compared with soil from areas with moderate disease and slow disease extension. Microbial populations were lowest in spring and autumn, when P. cinnamomi was most active and zoospore production, dispersal and infection were maximal. Microbial populations of forest soil were reduced following die-back; the reduction was highly significant (P < 0.01) for the Brisbane Ranges, where plant mortality was high and the percentage of bare ground increased.


1989 · Vol 19 (11) · pp. 1359-1370 · Author(s): John H. Bassman

The effects of mounding and soil scarification on microclimate, water relations, photosynthesis, and growth of planted Picea engelmannii × glauca seedlings were evaluated over three growing seasons. Mounding increased soil temperatures by up to 40% at depths of 5 and 12 cm, but not at 30 cm, during periods of dry, clear weather. Scarification resulted in small increases in soil temperature only at the 5 cm depth. Soil water potential and soil water content were lower in mounds, but similar in scarified patches and controls from midsummer through fall. Transpiration, leaf conductance, and xylem pressure potentials were generally reduced by mounding and to a much lesser extent by scarification compared with controls. However, these responses were complicated by interactions with leaf to air vapor density differences and possibly by soil temperatures. Treatments had no significant effects on diurnal or light responses of photosynthesis. Mounding increased stem and needle weights during the first half of the growing season in the first 2 years after planting, but growth was reduced later in the season, probably as a result of increased water stress. Root growth in mounds was significantly greater than in scarified patches and controls in all 3 years. Growth patterns for seedlings planted in scarified patches and control treatments were similar to each other. By the end of the third growing season after planting, seedlings in mound treatments had greater stem diameters and total seedling weight was more than double that of controls, but there was little difference in height. Seedlings in scarified patches were similar to controls in diameter and height, but had slightly greater total weights. Results suggest that the positive effects of improved soil temperatures and root growth in mound treatments were negated to a large extent by increased water stress in the first two seasons.
By the third growing season, roots were beyond significant drying influence of the mound and their greater length and mass served to increase seedling biomass substantially.

