Growth and Carbon Partitioning in Rainforest and Eucalypt Forest Species of South Coastal New South Wales, Australia

1992 ◽  
Vol 40 (1) ◽  
pp. 13 ◽  
Author(s):  
DJ Barrett ◽  
JE Ash

Rainforest, ecotone and eucalypt forest species were grown for 22 weeks in glasshouse conditions under light, water and nutrient treatments. Plant biomass, leaf area and leaf biomass per plant increased in Eucalyptus sieberi, Eucalyptus fastigata, Pittosporum undulatum, Callicoma serratifolia, Elaeocarpus reticulatus, Backhousia myrtifolia and Ceratopetalum apetalum at high irradiance (1230-1670 μmol PAR m⁻² s⁻¹). Both E. sieberi and E. fastigata inhabit the relatively high light environments of northern aspects, upper southern aspects and ridge tops in the gully systems of south coastal New South Wales. Callicoma serratifolia, P. undulatum and E. reticulatus are pioneer species of the ecotone around rainforest patches, and B. myrtifolia and C. apetalum are rainforest canopy species. Mean plant biomass under high irradiance was ranked: eucalypt species > ecotone species and B. myrtifolia > C. apetalum. At low irradiance (200-530 μmol PAR m⁻² s⁻¹) this trend was reversed: rainforest canopy and ecotone species produced the greater plant biomass. Plant response to different water and nutrient treatments under glasshouse conditions showed that, while the light environment primarily governed plant response, interactions between treatments occurred, resulting in maximum plant biomass at relatively high levels of soil moisture and nutrients. Carbon partitioning was used as an indication of relative response to light treatments. The proportion of plant mass partitioned to leaves did not change between experimental treatments. The magnitude of the response of leaf area ratio and specific leaf weight to light treatment, however, was ranked: eucalypt species > ecotone species > rainforest canopy species. This suggested that species naturally growing outside the rainforest canopy maximised leaf area in proportion to plant mass for a given irradiance, presumably to maintain high growth rates.
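
Leaf area ratio (LAR) and specific leaf weight (SLW) are standard allometric indices, and the abstract does not spell out the calculations; the following is a minimal sketch of the conventional definitions (LAR = leaf area / plant dry mass, SLW = leaf dry mass / leaf area, LWR = leaf dry mass / plant dry mass), with all numeric values purely illustrative:

```python
# Conventional carbon-partitioning indices (illustrative values, not the study's data).
def leaf_area_ratio(leaf_area_cm2: float, plant_dry_mass_g: float) -> float:
    """LAR: leaf area displayed per unit of total plant dry mass (cm2/g)."""
    return leaf_area_cm2 / plant_dry_mass_g

def specific_leaf_weight(leaf_dry_mass_g: float, leaf_area_cm2: float) -> float:
    """SLW: leaf dry mass per unit leaf area (g/cm2); the inverse of specific leaf area."""
    return leaf_dry_mass_g / leaf_area_cm2

def leaf_weight_ratio(leaf_dry_mass_g: float, plant_dry_mass_g: float) -> float:
    """LWR: proportion of plant mass partitioned to leaves (g/g)."""
    return leaf_dry_mass_g / plant_dry_mass_g

# A hypothetical high-light seedling: 400 cm2 of leaf, 2.5 g leaf mass, 6.0 g total mass.
print(leaf_area_ratio(400, 6.0))        # ~66.7 cm2/g
print(specific_leaf_weight(2.5, 400))   # 0.00625 g/cm2
print(leaf_weight_ratio(2.5, 6.0))      # ~0.417 g/g
```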

1984 ◽  
Vol 24 (125) ◽  
pp. 236 ◽
Author(s):  
GK McDonald ◽  
BG Sutton ◽  
FW Ellison

Three winter cereals (wheat varieties Songlen and WW 15, triticale variety Satu) were grown after cotton or summer fallow under three levels of applied nitrogen (0, 100 and 200 kg N/ha) at Narrabri, New South Wales. The cereals were sown on August 7, 1980 and growing season rainfall was supplemented by a single irrigation. Leaf area, total shoot dry matter production and ears per square metre were lower after cotton than after summer fallow, while grain yields of cereals sown immediately after cotton were 33% lower than those sown after fallow. Adding nitrogen increased leaf area, dry matter and grain yields of crops grown after cotton and fallow, but significant increases were not obtained with more than 100 kg/ha of applied nitrogen. Crops grown after cotton required an application of 100 kg N/ha for leaf and dry matter production at anthesis to equal that of crops grown after fallow with no additional nitrogen. The corresponding cost to grain yield of growing cotton was equivalent to 200 kg N/ha. The low grain yield responses measured in this experiment (18 and 10% increase to 100 kg N/ha after cotton and fallow, respectively) were attributed to the combined effects of late sowing, low levels of soil moisture and loss, by denitrification, of some of the applied nitrogen. The triticale, Satu, yielded significantly less than the two wheats (199 g/m2 for Satu cf. 255 and 286 g/m2 for Songlen and WW 15, respectively), and did not appear to be a viable alternative to wheat in a cotton rotation.
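
The quoted yield responses are simple relative increases; the sketch below shows the obvious percentage-change reading of the abstract (the formula and the yield figures are assumptions for illustration, not reported data):

```python
# Relative yield response to applied N (a sketch; the percentage-change formula is
# the natural reading of the abstract, not a method stated by the authors).
def percent_response(yield_fertilised: float, yield_unfertilised: float) -> float:
    """Percentage increase in grain yield from fertiliser application."""
    return 100 * (yield_fertilised - yield_unfertilised) / yield_unfertilised

# Hypothetical yields (g/m2) chosen only to reproduce the reported 18% response
# to 100 kg N/ha after cotton; they are not data from the experiment.
print(percent_response(236, 200))  # 18.0
```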


1989 ◽  
Vol 29 (1) ◽  
pp. 51 ◽  
Author(s):  
DC McKenzie ◽  
HB So

The effect of gypsum on the properties and crop productivity of 6 contrasting vertisols of the Gwydir Valley, New South Wales, was investigated in 1978 and 1979. These soils are often used for dryland wheat production, although crop growth is generally restricted by their structural instability. In 2 of the soils used in our study, the surface aggregates were sodic and dispersive (poor soils), 2 were relatively stable when wetted (good soils), whilst the other 2 soils had surface aggregates that were intermediate in behaviour (intermediate soils). The effects of added gypsum at 4 rates (0, 2.5, 5.0 and 7.5 t ha⁻¹) on soil water profiles, soil properties and the growth of wheat were monitored over a 2 year period. Dryland wheat grain yields were increased by as much as 230% following the application of gypsum. Benefits were most pronounced on clays with sodic topsoils, a high water-holding capacity and adequate nutrition; plant response to gypsum on nearby soils with non-dispersive surfaces was less pronounced. Yield increases were associated with better seedling establishment, greater tiller production, increased grain weight, and a lower incidence of crown rot disease. Plant response to gypsum was related to improved water penetration into the soil, allowing greater storage of water in the subsoil rather than loss via evaporation and, possibly, runoff. Increases as high as 137% in soil water storage to a depth of 1.2 m were observed. Crop performance was also strongly influenced by rainfall, time of sowing and weed control. Where nitrogen and, to a lesser extent, phosphorus were deficient in gypsum-treated soil, they had to be added before the extra soil water could be utilised effectively by wheat. On the lighter textured clays, gypsum appeared to aggravate nitrogen deficiency, apparently because of enhanced leaching.
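
Profile water storage figures such as the 137% increase to 1.2 m are conventionally computed by summing volumetric water content over depth layers; a minimal sketch of that standard calculation follows (the layer water contents are invented, chosen only to give an increase of the reported order):

```python
# Soil water storage over a profile: sum of volumetric water content x layer thickness.
# This is the standard layer-sum approach, not a method quoted from the paper.
def profile_storage_mm(theta_v: list[float], layer_thickness_m: list[float]) -> float:
    """Stored water (mm) given volumetric water contents (m3/m3) per layer."""
    return sum(t * dz * 1000 for t, dz in zip(theta_v, layer_thickness_m))

# Hypothetical 1.2 m profile in four 0.3 m layers, untreated vs gypsum-treated.
untreated = profile_storage_mm([0.10, 0.12, 0.15, 0.15], [0.3] * 4)
gypsum    = profile_storage_mm([0.28, 0.30, 0.33, 0.33], [0.3] * 4)
print(100 * (gypsum - untreated) / untreated)  # ~138%, of the order reported
```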


1992 ◽  
Vol 43 (1) ◽  
pp. 29 ◽  
Author(s):  
PJ Ellison ◽  
GM Murray

Development of stripe rust was observed on wheat cultivars that differed in reaction to the disease at the post-booting stage of growth over 4 years (1984-1987) at Yanco and Wagga Wagga in southern New South Wales. In 1984, the epidemic began in August and the disease affected up to 20% of leaf area by the booting stage. The disease then ceased to develop in cultivars with moderately resistant or resistant adult plant reaction (APR) to stripe rust, but in susceptible wheats up to 82% of leaf area was affected by the early milk stage of growth. The early onset in 1984 was associated with the highest rain in the previous summer-autumn (January-April) of the 4 years. In the other 3 years, the epidemics began later. Stripe rust did not develop on cultivars with resistant APR, but it affected up to 97% of leaf area of the highly susceptible cultivar Avocet by early milk. The disease was more severe on later sown than early sown Avocet. The apparent rates of infection both before and after booting ranged from 0.02 to 0.41 per day. In each experiment, the rate was lower on wheats with higher levels of APR, while in 1984 the rate on all cultivars decreased from the pre-booting to the post-booting stage of crop growth. After booting, the apparent rates of infection on susceptible and moderately susceptible cultivars were positively correlated with the mean temperature during the period over which the rate was calculated, for the range 12.9-16.2°C. Over this range, the apparent rate of infection of susceptible wheats increased at 0.095 per day per °C while that of moderately susceptible wheats increased at 0.045 per day per °C. From 16.2 to 20.3°C the rate for susceptible wheats was negatively correlated with the mean temperature, and declined at 0.043 per day per °C. There was no significant relationship between apparent rate of infection and temperature for moderately resistant wheats after booting, or for rates before booting in 1984. Development of wheat, measured on the Zadoks scale, was linear from first appearance of the flag leaf (GS 37) to mid milk (GS 75) at both sites over the 4 years.
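
The abstract does not define "apparent rate of infection", but in rust epidemiology this usually denotes Vanderplank's logistic apparent infection rate; the sketch below assumes that definition, and the disease proportions used are illustrative:

```python
import math

# Vanderplank's apparent infection rate (per day), assuming logistic disease progress.
# The abstract does not state the formula; this is the standard definition in
# plant-disease epidemiology, applied to proportions of leaf area affected.
def apparent_infection_rate(x1: float, x2: float, t1: float, t2: float) -> float:
    """r = [ln(x2/(1-x2)) - ln(x1/(1-x1))] / (t2 - t1), with x a disease proportion."""
    logit = lambda x: math.log(x / (1 - x))
    return (logit(x2) - logit(x1)) / (t2 - t1)

# Illustrative: 20% leaf area affected at booting rising to 82% some 35 days later.
print(apparent_infection_rate(0.20, 0.82, 0, 35))  # ~0.083 per day
```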


1990 ◽  
Vol 38 (1) ◽  
pp. 1 ◽  
Author(s):  
PG Kodela

The modern pollen spectra of Eucalyptus forest and rainforest communities were investigated at 19 sites in the Robertson area on the Central Tablelands of New South Wales. Cluster and discriminant analyses were used to analyse pollen distribution from within and from outside warm temperate rainforest stands and tall open eucalypt forest stands. Pollen abundance was compared with a number of plant abundance estimates of taxa within forests to study pollen representation at the forest scale. Pollen of Doryphora, Polyosma, Pittosporum, Hymenanthera, Tasmannia, Asclepiadaceae and most rainforest taxa investigated is poorly represented, while sclerophyll and open-ground taxa, particularly Eucalyptus, are better represented. The pollen of many native taxa does not appear to be well dispersed, and local pollen is commonly outweighed by pollen from regional sources. Pollen representation varied between taxa and sites, with factors such as vegetation structure, plant distribution, topography and disturbance influencing representation.
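
Pollen representation of this kind is often quantified with a Davis-style R-value (pollen percentage divided by plant abundance percentage); the index choice here is an assumption, since the abstract does not name one, and all values are invented:

```python
# Davis-style representation index: R = pollen percentage / plant abundance percentage.
# R << 1 indicates under-representation (e.g. Doryphora), R >> 1 over-representation
# (e.g. Eucalyptus). The index choice is an assumption; the values are invented.
def r_value(pollen_pct: float, plant_pct: float) -> float:
    return pollen_pct / plant_pct

print(r_value(pollen_pct=2.0, plant_pct=30.0))   # ~0.07: strongly under-represented
print(r_value(pollen_pct=45.0, plant_pct=15.0))  # 3.0: over-represented
```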


Soil Research ◽  
1997 ◽  
Vol 35 (4) ◽  
pp. 863 ◽  
Author(s):  
I. P. Little

Red gradational soils at Batlow, in New South Wales, which are used for apple growing, have acid subsoils with exchangeable aluminium (Al) frequently in excess of exchangeable calcium (Ca). There is often inadequate Ca in the fruit cortex of post-harvest apples to maintain good fruit quality, and this can lead to losses in cool-store. It is possible that Al in these acid subsoils has interfered with Ca uptake by the trees. The excessive use of nitrogenous fertilisers leads to soil acidity, and it was thought likely that this was exacerbating the subsoil acidity common in the district. In October 1992, soil analysis detected considerable ammonium in the surface 0·3 m at orchard sites at Batlow monitored for mineral nitrogen (N). This probably came from heavy spring dressings of fertiliser. One site examined in detail showed that about half of the ammonium had disappeared by January 1993, but a large nitrate envelope appeared with a peak at 0·6 m, which in turn disappeared by April that year. This establishes that heavy applications of ammonium are nitrified, leached into the subsoil, and lost. Under such a high N regime, orchard soil profiles should be more acid than adjacent forest soils. However, it was found that the acidity of the surface soil was less, and the exchangeable Ca greater, in the orchard soils compared with soil profiles in the adjacent eucalypt forest, although amelioration of the subsoils had not occurred. Samples taken from representative sites at Batlow, at the 0–0·1, 0·1–0·2, and 0·3–0·4 m depths, were dosed with ammonium sulfate and leached with water in the laboratory for 23 days in a free-draining environment. Nitrate and ammonium were determined in the leachates. At the end of the experiment, the pH and exchangeable Ca, Mg, and Mn were determined in the leached samples. Only the neutral surface soils were able to nitrify ammonium effectively, and nitrification was positively correlated with pH, and with exchangeable Ca and Mg. From this it is argued that ammonium added as ammonium sulfate or urea will be nitrified in the surface soil, and the acidity produced will be neutralised, provided the fertiliser is accompanied by an adequate dressing of lime. Ammonium tends to remain in the surface soil, but if leached, it will not be nitrified in the subsoil. Nitrate leached into the subsoil will not be acid-forming but, if denitrified, may help to reduce acidity. For this work, the soil pH was measured in 1 M KCl. So that readers can relate this to the pH in 0·01 M CaCl2, a relationship was established between the two measures.
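
The conversion between the two pH measures is typically an ordinary least-squares calibration on paired determinations; the abstract reports no coefficients, so the paired readings in this sketch are invented purely to show the calculation:

```python
# Linear calibration between pH(1 M KCl) and pH(0.01 M CaCl2), fitted by least squares.
# The paper established such a relationship; the paired readings below are invented
# to illustrate the fit and imply nothing about the actual coefficients.
from statistics import linear_regression

ph_kcl   = [3.9, 4.3, 4.8, 5.2, 5.9, 6.4]
ph_cacl2 = [4.3, 4.6, 5.1, 5.6, 6.2, 6.8]

slope, intercept = linear_regression(ph_kcl, ph_cacl2)
print(f"pH(CaCl2) ≈ {slope:.2f} * pH(KCl) + {intercept:.2f}")
```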


2010 ◽  
Vol 16 (3) ◽  
pp. 209 ◽
Author(s):  
Harry Parnaby ◽  
Daniel Lunney ◽  
Ian Shannon ◽  
Mike Fleming

Hollows in trees are recognized as a critical and threatened resource for a wide range of fauna in Australian forests and woodlands, yet little data are available on the impact of fire on hollow-bearing trees. We report an opportunistic, post-fire assessment of the proportion of burnt, hollow-bearing trees that collapsed in stands near roads following low intensity prescription burns in three areas of mixed eucalypt forest in the Pilliga forests. Mean collapse rates on 29 plots (40 by 50 m), separated by burn area, ranged from 14-26% for a total of 329 burnt hollow-bearing trees. Collapse rates on individual plots ranged from 0-50%. Collapsed hollow-bearing trees were predominantly older, with 40% of senescent trees and 44% of live stags collapsing. The best predictor in models of tree collapse was the presence of a basal fire entry point. We cannot determine the extent to which collapse rates on our plots are representative of burnt areas away from containment roads, due to sampling limitations, but they appear to be higher than those reported from wildfire and more intense prescription burns in southern Australia. Our results point to an urgent need for comprehensively designed studies to address the impacts of prescribed burns on hollow-bearing trees.
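
The abstract reports "models of tree collapse" without naming the model form; a binary collapse outcome with basal fire entry as a predictor suggests something like the proportion and odds-ratio sketch below, in which all counts except the 329-tree total are invented:

```python
# Collapse proportion and a simple odds ratio for the basal-fire-entry predictor.
# A sketch only: the paper fitted formal models; the counts below are invented.
def collapse_rate(collapsed: int, total: int) -> float:
    return collapsed / total

def odds_ratio(c_with: int, n_with: int, c_without: int, n_without: int) -> float:
    """Odds of collapse for trees with a basal fire entry point vs without."""
    odds_with = c_with / (n_with - c_with)
    odds_without = c_without / (n_without - c_without)
    return odds_with / odds_without

print(collapse_rate(66, 329))        # ~0.20, within the 14-26% range reported
print(odds_ratio(50, 120, 16, 209))  # >1 indicates basal scars raise collapse odds
```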


2018 ◽  
Vol 69 (9) ◽  
pp. 915 ◽  
Author(s):  
Jianhua Mo ◽  
Sandra McDougall ◽  
Sarah Beaumont ◽  
Scott Munro ◽  
Mark M. Stevens

Early-season leaf loss due to damage by thrips (Thysanoptera: Thripidae) is considered an important issue by Australian cotton growers. To understand the potential impact of early-season leaf loss in the southern region of New South Wales, we investigated the effects of artificial defoliation on cotton growth, maturity timing and lint yield over four seasons (2013–14 to 2016–17) in commercial cotton crops in the Riverina district. Four defoliation scenarios were investigated: (i) complete defoliation, 100% removal of all true leaves from all plants; (ii) partial defoliation by plant, 100% removal of all true leaves from 75% of plants; (iii) partial defoliation by leaf, removal of 75% of leaf area from all individual true leaves on all plants; and (iv) no defoliation. Defoliation was done by hand at the onset of the 2-, 4- and 6-node growth stages. Defoliated plants were initially shorter than undefoliated (control) plants, but by ~100 days after seedling emergence, height differences were no longer statistically significant in two of the four seasons. Defoliation did not affect the total number of bolls shortly before harvest. However, complete defoliation delayed crop maturity by up to 18 days, and partial defoliation by plant delayed crop maturity by up to 8 days. Because of these delays, fully defoliated plants often had fewer open bolls shortly before harvest and yielded significantly less than undefoliated plants in three of the four seasons. A laboratory experiment with caged cotton seedlings showed that weekly introductions of up to 10 thrips per seedling (predominantly onion thrips, Thrips tabaci, the most abundant species on cotton in the region) caused significant clubbing in true leaves, but total leaf area was not significantly reduced at the 6-node stage. Implications of the results for southern cotton integrated pest management are discussed.
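
Scenarios (ii) and (iii) both remove 75% of a plot's total leaf area, differing only in how the loss is distributed; the sketch below makes that equivalence explicit under the simplifying assumption of uniform leaf area per plant (all numbers invented):

```python
# Total canopy leaf area removed under the two partial-defoliation scenarios.
# Both remove 75% of plot leaf area; they differ only in how the loss is spread.
# Numbers are illustrative, not measurements from the trials.
n_plants, leaf_area_per_plant = 100, 200.0  # plants per plot, cm2 per plant

# (ii) Partial defoliation by plant: all leaves removed from 75% of plants.
removed_by_plant = 0.75 * n_plants * leaf_area_per_plant

# (iii) Partial defoliation by leaf: 75% of each leaf removed on every plant.
removed_by_leaf = n_plants * 0.75 * leaf_area_per_plant

print(removed_by_plant == removed_by_leaf)  # True: equal total loss, different pattern
```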

