Assessment of erosion rates from microphyte-dominated calcareous soils under rain-impacted flow

Soil Research ◽  
1997 ◽  
Vol 35 (3) ◽  
pp. 475 ◽  
Author(s):  
D. J. Eldridge ◽  
P. I. A. Kinnell

Intact soil monoliths with surfaces of varying microphytic crust cover were collected from a calcareous earth soil in a semi-arid belah–rosewood woodland near Wentworth in south-western New South Wales. Monoliths were tested for their susceptibility to erosion by rain-impacted flow using a laboratory rainfall simulator. The erosive stress applied to each surface was controlled by varying the flow depth between 4 and 8 mm whilst maintaining a flow velocity of 25 mm/s using 2·7 mm raindrops falling 11·2 m at average rainfall intensities of 65 mm/h. Increasing the cover of microphytic crusts on the surface resulted in a significant (P = 0·001) reduction in sediment concentration. A linear model incorporating percentage cover and distribution of cover accounted for 46% of the variance in soil erosion. A significant relationship was also found between the coarse fraction (>0·053 mm) and crust cover (P = 0·012) at the 4-mm depth. Management practices such as overgrazing, trampling, and fire, which reduce the cover of crusts in this landscape, will lead to increased erosion hazard.
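The two-predictor linear model mentioned above (percentage crust cover plus a cover-distribution term, accounting for 46% of the variance in erosion) is an ordinary least-squares fit, and the variance accounted for is R². A minimal pure-Python sketch of that calculation follows; the data values are invented for illustration and stand in for the monolith measurements, and `ols_r2`/`solve3` are hypothetical helper names, not code from the study.

```python
# Sketch of a two-predictor ordinary least-squares fit of the kind
# described in the abstract (sediment concentration regressed on crust
# cover and a cover-distribution term). All data values are invented.

def solve3(M, v):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    M = [row[:] + [v[i]] for i, row in enumerate(M)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * b for a, b in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]

def ols_r2(X, y):
    """Fit y = b0 + b1*x1 + b2*x2 by least squares; return R^2."""
    n = len(y)
    A = [[1.0, x1, x2] for x1, x2 in X]          # design matrix with intercept
    # Normal equations: (A^T A) b = A^T y
    ATA = [[sum(A[i][r] * A[i][c] for i in range(n)) for c in range(3)]
           for r in range(3)]
    ATy = [sum(A[i][r] * y[i] for i in range(n)) for r in range(3)]
    b = solve3(ATA, ATy)
    yhat = [sum(b[j] * A[i][j] for j in range(3)) for i in range(n)]
    ybar = sum(y) / n
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Invented example: crust cover (%), a cover-distribution score, and
# observed sediment concentration (g/L) for seven hypothetical monoliths.
cover = [5, 20, 35, 50, 65, 80, 95]
dist = [1, 2, 2, 3, 3, 4, 4]
sed = [9.1, 7.8, 6.9, 5.2, 4.6, 3.1, 2.2]
r2 = ols_r2(list(zip(cover, dist)), sed)
```

In the invented data sediment concentration falls almost linearly with cover, so the fitted R² is high; in the study the same statistic came out at 0.46.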

Soil Research ◽  
2010 ◽  
Vol 48 (3) ◽  
pp. 248 ◽  
Author(s):  
Matthew Miklos ◽  
Michael G. Short ◽  
Alex B. McBratney ◽  
Budiman Minasny

The reliable assessment of soil carbon stock is of key importance for soil conservation and for mitigation strategies aimed at reducing atmospheric carbon. Measuring and monitoring soil carbon is complex because carbon pools cycle and rates of carbon sequestration vary across the landscape with climate, soil type, and management practices. A new methodology has been developed and applied to assess the distribution of total, organic, and inorganic carbon at high spatial resolution on a grains research and grazing property in northern New South Wales. In this study, baseline soil carbon maps were created using fine-resolution, geo-referenced proximal-sensor data. Coupled with a digital elevation model and secondary terrain attributes, all of the data layers were combined by k-means clustering to develop a stratified random soil sampling scheme for the survey area. Soil samples taken at 0.15-m increments to a depth of 1 m were scanned with a mid-infrared spectrometer, which was calibrated against a proportion of the samples analysed in a laboratory for total and inorganic carbon content. This combination of new methodologies and technologies can provide the large volumes of reliable, fine-resolution, timely data required for baseline assessment, mapping, monitoring, and verification, and could make farm-scale soil carbon management and trading possible by quantifying carbon stock to a depth of 1 m at high spatial resolution.
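The sampling design described above (combine data layers, cluster them with k-means, then draw a random sample within each cluster) can be sketched in a few lines. This is a hedged illustration, not the study's code: a single invented covariate stands in for the combined proximal-sensor and terrain layers, and `kmeans_1d` and `stratified_sample` are hypothetical helper names.

```python
import random

def kmeans_1d(values, k, iters=50, seed=0):
    """Cluster scalar values into k groups; return a label per value."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)   # initialise centres from the data
    labels = [0] * len(values)
    for _ in range(iters):
        # Assignment step: each value joins its nearest centre.
        labels = [min(range(k), key=lambda j: abs(v - centers[j]))
                  for v in values]
        # Update step: each centre moves to its cluster mean.
        for j in range(k):
            members = [v for v, l in zip(values, labels) if l == j]
            if members:
                centers[j] = sum(members) / len(members)
    return labels

def stratified_sample(labels, n_per_stratum, seed=0):
    """Pick up to n random site indices from each k-means stratum."""
    rng = random.Random(seed)
    sample = []
    for stratum in sorted(set(labels)):
        idx = [i for i, l in enumerate(labels) if l == stratum]
        sample.extend(rng.sample(idx, min(n_per_stratum, len(idx))))
    return sample

# Invented covariate values for 30 hypothetical grid cells.
covariate = [cell * 0.1 for cell in range(30)]
labels = kmeans_1d(covariate, k=3)
sites = stratified_sample(labels, n_per_stratum=2)
```

Stratifying on the clusters rather than sampling the field uniformly is what lets a modest number of laboratory samples calibrate the spectrometer across the full range of landscape variation.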


Soil Research ◽  
1990 ◽  
Vol 28 (4) ◽  
pp. 539 ◽  
Author(s):  
CJ Chartres ◽  
RW Cumming ◽  
JA Beattie ◽  
GM Bowman ◽  
JT Wood

Samples were collected from unimproved road reserves and adjacent paddocks on a 90 km transect crossing red-brown earth soils in the west and red earth soils in the east. Measurements of pH in water and CaCl2 indicated that the red earths have been acidified by approximately 0.5 pH units over the last 30-40 years. Small increases in CaCl2-extractable Al were also recorded for the acidified red earths. The red-brown earths do not appear to have been markedly affected by soil acidification to date. Clay mineralogical data and measurements of cation exchange capacity of the <2 µm fraction indicate that red-brown earths are better buffered against acidification than red earths. However, small differences in management practices and rainfall along the transect may also be partially responsible for differences in acidification between soil types.


Soil Research ◽  
2009 ◽  
Vol 47 (3) ◽  
pp. 340 ◽  
Author(s):  
B. Kelly ◽  
C. Allan ◽  
B. P. Wilson

'Soil health' programs and projects in Australia's agricultural districts are designed to influence farmers' management behaviours, usually to produce better outcomes for production, conservation, and sustainability. These programs usually examine soil management practices from a soil science perspective, but how soils are understood by farmers, and how that understanding informs their farm management decisions, is poorly documented. The research presented in this paper sought to better understand how dryland farmers in the Billabong catchment of southern New South Wales use soil indicators to inform their management decisions. Thematic content analysis of transcripts of semi-structured, face-to-face interviews with farmers suggests several themes with implications for soil scientists and other professionals wishing to promote soil health in the dryland farming regions of south-eastern Australia. In particular, all soil indicators, including those related to soil 'health', need to have a clear, practical use for farmers if they are to inform farm decision making. The research also highlights the participants' reliance on agronomists, which may result in an increasing loss of connectivity between farmers and their land. If this reflects a wider trend, soil health projects may need to consider where best to direct their capacity-building activities, and/or how to re-empower individual farmers.


1994 ◽  
Vol 34 (7) ◽  
pp. 921 ◽  
Author(s):  
DC Godwin ◽  
WS Meyer ◽  
U Singh

Evidence exists that night temperatures <18°C immediately preceding flowering in rice crops can adversely affect floret fertility and, hence, yields. It has been suggested that sterility induced by low temperature is also influenced by floodwater depth and nitrogen (N) rate. In southern New South Wales, low night-time temperatures are believed to be a major constraint to the achievement of consistently high yields. The availability of a comprehensive model of rice growth and yield that is sensitive to this constraint would aid the development of better management practices. CERES RICE is a comprehensive model that simulates the phasic development of a rice crop, the growth of its leaves, stems, roots, and panicles, and their response to weather. It also simulates the water and N balances of the crop and the effects of stresses of water and N on the yield-forming processes. The model has been extensively tested in many rice-growing systems in both tropical and temperate environments. However, the original model was unable to simulate the level of chilling injury evident from yield data from southern New South Wales. This paper reports modifications made in the model to simulate these effects and the evaluation of the model in environments of low night temperature. Inclusion of the chilling injury effect greatly improved the accuracy of estimated yields from treatments in an extensive field experiment. However, additional testing with a wider range of data sets is needed to confirm the international applicability of the modifications.


2012 ◽  
Vol 52 (7) ◽  
pp. 675 ◽  
Author(s):  
R. D. Bush ◽  
R. Barnett ◽  
I. J. Links ◽  
P. A. Windsor

The prevalence of caseous lymphadenitis (CLA) in Australia was estimated at 5.2% using 2009 abattoir surveillance data supplied by Animal Health Australia from all states, involving 5029 lines comprising 1 339 463 sheep. This is a decrease from the 26% estimated in a similar study in 1995. There was a significant difference (P < 0.001) in CLA prevalence between all states except Tasmania and Victoria (P = 0.75), which had prevalences of 12.8% and 12.9%, respectively. Western Australia recorded the lowest prevalence at 1.0%. The average CLA prevalence for New South Wales was 5.3%; within three surveyed Livestock Health and Pest Authority regions (Tablelands, Central North and Central West) it was 2.9, 4.9 and 4.4%, respectively. Most producers surveyed in these three Livestock Health and Pest Authority areas regarded CLA as of little or no significance (75%), but they were aware of the need for CLA control, with ~68% using 6-in-1 vaccine, though only 39.9% as recommended. It appears that prolonged use of CLA vaccination has been successful in reducing the prevalence of CLA across Australia, and particularly in New South Wales. Further improvement is indicated in communicating preventative management practices associated with lice control, stressing the importance of an approved vaccination program, and raising producers' awareness of the importance of CLA control.
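The headline prevalence figure is simple arithmetic on line-level abattoir inspection counts: carcasses with CLA lesions divided by carcasses inspected. A short sketch with invented counts (the survey's actual data covered 5029 lines and 1 339 463 sheep; these five lines are illustrative only):

```python
# Apparent prevalence from abattoir line inspections. Each tuple is
# (sheep inspected in the line, carcasses with CLA lesions); the
# numbers are invented for illustration, not the survey's data.
lines = [
    (250, 14),
    (400, 18),
    (120, 5),
    (800, 44),
    (300, 21),
]

inspected = sum(n for n, _ in lines)
affected = sum(c for _, c in lines)
prevalence_pct = 100.0 * affected / inspected
```

Pooling counts before dividing weights each line by its size, so large consignments contribute proportionally more than small ones, which is the appropriate estimate of animal-level prevalence.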


1990 ◽  
Vol 12 (2) ◽  
pp. 67 ◽  
Author(s):  
ND Macleod

Many pastoral leases in western New South Wales are too small to ensure that viable pastoral enterprises will persist in the medium to longer term. Apart from attendant welfare problems for leaseholders and their dependants, there is some evidence that this has exacerbated overgrazing problems, creating undesirable and potentially irreversible degradation of vegetation and soil resources. Arguably, the small-size problem has sufficiently weakened the economic welfare of many lessees to make private acquisition of additional areas and/or adoption of conservation-oriented management practices uneconomic. The future scenario is then one of greater public involvement in property adjustment measures, or a growing population of non-viable pastoral enterprises with its attendant efficiency, welfare and resource conservation problems. The paper examines trends in property enterprise size, structure and concentration, and the existence and extent of size economies for wool and livestock production in western New South Wales. Several issues concerning the economic viability of pastoral properties are addressed, and several public policy prescriptions for addressing the size/viability problem are presented.


Soil Research ◽  
2009 ◽  
Vol 47 (2) ◽  
pp. 234 ◽  
Author(s):  
B. Kelly ◽  
C. Allan ◽  
B. P. Wilson

‘Soil health’ programs and projects in Australia’s agricultural districts are designed to influence farmers’ management behaviours, usually to produce better outcomes for production, conservation, and sustainability. These programs usually examine soil management practices from a soil science perspective, but how soils are understood by farmers, and how that understanding informs their farm management decisions, is poorly documented. The research presented in this paper sought to better understand how dryland farmers in the Billabong catchment of southern New South Wales use soil indicators to inform their management decisions. Thematic content analysis of transcripts of semi-structured, face-to-face interviews with farmers suggest several themes that have implications for soil scientists and other professionals wishing to promote soil health in the dryland farming regions of south-eastern Australia. In particular, all soil indicators, including those related to soil ‘health’, need to relate to some clear, practical use to farmers if they are to be used in farm decision making. This research highlights a reliance of the participants of this research on agronomists. Reliance on agronomists for soil management decisions may result in increasing loss of connectivity between farmers and their land. If this reflects a wider trend, soil health projects may need to consider where best to direct their capacity-building activities, and/or how to re-empower individual farmers.


Soil Research ◽  
2015 ◽  
Vol 53 (6) ◽  
pp. 683 ◽  
Author(s):  
Jonathan M. Gray ◽  
Greg A. Chapman ◽  
Brian W. Murphy

A new evaluation scheme, land management within capability (LMwC), used to guide sustainable land management in New South Wales (NSW), is presented. The scheme semi-quantitatively categorises the potential impacts of specific land-management actions and compares these with the inherent physical capability of the land in relation to a range of land-degradation hazards. This leads to the derivation of LMwC indices, which signify the sustainability of land-management practices at the scale of individual sites up to broader regions. The LMwC scheme can be used to identify lands at greatest risk from various land-degradation hazards. It can help to guide natural resource agencies at local, regional and state levels to target priorities and promote sustainable land management across their lands. Few other schemes that assess the sustainability of a given land-management regime in a semi-quantitative yet pragmatic manner are found in the literature. The scheme has particular application for regional soil-monitoring programs and it was applied in such a program over NSW in 2008–09. The results suggested that the hazards most poorly managed across the state are wind erosion, soil acidification and soil organic carbon decline. The LMwC scheme, or at least its underlying concepts, could be readily applied to other jurisdictions.
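As we read the abstract, the core of the LMwC comparison is ordinal: score the potential impact of a management action and the inherent capability of the land on matching scales for a given hazard, and compare the two. The sketch below is a hypothetical rendering of that idea, not the published LMwC class tables; the site names, scales, and scores are all invented.

```python
# Hypothetical ordinal scales (1 = low) for a single degradation hazard.
# The published LMwC scheme has its own class tables; this only shows
# the impact-vs-capability comparison described in the abstract.

def lmwc_index(impact_class, capability_class):
    """Positive: management pressure exceeds the land's capability."""
    return impact_class - capability_class

sites = {
    "site_A": {"impact": 4, "capability": 2},  # managed beyond capability
    "site_B": {"impact": 2, "capability": 3},  # managed within capability
}

indices = {name: lmwc_index(s["impact"], s["capability"])
           for name, s in sites.items()}
over_capability = [name for name, ix in indices.items() if ix > 0]
```

Aggregating such per-site, per-hazard indices over a region is what lets the scheme rank hazards (e.g. wind erosion, acidification, organic carbon decline) by how poorly they are managed.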


2000 ◽  
Vol 40 (3) ◽  
pp. 357 ◽  
Author(s):  
S. A. Spence ◽  
A. C. Woodhead

The weight of 2659 Friesian heifers was measured electronically on 41 farms in north-eastern New South Wales during autumn 1992. The 41 producers involved were surveyed on their heifer management practices. The relationship between liveweight and age was determined from these measurements, providing a baseline against which improvement in heifer weight for age could be measured following a planned extension program. In 1992, 2 management practices were found to be associated with significantly higher weights for age: feeding more than 1 kg of concentrate to heifers before weaning, and grazing heifers on fertilised pasture between weaning and 12 months of age. An extension program promoting improved heifer management was conducted, using adult learning principles, from 1992 until 1995. Nineteen of the 41 farms were revisited in winter 1997, when the heifers were again measured and management practices surveyed. Heifer weight for age had increased by a small but significant amount between 1992 and 1997. Management changes that occurred between 1992 and 1997 were: increased vaccination rates; decreased use of antibiotic injections for treating calf illnesses; increased feeding of more than 1 kg of concentrate per day to preweaned heifers; and increased use of silage to feed weaned heifers. A smaller percentage of producers considered calf diseases to be a problem. Between 1992 and 1995, 14 farmers began monitoring their heifers regularly, either by measuring weight electronically or by using a weigh tape.

