Effect of month of birth and first summer's nutrition on the productivity of Merino wethers

1985
Vol 25 (4)
pp. 777
Author(s):  
T Marshall

Liveweight and greasy wool production of Merino wethers born in either mid-May, late June or early September in each of 2 years, and subjected to three nutritional regimes (high, normal and low) over their first summer, were studied over 4 years. Differences in liveweight of up to 9.0 kg between treatments due to month of birth persisted for 48 months, while liveweight differences of as much as 15.3 kg due to nutritional treatment lasted only 24-30 months. Similarly, differences in greasy wool production of up to 0.8 kg between times of birth lasted for the duration of the study, but differences between first-summer nutritional treatments, although as much as 1.1 kg at the first adult shearing, persisted for only 2 years. The results strongly indicate that, in the high rainfall Mediterranean environment of southern Western Australia, sheep born late in the season will be of lower liveweight and will produce less wool than sheep born early in the season.

2011
Vol 51 (9)
pp. 794
Author(s):
A. N. Thompson
M. B. Ferguson
D. J. Gordon
G. A. Kearney
C. M. Oldham
...  

Nutrition of ewes during pregnancy can have permanent impacts on the production potential of their progeny. The hypothesis tested in the experiments reported in this paper was that improving the nutrition of Merino ewes during pregnancy and lactation increases the fleece weight and reduces the fibre diameter of their progeny’s wool during their lifetime, and that these effects on the progeny’s wool production can be predicted from the ewe’s liveweight profile. At sites in Victoria and Western Australia in each of 2 years, a wide range in the liveweight and condition score profiles of Merino ewes was generated by varying the amount of supplements fed from joining to Day 100 of pregnancy and the amount of feed on offer grazed from Day 100 to weaning. The site in Victoria was based on perennial pastures and included both single- and twin-bearing ewes, whereas the site in Western Australia was based on annual pastures and included single-bearing ewes only. The production and characteristics of wool from the progeny were measured until 51 months of age at the site in Victoria and 33 months of age at the site in Western Australia. The nutritional treatments and the resulting changes in ewe liveweight had significant impacts on the fleece weight and, to a lesser extent, the fibre diameter of wool produced by their progeny, but there were no consistent effects on other characteristics of progeny fleece wool. The fleece weight of the progeny was related to the liveweight change during pregnancy of their mothers (P < 0.05) and the relationships were similar for the two experiments at each site. At the site in Victoria, a loss of 10 kg in ewe liveweight between joining and Day 100 of pregnancy reduced fleece weight by ~0.2 kg at each shearing until 51 months of age, whereas gaining 10 kg from Day 100 of pregnancy to lambing had the opposite effect.
The effect of changes in ewe liveweight during late pregnancy on the fleece weight of their progeny at each shearing was of similar magnitude at the site in Western Australia. When evident, the effect of the ewe liveweight profile on the fibre diameter of progeny wool was opposite to the effect on clean fleece weight and the effect of poor nutrition in early to mid pregnancy could be completely overcome by improving nutrition during late pregnancy. Twin-born and reared progeny produced ~0.3 kg less clean wool at each shearing (P < 0.001) that was 0.3-μm broader (P < 0.001) than that from single-born progeny at the site in Victoria. However, the effects of varying ewe nutrition and ewe liveweight change during pregnancy on fleece weight and fibre diameter of progeny wool were similar (P > 0.05) for both single- and twin-born or reared progeny. Overall, these results supported our hypothesis and it is clear that the nutritional management of Merino ewes during pregnancy is important for optimal wool production from their progeny during their lifetime.
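The size of the reported effect can be made concrete with a short sketch. This is not the authors' statistical model; it simply applies, as a linear rate, the Victorian-site result that a 10 kg loss in ewe liveweight between joining and Day 100 of pregnancy reduced progeny fleece weight by ~0.2 kg at each shearing. The function name and the linear extrapolation to other liveweight changes are assumptions made here for illustration only.

```python
# Hypothetical illustration of the reported relationship: ~0.2 kg less progeny
# fleece per shearing for every 10 kg of ewe liveweight lost in early-mid
# pregnancy (Victorian site). Linearity outside the reported range is assumed.

FLEECE_KG_PER_EWE_KG = 0.2 / 10.0  # ~0.02 kg fleece per kg ewe liveweight change

def predicted_fleece_change(ewe_liveweight_change_kg: float) -> float:
    """Approximate change in progeny fleece weight (kg) at each shearing for a
    given change in ewe liveweight (kg) between joining and Day 100 of pregnancy."""
    return FLEECE_KG_PER_EWE_KG * ewe_liveweight_change_kg

# A 10 kg liveweight loss corresponds to roughly 0.2 kg less fleece per shearing:
print(predicted_fleece_change(-10.0))  # -> -0.2
```

Because the abstract reports the gain from Day 100 to lambing as having "the opposite effect", the same rate with a positive liveweight change approximates that case as well.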


1991
Vol 31 (3)
pp. 373
Author(s):
R Loughman
EJ Speijers
GJ Thomas
DJ Ballinger

The reasons for an increase in barley loose smut in high rainfall areas of Western Australia were investigated in field trials from 1986 to 1988 by examining the effects of environment, cultivar and adequacy of chemical control. Disease was 4-18 times greater in 2 seed lines produced in very high rainfall areas (>750 mm/year) compared with seed produced in high (450-750 mm/year) or low (<325 mm/year) rainfall areas. The effectiveness of 5 fungicide seed treatments was assessed. No fungicide seed treatment controlled disease completely. Triadimenol at 225 mg a.i./kg and carboxin at 940 mg a.i./kg were most effective, providing 93-96% disease control. Treatments were significantly (P < 0.01) less effective in high rainfall areas of Western Australia. Barley cultivars released recently in Western Australia were found to be susceptible to loose smut; we suggest that the replacement of the moderately resistant Dampier with these cultivars has contributed to an increased incidence of disease.


1989
Vol 29 (3)
pp. 361
Author(s):
HL Davies
PP Mann
B Goddard

Two experiments on weaner production are reported. In experiment 1, liveweight and wool production were measured in medium Peppin Merino sheep that grazed at 10.5 weaners/ha on 8 plots of a mixed Phalaris aquatica-subterranean clover pasture or 8 plots of annual pasture (Trifolium subterraneum cv. Woogenellup and volunteer annual grass species). This was repeated over 2 years using autumn-born sheep; on each pasture type, 4 groups were offered no supplement, 2 groups a cereal supplement (340 g oats), and 2 groups a supplement isoenergetic with the cereal supplement but with a high protein meal replacing some of the cereal (250 g oats and 60 g protein meal). The feed supplement was offered over the summer (January-April). The sheep on 2 of the unsupplemented plots and on 1 of the 2 plots receiving either the cereal or the cereal + protein supplement were offered access to a composite mineral block formulated to meet the mineral requirements of sheep, with the exception of cobalt and selenium. There were 16 sheep on each plot; within each group of 16 weaners, 4 were given an intraruminal cobalt 'bullet', 4 were given 5 mg of selenium orally, 4 were given cobalt plus selenium, and 4 were untreated controls. Experiment 2 was in year 3 with spring-born weaners on the same plots. The mineral block treatment was discarded on the plots receiving supplement, and the effect of supplementary feeding from the beginning of March was compared with feeding from early January; barley was also compared with oats and protein. The stocking rate was raised to 13.5 sheep/ha. There were no statistically significant differences in sheep liveweight due to pasture type in either year of experiment 1 or in experiment 2.
Supplementation with cereals or protein-fortified cereals resulted in a significantly (P < 0.05) increased liveweight at the end of March (5.6 kg in year 1 and 2.4 kg in year 2 of experiment 1, and 2.5 kg in experiment 2) and increased wool production (0.49 kg clean wool in year 1 and 0.31 kg in year 2 of experiment 1, and 0.49 kg in experiment 2). There was a significant liveweight response on the perennial plots to selenium + cobalt in year 1 of experiment 1. All cobalt-treated sheep were heavier (P < 0.001) in year 2. Neither selenium nor cobalt significantly affected liveweight in experiment 2. The proportion of Phalaris aquatica on the perennial pasture diminished from 18% to less than 9% by the end of year 2 in experiment 1. These results suggest that, if perennial pastures cannot be maintained, their establishment in the south-west of Western Australia would not result in greater animal production than on annual pasture. Decisions on using supplements would depend on feed and wool prices.


2007
Vol 58 (1)
pp. 21
Author(s):
Heping Zhang
Neil C. Turner
Michael L. Poole
Senthold Asseng

The growth and yield of spring wheat (Triticum aestivum L.) were examined to determine the actual and potential yields of wheat at a site in the high rainfall zone (HRZ) of south-western Australia. Spring wheat achieved yields of 5.5−5.9 t/ha in 2001 and 2003 when subsurface waterlogging was absent or minimal. These yields were close to the estimated potential, indicating that a high yield potential is achievable. In 2002, when subsurface waterlogging occurred early in the growing season, the yield of spring wheat was 40% lower than the estimated potential. The yield of wheat was significantly correlated with the number of ears per m2 (r2 = 0.81) and dry matter at anthesis (r2 = 0.73). To achieve wheat yields of 5–6 t/ha in the HRZ, 450–550 ears per m2 and 10–11 t/ha dry matter at anthesis should be targeted. Attaining such a level of dry matter at anthesis did not have a negative effect on dry-matter accumulation during the post-anthesis period. The harvest index (0.36−0.38) of spring wheat was comparable with that in drier parts of south-western Australia, but relatively low given the high rainfall and the long growing season. This relatively low harvest index indicates that the selected cultivar, bred for the low- and medium-rainfall zone, may have genetic limitations in sink capacity arising from the low grain number per ear when grown in the HRZ. We suggest that the yield of wheat in the HRZ may be increased further by increasing the sink capacity through a greater number of grains per ear.


1994
Vol 45 (1)
pp. 75
Author(s):
KJ Young
GA Elliott

Ear emergence was measured on a wide range of barley accessions for a number of sowing dates in contrasting environments of the Western Australian cereal-growing regions to determine suitable types for (i) early sowing in the low rainfall (<400 mm per annum) regions and (ii) barley production in the high rainfall (>450 mm per annum) regions. Accessions were classified into nine groups via cluster analysis using the time to ear emergence at four sites and a range of sowing dates. Australian cultivars were members of the three groups with the shortest mean time to ear emergence and, on the basis of an optimum time to ear emergence at each site, were shown to be well adapted to a wide range of sowing times and sites. Members of only one other group showed an acceptable level of adaptation across sites and sowing dates, members of the other five groups being suited only to early or very early sowings in the high rainfall region.


2005
Vol 56 (7)
pp. 743
Author(s):
Heping Zhang
Neil C. Turner
Michael L. Poole

Water use of wheat (Triticum aestivum L.), barley (Hordeum vulgare L.), canola (Brassica napus L.), and lucerne (Medicago sativa L.) was measured on a duplex soil in the high rainfall zone (HRZ) of south-western Australia from 2001 to 2003. Rainfall exceeded evapotranspiration in all years, resulting in transient perched watertables, subsurface waterlogging in 2002 and 2003, and loss of water by deep drainage and lateral flow in all years. There was no significant difference in water use among wheat, barley, and canola. Lucerne used water at a similar rate to annual crops during the winter and spring, but continued to extract 80−100 mm more water than the annual crops over the summer and autumn fallow period. This resulted in about 50 mm less drainage past the root-zone than for annual crops in the second and third years after the establishment of the lucerne. Crop water use was fully met by rainfall from sowing to anthesis and a significant amount of water (120−220 mm) was used during the post-anthesis period, resulting in a ratio of pre- to post-anthesis water use (ETa : ETpa) of 1 : 1 to 2 : 1. These ratios were lower than the indicative value of 2 : 1 for limited water supply for grain filling. High water use during the post-anthesis period was attributed to high available soil water at anthesis, a large rooting depth (≥1.4 m), a high proportion (15%) of roots in the clay subsoil, and regular rainfall during grain filling. The pattern of seasonal water use by crops suggested that high dry matter at anthesis did not prematurely exhaust soil water for grain filling and that it is unlikely to affect dry matter accumulation during grain filling and final grain yield under these conditions.


2003
Vol 43 (8)
pp. 907
Author(s):
R. E. White
B. P. Christy
A. M. Ridley
A. E. Okom
S. R. Murphy
...  

Eleven experimental sites in the Sustainable Grazing Systems (SGS) national experiment were established in the high rainfall zone (HRZ, >600 mm/year) of Western Australia, Victoria and New South Wales to measure components of the water balance, and pathways of water movement, for a range of pastures from 1997 to 2001. The effect of widely spaced river red gums (Eucalyptus camaldulensis) in pasture, and of belts of plantation blue gums (E. globulus), was studied at 2 of the sites. The soil types tested ranged from Kurosols, Chromosols and Sodosols, with different subsoil permeabilities, to Hydrosols and Tenosols. The pasture types tested were kikuyu (Pennisetum clandestinum), phalaris (Phalaris aquatica), redgrass (Bothriochloa macra) and annual ryegrass (Lolium rigidum), with subterranean clover (Trifolium subterraneum) included. Management variables were set stocking v. rotational grazing, adjustable stocking rates, and level of fertiliser input. Soil, pasture and animal measurements were used to set parameters for the biophysical SGS pasture model, which simulated the long-term effects of soil, pasture type, grazing method and management on water use and movement, using as inputs daily weather data for 31 years from selected sites representing a range of climates. Measurements of mean maximum soil water deficit Sm were used to estimate the probability of surplus water occurring in winter, and the average amount of this surplus, which was highest (97–201 mm/year) for pastures in the cooler, winter-rainfall dominant regions of north-east and western Victoria and lowest (3–11 mm/year) in the warmer, lower rainfall regions of the eastern Riverina and Esperance, Western Australia. Kikuyu in Western Australia achieved the largest increase in Sm compared with annual pasture (55–71 mm), while increases due to phalaris were 18–45 mm, and those of native perennials were small and variable. 
Long-term model simulations suggested rooting depth was crucial in decreasing deep drainage, to about 50 mm/year for kikuyu rooting to 2.5 m, compared with 70–200 mm/year for annuals rooting to only 0.8 m. Plantation blue gums dried the soil profile to 5.25 m by an average of 400 mm more than kikuyu pasture, reducing the probability of winter surplus water to zero, and eliminating drainage below the root zone. Widely spaced river red gums had a much smaller effect on water use, and would need to number at least 14 trees per hectare to achieve extra soil drying of about 50 mm over a catchment. Soil type affected water use primarily through controlling the rooting depth of the vegetation, but it also changed the partitioning of surplus water between runoff and deep drainage. Strongly duplex soils such as Sodosols shed 50% or more surplus water as runoff, which is important for flushing streams, provided the water is of good quality. Grazing method and pasture management had only a marginal effect in increasing water use, but could have a positive effect on farm profitability through increased livestock production per hectare and improved persistence of perennial species.

