Effect of chloride fertilization on Bedford barley and Katepwa wheat

1995 ◽  
Vol 75 (1) ◽  
pp. 15-24 ◽  
Author(s):  
R. M. Mohr ◽  
C. C. Bernier ◽  
D. N. Flaten ◽  
G. J. Racz

Recent studies in the northern Great Plains have confirmed that the chloride (Cl−) component of fertilizers can reduce disease severity and increase grain yield for wheat (Triticum aestivum) and barley (Hordeum vulgare). Field studies were conducted in Manitoba in 1989 and 1990 to determine the effect of rates of 25 and 50 kg Cl− ha−1 (applied as KCl or NaCl) applied with or without Cochliobolus sativus inoculum on plant nutrient status, disease severity and grain yield for Katepwa wheat and Bedford barley. Chloride application, regardless of placement or source, increased the Cl− concentration in plant tissue sampled at the boot to heading stages. Rates of 25 and 50 kg Cl− ha−1 resulted in significant reductions in the severity of common root rot for barley in two of six experiments and for wheat in one of four experiments. Chloride applications did not reduce spot blotch severity on barley in either of two experiments conducted. Inoculum did not have a consistent effect on any of the parameters measured. The application of 50 kg Cl− ha−1 significantly increased grain yield for barley by an average 393 kg ha−1 in two of eight experiments, but did not increase grain yield for wheat in any of eight experiments. Yield responses to Cl− were not related to soil Cl− content, Cl− concentration in plant tissue or observed reductions in disease. Key words: Chloride, fertilizers, barley, wheat, disease, yield

1995 ◽  
Vol 75 (1) ◽  
pp. 25-34 ◽  
Author(s):  
R. M. Mohr ◽  
C. C. Bernier ◽  
D. N. Flaten ◽  
G. J. Racz

Crop cultivar has been shown to affect the frequency and magnitude of yield responses to chloride (Cl−) fertilizer applications. Information regarding the Cl− responsiveness of cereal cultivars commonly grown in western Canada is limited, however. Field experiments were conducted in Manitoba in 1990 and 1991 to determine the effect of Cl− fertilization on plant nutrient status, grain yield and grain quality of Katepwa, Roblin, Biggar and Marshall wheat (Triticum aestivum L.) and of Bedford, Brier, Argyle and Heartland barley (Hordeum vulgare L.). Chloride fertilization increased the concentration of Cl− in plant tissue of all cultivars. Increased grain yield and improved grain quality due to Cl− fertilization occurred more frequently in wheat than in barley; however, cultivars within a species differed in Cl− responsiveness. The application of 50 kg Cl− ha−1 significantly increased grain yield for Heartland barley by 905 kg ha−1 in one of four experiments, for Roblin wheat by 492 kg ha−1 in one of four experiments, for Biggar wheat by an average 333 kg ha−1 in two of four experiments and for Marshall wheat by an average 363 kg ha−1 in two of four experiments. However, the application of 50 kg Cl− ha−1 resulted in a significant reduction in grain yield for Bedford barley in one of four experiments and for Marshall wheat in one of four experiments. Yield responses to Cl− were not related to soil Cl− content or Cl− concentration in plant tissue. Key words: Chloride, fertilizers, wheat, barley, cultivars, yield


1976 ◽  
Vol 54 (24) ◽  
pp. 2888-2892 ◽  
Author(s):  
P. R. Verma ◽  
R. A. A. Morrall ◽  
R. D. Tinline

The effects of common root rot (Cochliobolus sativus) on components of grain yield in naturally infected Triticum aestivum cultivar Manitou were studied at Matador, Saskatchewan, by sampling plants at maturity in 1969, 1970, and 1971. Plants were sorted into severe (SE), moderate (MO), slight (SL), and clean (CL) categories based mainly on the extent of lesions on the subcrown internodes. The number of tillers per plant, the number and weight of grains per head, the weight per head, and the 1000-kernel weight in each category were determined. Increasing values of all five components were consistently associated with decreasing disease severity. The SE category usually differed significantly from the other three categories in all components except 1000-kernel weight; differences between SL and MO were usually non-significant. CL and SL usually differed significantly in the number of tillers per plant and weight per head, but not in the weight and number of grains per head or the 1000-kernel weight. Apparently, the major effect of common root rot was to reduce the number of tillers per plant and the number of grains per head.


2019 ◽  
Vol 99 (3) ◽  
pp. 345-355
Author(s):  
Richard E. Engel ◽  
Carlos M. Romero ◽  
Patrick Carr ◽  
Jessica A. Torrion

Fertilizer NO3-N may offer a benefit over NH4-N-containing sources in semiarid regions where rainfall is often not sufficient to leach fertilizer-N out of crop rooting zones, denitrification concerns are not great, and NH3 volatilization concerns exist. The objective of our study was to contrast plant-N derived from fertilizer-15N (15Ndff), fertilizer-15N recovery (F15NR), total N uptake, grain yield, and protein of wheat (Triticum aestivum L.) from spring-applied NaNO3 relative to urea and urea augmented with the urease inhibitor N-(n-butyl)thiophosphoric triamide (NBPT). We established six fertilizer-N field trials widespread within the state of Montana between 2012 and 2017. The trials incorporated different experimental designs and 15N-labeled fertilizer-N sources, including NaNO3, NH4NO3, urea, and urea + NBPT. Overall, F15NR and 15Ndff in mature crop biomass were significantly greater for NaNO3 than urea or urea + NBPT (P < 0.05). Crop 15Ndff averaged 53.8%, 43.9%, and 44.7% across locations for NaNO3, urea, and urea + NBPT, respectively. Likewise, crop F15NR averaged 52.2%, 35.8%, and 38.6% for NaNO3, urea, and urea + NBPT, respectively. Soil 15N recovered in the surface layer (0–15 cm) was lower for NaNO3 compared with urea and urea + NBPT. Wheat grain yield and protein were generally not sensitive to improvements in 15Ndff, F15NR, or total N uptake. Our study hypothesis that NaNO3 would result in similar or better performance than urea or urea + NBPT was confirmed. Use of NO3-N fertilizer might be an alternative strategy to mitigate fertilizer-N induced soil acidity in semiarid regions of the northern Great Plains.
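The quantities 15Ndff and F15NR named in this abstract follow standard isotope-dilution arithmetic. The sketch below shows those textbook formulas; the function names and any numeric inputs are illustrative assumptions, not values from the study.

```python
def ndff_fraction(plant_atom_pct_excess: float, fert_atom_pct_excess: float) -> float:
    """Fraction of plant N derived from labelled fertilizer (15Ndff):
    atom% 15N excess in the plant divided by atom% 15N excess in the fertilizer."""
    return plant_atom_pct_excess / fert_atom_pct_excess

def fert_n_recovery(plant_n_uptake_kg_ha: float, ndff: float,
                    fert_n_rate_kg_ha: float) -> float:
    """Fertilizer-15N recovery (F15NR): fertilizer-derived N in the crop
    (total N uptake x 15Ndff) as a fraction of the N rate applied."""
    return plant_n_uptake_kg_ha * ndff / fert_n_rate_kg_ha

# Hypothetical example: a plant enriched to half the fertilizer's excess,
# taking up 100 kg N/ha from a 100 kg N/ha application.
ndff = ndff_fraction(1.0, 2.0)
print(f"15Ndff = {ndff:.0%}, F15NR = {fert_n_recovery(100.0, ndff, 100.0):.0%}")
```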


Weed Science ◽  
2010 ◽  
Vol 58 (1) ◽  
pp. 61-66 ◽  
Author(s):  
George O. Kegode ◽  
Gauri Nazre ◽  
Michael J. Christoffers

Biennial wormwood and lanceleaf sage have become serious weeds of several crops within the northern Great Plains of the United States and Canada. Both species are prolific seed producers, but little is known about their potential for developing persistent seedbanks. Field studies were conducted to determine the influence of duration (7, 8, 11, 19, 20, and 23 mo) and depth of burial (0, 2.5, and 10 cm) on biennial wormwood and lanceleaf sage seed viability and decay. Biennial wormwood and lanceleaf sage seeds were buried in September 2003 (burial 1) and September 2004 (burial 2). In burial 1, biennial wormwood and lanceleaf sage seed viability was 65 and 66%, respectively, after 23 mo of burial. In burial 2, biennial wormwood and lanceleaf sage seed viability was 8 and 3%, respectively, after 23 mo of burial. The difference was likely because of higher soil moisture during burial 2, which promoted seed decay. Controlled-environment studies sought to determine the influence of stratification environments (freezing, chilling, and freeze–thaw) followed by exposure to diurnally fluctuating temperatures on germination of biennial wormwood and lanceleaf sage seeds. Stratified biennial wormwood seed germination was 95% or greater when incubated at fluctuating day/night temperatures of 37/20 or 37/25 C. Stratified lanceleaf sage seeds from the freezing and chilling environments did not differ in germination following incubation at fluctuating temperatures and averaged 56 and 55%, respectively. Germination of stratified lanceleaf sage seeds from the freeze–thaw environment was higher than 50% during the thawing cycle, suggesting the possibility of early season emergence of this species. Our study indicates that biennial wormwood and lanceleaf sage have the potential to develop a seedbank that can persist for more than 2 yr. High moisture levels in the soil seedbank can lead to reduced seed survival.


Agronomy ◽  
2021 ◽  
Vol 11 (2) ◽  
pp. 240
Author(s):  
Graham R. S. Collier ◽  
Dean M. Spaner ◽  
Robert J. Graf ◽  
Brian L. Beres

Ultra-early seeding of spring wheat (Triticum aestivum L.) on the northern Great Plains can increase grain yield and grain yield stability compared to current spring wheat planting systems. Field trials were conducted in western Canada from 2015 to 2018 to evaluate the impact of optimal agronomic management on grain yield, quality, and stability in ultra-early wheat seeding systems. Four planting times initiated by soil temperature triggers were evaluated. The earliest planting was triggered when soils reached 0–2.5 °C at a 5 cm depth, with the subsequent three plantings completed at 2.5 °C intervals up to soil temperatures of 10 °C. Two spring wheat lines were seeded at each planting date at two seeding depths (2.5 and 5 cm), and two seeding rates (200 and 400 seeds m−2). The greatest grain yield and stability occurred from combinations of the earliest seeding dates, high seeding rate, and shallow seeding depth; wheat line did not influence grain yield. Grain protein content was greater at later seeding dates; however, the greater grain yield at earlier seeding dates resulted in more protein production per unit area. Despite extreme ambient air temperatures below 0 °C after planting, plant survival was not reduced at the earliest seeding dates. Planting wheat as soon as feasible after soil temperatures reach 0 °C, and prior to soils reaching 7.5–10 °C, at an optimal seeding rate and shallow seeding depth increased grain yield and stability compared to current seeding practices. Adopting ultra-early wheat seeding systems on the northern Great Plains will lead to additional grain yield benefits as climate change continues to increase annual average growing season temperatures.


2001 ◽  
Vol 81 (2) ◽  
pp. 351-354 ◽  
Author(s):  
M. H. Entz ◽  
R. Guilford ◽  
R. Gulden

Cropping records from 13 organic farms in the eastern Canadian prairies and one in North Dakota (1991 to 1996) were surveyed to determine crop rotation patterns, yields and soil nutrient status. Major crops included cereal grains, forages, and green manure legumes. Organic grain and forage yields ranged from one-half to almost double conventional yields. Soil N, K and S levels on organic farms were generally sufficient; however, levels of available soil P were deficient in several instances. Key words: Crop rotation, weeds, forages, green manure crops


2018 ◽  
Vol 19 (4) ◽  
pp. 288-294 ◽  
Author(s):  
Andrew Friskop ◽  
Shashi Yellareddygari ◽  
Neil C. Gudmestad ◽  
Kate Binzen Fuller ◽  
Mary Burrows

The use of fungicides on hard red wheat in the northern Great Plains has increased, in part owing to inexpensive fungicide options and several years of foliar disease epidemics. In some instances, fungicides are used in the absence of disease, prompting questions about the perceived value of these applications. This study analyzed yield data from 46 fungicide trials conducted in low-disease environments from 2007 to 2014 on hard red spring wheat and hard red winter wheat. Data were sorted and organized to determine yield response attributed to fungicide application timing (Feekes 2–3 or Feekes 9) and fungicide mode of action. Fungicide modes of action included quinone outside inhibitors (QoIs), demethylation inhibitors (DMIs), succinate dehydrogenase inhibitors (SDHIs), and blends (mixtures of QoI, DMI, or SDHI). A meta-analysis indicated a significant yield response of 101.6 kg/ha for a Feekes 9 application and 69.3 kg/ha for applications that used a QoI. Economic analyses indicated that when midpoint values for both wheat price and application cost were used, less than one-third of the trials had a profitable net return from a Feekes 9 application. The infrequent positive yield responses associated with foliar fungicides in low-disease environments should prompt growers to evaluate disease risk prior to making an application for foliar diseases.
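The net-return comparison described above reduces to revenue from the yield response minus the application cost. A minimal sketch follows; the wheat price and application cost shown are hypothetical placeholders, not the study's midpoint values.

```python
def net_return(yield_gain_kg_ha: float, wheat_price_per_kg: float,
               application_cost_per_ha: float) -> float:
    """Net return ($/ha) from a fungicide application:
    value of the yield gain minus the cost of applying the product."""
    return yield_gain_kg_ha * wheat_price_per_kg - application_cost_per_ha

# Hypothetical economics applied to the 101.6 kg/ha Feekes 9 yield response:
# a $0.20/kg wheat price and a $25/ha application cost (assumed values).
result = net_return(101.6, wheat_price_per_kg=0.20, application_cost_per_ha=25.0)
print(f"net return: ${result:.2f}/ha")
```

With these assumed numbers the application loses money despite a significant yield response, which illustrates how a positive agronomic effect can still fail the economic test reported in the abstract.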


1992 ◽  
Vol 43 (1) ◽  
pp. 43 ◽  
Author(s):  
GB Wildermuth ◽  
RD Tinline ◽  
RB McNamara

The effects of common root rot (CRR) caused by Bipolaris sorokiniana on grain yield, number of tillers, number of grains and grain weight of wheat plants were determined in four field experiments. Sites with different soil populations of B. sorokiniana were selected, and inoculum of the fungus was added to some plots. Disease and yield measurements were made on eight cultivars and lines differing in susceptibility to CRR. Timgalen, Songlen and Hartog were susceptible, whereas Kite, 1008 C16, 141-4 and ISWYN 32 were partially resistant to CRR. Grain yield, tiller number and grain number, but not grain weight, decreased as disease severity increased. Diseased plants had fewer tillers than healthy ones and, as a consequence, a reduced number of grains and grain yield per plant. Five methods were compared for estimating yield loss caused by the disease. The first used polynomial regression equations for each cultivar relating yield to disease rating of sub-crown internodes; the second used multiple regression equations relating yield to disease parameters of sub-crown internodes or tiller bases. A third method involved projecting yield losses from one cultivar to other cultivars, and in a fourth method yield losses were estimated from actual yields. In addition, an equation, yield loss (%) = 3.46 + 0.23 × disease severity (%), was established in one experiment and used as a fifth method in the other experiments. Yield losses estimated by methods 1 and 2 were similar and higher than those from the other methods. In areas where disease severity is high, methods 1 and 5 appear to be the most suitable for determining yield losses. Losses in a susceptible cultivar, Timgalen, varied between 13.9 and 23.9%, whereas those in a partially resistant cultivar, 1008 C16, varied between 6.8 and 13.6%.
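The single-equation fifth method lends itself to a direct worked example. The sketch below simply evaluates the linear relationship, yield loss (%) = 3.46 + 0.23 × disease severity (%); the function name and the sample severities are illustrative assumptions.

```python
def yield_loss_pct(disease_severity_pct: float) -> float:
    """Estimate percent grain-yield loss from CRR disease severity (%)
    using the linear equation fitted in one experiment (method 5)."""
    return 3.46 + 0.23 * disease_severity_pct

# Hypothetical severities spanning light to heavy infection.
for severity in (20, 50, 80):
    print(f"severity {severity}% -> estimated yield loss {yield_loss_pct(severity):.1f}%")
```

Note the positive intercept: the fitted equation predicts a small loss even at zero rated severity, so it is best applied within the severity range observed in the experiments rather than extrapolated.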


2019 ◽  
Vol 33 (1) ◽  
pp. 173-177 ◽  
Author(s):  
Taïga B. Cholette ◽  
Nader Soltani ◽  
David C. Hooker ◽  
Darren E. Robinson ◽  
Peter H. Sikkema

Field studies were conducted to determine the possible rate and timing of nicosulfuron to suppress annual ryegrass (ARG) seeded as a cover crop at the time of corn planting without affecting corn performance near Ridgetown, ON, Canada, in 2016 and 2017. Nicosulfuron was applied at rates from 0.8 to 50 g ai ha−1 when the ARG was at the two- to three- or four- to five-leaf stages, approximately 3 or 4 wk after emergence of both corn and ARG. There were no differences between the two application timings in grain yield responses or ARG suppression. As the rate of nicosulfuron increased from 0.8 to 50 g ai ha−1, ARG was suppressed 6% to 76% and 5% to 96% at 1 and 4 wk after application (WAA), respectively. At 4 WAA, ARG biomass decreased from 29 to 1 g m−2 as the rate of nicosulfuron increased from 0.8 to 50 g ai ha−1, compared to 36 g m−2 in the untreated control. Where nicosulfuron was not applied to ARG, grain corn yield was reduced by 6% compared to the ARG-free control; similar effects on corn yield were observed with nicosulfuron at the lowest rate of 0.8 g ai ha−1. Grain corn yield was reduced by 2.5% with the application of nicosulfuron at 25 g ai ha−1 (the label rate for corn) compared to the ARG-free control, but this reduction was not statistically significant. This study identified rates of nicosulfuron that suppressed ARG that emerged approximately the same day as corn, but there was evidence that grain corn yields were lowered because of interference, possibly during the critical weed control period. Based on this study, an ARG cover crop should not be seeded at the same time as corn unless one is willing to accept a risk of corn grain yield losses for the sake of the cover crop.

