Effect of hydroperiod on seed-bank composition in semipermanent prairie wetlands

1989
Vol 67 (3)
pp. 856-864
Author(s):  
Karen A. Poiani ◽  
W. Carter Johnson

Bottom samples were collected from two semipermanent prairie wetlands to determine whether known hydroperiod differences were reflected in seed-bank composition. Samples were also taken in a second year in one wetland to assess between-year variation. Seed density and composition were determined by counting and identifying seedlings that emerged from samples placed in a greenhouse. Most seed-bank characteristics, including floristic composition and total seed density, were statistically indistinguishable between wetlands. A significant difference occurred, however, in the relative importance of mudflat annuals and emergents. The mudflat group was much more abundant in the wetland with the shorter hydroperiod (82 vs. 51%), probably because more frequent exposure of the substrate allowed greater seed production. A longer hydroperiod also depressed seed density in the open water zone (open water zone, 1309 seeds/m2; two emergent zones, 2840 and 9893 seeds/m2). A seed-bank assay may detect subtle hydroperiod differences among wetlands of the same permanence class more quickly and economically than long-term hydrological monitoring. A sharp increase in mudflat seeds in the second year of sampling, after a drawdown, supports the use of seed banks in determining hydroperiod events in these wetlands.
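Seed densities here are inferred from greenhouse seedling emergence rather than direct seed counts: seedlings emerging per substrate sample are scaled by the area each sample represents. Below is a minimal Python sketch of that scaling arithmetic; the core area and counts are invented for illustration, since the study's actual sampling geometry is not given in this abstract.

```python
# Illustrative sketch of a seedling-emergence seed-bank assay.
# Assumed: each bottom-sample core covers SAMPLE_AREA_M2 of substrate.
# All counts are invented; only the arithmetic follows the abstract.

SAMPLE_AREA_M2 = 0.005  # assumed area of one sample core (m2)

# emergent seedling counts per greenhouse sample, grouped by zone (invented)
counts_by_zone = {
    "open_water": [5, 8, 7, 6],
    "emergent_inner": [13, 15, 14, 16],
    "emergent_outer": [48, 52, 50, 47],
}

for zone, counts in counts_by_zone.items():
    mean_count = sum(counts) / len(counts)
    density = mean_count / SAMPLE_AREA_M2  # seeds per square metre
    print(f"{zone}: {density:.0f} seeds/m2")
```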

1989
Vol 67 (6)
pp. 1878-1884
Author(s):  
Carol E. Wienhold ◽  
A. G. van der Valk

To determine the potential role of seed banks in the restoration of drained wetlands, the seed banks of 30 extant and 52 drained and cultivated prairie potholes were sampled in Iowa, Minnesota, and North Dakota; the potholes had been drained between 5 and 70 years earlier. The midsummer vegetation of most of these potholes was also sampled. The mean number of species in the seed bank declined from 12.3 in extant potholes to 7.5, 5.4, 5.0, 7.4, 3.2, and 2.1 in potholes drained up to 5, 10, 20, 30, 40, and 70 years earlier, respectively. The mean total seed density of extant potholes was 3600 seeds/m2. It increased to 7000 seeds/m2 within 5 years of drainage, but then declined rapidly to 1400, 1200, 600, 300, and 160 seeds/m2 at up to 10, 20, 30, 40, and 70 years after drainage. Changes in both species richness and seed density with increasing duration of drainage varied from state to state. About 60% of the species present in the seed banks of extant or recently drained wetlands were not detected in wetlands that had been drained for more than 20 years. Vegetation surveys of extant and drained wetlands indicated that the vegetation contained at least as many wetland species undetected in the seed bank as there were wetland species in the seed bank.


2010
Vol 20 (3)
pp. 201-207
Author(s):  
Shiro Tsuyuzaki

Abstract Seed longevity in situ is a prerequisite for understanding the life histories and community dynamics of species, although long-term longevity under thick tephra has not been documented, owing to a lack of opportunity and/or awareness. The seed bank in this study was estimated by both germination and flotation tests. Seeds of 17 species survived at high density after being buried under thick tephra for 30 years, since the 1977–1978 eruptions on Mount Usu, Hokkaido Island, northern Japan. The total seed density was >1000 seeds/m2. Rumex obtusifolius was the most common seed-bank species throughout the 30 years, but decreased in density between 20 and 30 years. More seeds of Hypericum erectum occurred in deeper soil. Total seed density decreased gradually over the 30 years, but H. erectum and Juncus effusus did not decline. Native seeds tended to remain viable longer than exotic seeds. These results suggest that small, native seeds tend to survive longer with deep burial, whereas the more numerous weedy, exotic seeds located at the soil surface decline faster. The seed bank provides a means of long-term monitoring of seed survival under natural conditions and could be used to detect genetic changes.


2004
Vol 82 (12)
pp. 1809-1816
Author(s):  
Luis Marone ◽  
Víctor R Cueto ◽  
Fernando A Milesi ◽  
Javier Lopez de Casenave

We assessed soil seed bank composition and size over several microhabitats in two habitats of the central Monte Desert of Argentina (open Prosopis woodland and Larrea shrubland) to analyse differences among them. Seed densities were similar to those already reported for other deserts, but we found consistent differences in seed composition among microhabitats. Whereas grass seeds (e.g., Aristida, Pappophorum, Neobouteloua, Trichloris, Digitaria) prevailed in natural depressions of open areas, forb seeds (e.g., Phacelia, Lappula, Descurainia, Plantago, Chenopodium) were more abundant under trees. Comparing seed production during primary dispersal (i.e., seed rain) with seed density on the ground at the end of dispersal indicated that most forb seeds entered the habitat through the microhabitats beneath the canopy of trees and tall shrubs and remained there after redistribution. Most grass seeds, by contrast, entered through bare-soil and under-grass microhabitats and reached more even distributions after secondary dispersal, especially because of dramatic losses in bare soil. Patterns of plant recruitment and seed dynamics in specific microhabitats were better understood when differences in soil seed bank composition, rather than in total seed density, were taken into account.
Key words: Monte Desert, seed dispersal, seed predation, seed production, seeds.
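The inference above rests on comparing seed input during primary dispersal with the density left on the ground once dispersal ends. A minimal sketch of that comparison, with all numbers invented; only the logic follows the abstract.

```python
# Net effect of secondary dispersal per microhabitat, inferred as
# (final ground density) - (seed rain input). All values invented.

seed_rain = {        # seeds/m2 entering during primary dispersal
    "under_tree": 900, "under_shrub": 500, "under_grass": 700, "bare_soil": 800,
}
final_density = {    # seeds/m2 on the ground at the end of dispersal
    "under_tree": 850, "under_shrub": 480, "under_grass": 450, "bare_soil": 250,
}

for mh in seed_rain:
    net = final_density[mh] - seed_rain[mh]
    verdict = "retained" if net >= 0 else "lost after redistribution"
    print(f"{mh}: net change {net:+d} seeds/m2 ({verdict})")
```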


Author(s):  
Natali Gomes Bordon ◽  
Niwton Leal Filho ◽  
Tony Vizcarra Bentos

The seed bank is directly related to forest resilience because it contributes the greatest number of regenerants after disturbance. Changes in seed density, floristic composition, and life forms can completely alter the successional trajectory of forest environments, and these changes are directly related to land use. For example, seed-bank suppression can occur in pastures subject to frequent fires, where an increase in seed density and a predominance of herbaceous families such as Poaceae, Rubiaceae, Asteraceae, and Cyperaceae are typical of highly degraded areas. Melastomataceae seedlings are an important component of the seed bank in the Amazon rainforest, whereas Urticaceae is better represented in forests with a history of low-impact land use. Any change in seed-bank functionality is bound to compromise the diversity, regeneration potential, and overall maintenance of tropical forests. It is therefore necessary to expand studies of seed banks in the Amazon rainforest, and it is equally important to prioritize and standardize sampling methods and data presentation, as well as to improve the identification of species that occur in the seed bank.


2014
Vol 67 (3-4)
pp. 301-311
Author(s):  
Krystyna Falińska

Studies of the relation between the size and floristic composition of the seed bank and the vegetation during succession were conducted in the years 1976-1996. The results did not confirm the hypothesis of directional changes in seed density; the observed changes were instead fluctuating. At the start of succession the seed bank was small (2970 seeds/m2); in subsequent years seed density increased successively to reach its highest level after 15 years (9170 seeds/m2), then dropped again. After 20 years, seed density had returned to the value recorded at the initial stage (2468 seeds/m2). Successional changes in the floristic richness of the seed bank were directional, with the number of species falling from 52 to 24. All data indicate that the floristic composition of the seed bank is not a mirror reflection of the vegetation structure, but rather a record of long-term species turnover and of the many events that at various times influenced the inflow of seeds to the soil (spatial dynamics of the necromass, perennial herb and willow canopy structure, seed migration).


2004
Vol 82 (7)
pp. 992-1000
Author(s):  
Rick E Landenberger ◽  
James B McGraw

Little is known about the seed banks of mixed-mesophytic forest clearcuts or their associated forest edges. Seed banks were described and compared to better understand how seed density, species richness, and composition change with increasing distance from clearcuts. Thirty-two taxa were found in the seed bank of clearcuts, and 44 were found in adjacent forest edges. Annuals represented 41% of seeds in clearcuts but only 8% in edges, while trees and shrubs represented 3% in both areas. Seed-bank density and species richness varied significantly within and between clearcuts, but clearcuts did not differ in seed-bank density and richness from interior forest seed banks. Seed-bank density declined significantly with distance from clearcuts on west-facing forest edges, but showed no discernible spatial pattern on south-facing edges. Overall, edge effects of clearcutting on adjacent forest seed banks were demonstrated in total seed density and in several common wind-dispersed, early-successional herbaceous species, including Erechtites hieraciifolia (L.) Raf. and Lobelia inflata L., as well as Vitis, a common ingestion-dispersed genus. The seed-shadow edge effect may influence both current and future community characteristics and population dynamics of vegetation in mixed-mesophytic forest edges adjacent to clearcuts.
Key words: seed banks, clearcutting, edge effects, mixed-mesophytic forest, West Virginia.


Problems in calculating reinforced concrete structures on the basis of the diagram of concrete deformation under compression, as presented in both Russian and foreign regulatory documents on the design of concrete and reinforced concrete structures, are considered. How well these diagrams hold for all classes of concrete remains very approximate; a particularly significant difference arises when Euronorm is used, owing to the different shapes and sizes of the test samples. At present there are no methodical recommendations for determining the ultimate relative deformations of concrete under axial compression or for constructing curvilinear deformation diagrams, which leaves the experimental data limited and, as a result, makes it impossible to enter more detailed ultimate strain values into the domestic standards. The results of experimental studies determining the ultimate relative deformations of concrete under compression for different classes of concrete are presented; they allowed analytical dependences to be derived for evaluating the ultimate relative deformations and describing curvilinear deformation diagrams. The article discusses various options for using the deformation model to assess the stress-strain state of a structure and concludes that it is necessary to use not only the final values of the ultimate deformations but also their intermediate values. This requires reliable σ–ε diagrams for all classes of concrete. The difficulties of measuring deformations in concrete subjected to peak load corresponding to the prismatic strength, as well as of main cracks appearing under long-term step loading, are highlighted, and options for more accurate measurements are proposed. The development and implementation of a new standard, GOST "Concretes. Methods for determination of complete diagrams", based on the developed method for obtaining complete diagrams of concrete deformation under compression, is necessary for evaluating the ultimate deformability of concrete under compression.
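For concreteness, one widely used curvilinear σ–ε diagram of the kind discussed is the Eurocode 2 nonlinear relation for concrete in compression (EN 1992-1-1, Eq. 3.14). The Python sketch below tabulates it with illustrative Table 3.1 parameters for a C30/37 concrete; this is a generic illustration of such a diagram, not the authors' proposed analytical dependences.

```python
# Eurocode 2 (EN 1992-1-1, §3.1.5) curvilinear stress-strain relation:
#   sigma_c = f_cm * (k*eta - eta^2) / (1 + (k - 2)*eta),  eta = eps_c / eps_c1
# Illustrative Table 3.1 parameters for a C30/37 concrete.

f_cm = 38.0       # mean compressive strength, MPa
E_cm = 33000.0    # mean modulus of elasticity, MPa
eps_c1 = 0.0022   # strain at peak stress
eps_cu1 = 0.0035  # nominal ultimate strain

k = 1.05 * E_cm * eps_c1 / f_cm

def sigma_c(eps_c: float) -> float:
    """Compressive stress (MPa) at strain eps_c, per EC2 Eq. (3.14)."""
    eta = eps_c / eps_c1
    return f_cm * (k * eta - eta**2) / (1.0 + (k - 2.0) * eta)

# Tabulate intermediate points of the diagram, not just the end values,
# as the abstract argues a deformation-model analysis requires.
n = 14
for i in range(n + 1):
    eps = eps_cu1 * i / n
    print(f"eps = {eps:.4f}  sigma = {sigma_c(eps):6.2f} MPa")
```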


2020
Vol 132 (5)
pp. 1405-1413
Author(s):  
Michael D. Staudt ◽  
Holger Joswig ◽  
Gwynedd E. Pickett ◽  
Keith W. MacDougall ◽  
Andrew G. Parrent

OBJECTIVE The prevalence of trigeminal neuralgia (TN) in patients with multiple sclerosis (MS-TN) is higher than in the general population (idiopathic TN [ITN]). Glycerol rhizotomy (GR) is a percutaneous lesioning surgery commonly performed for the treatment of medically refractory TN. While treatment for acute pain relief is excellent, long-term pain relief is poorer. The object of this study was to assess the efficacy of percutaneous retrogasserian GR for the treatment of MS-TN versus ITN. METHODS A retrospective chart review was performed, identifying 219 patients who had undergone 401 GR procedures from 1983 to 2018 at a single academic institution. All patients were diagnosed with medically refractory MS-TN (182 procedures) or ITN (219 procedures). The primary outcome measures of interest were immediate pain relief and time to pain recurrence following initial and repeat GR procedures. Secondary outcomes included medication usage and presence of periprocedural hypesthesia. RESULTS The initial pain-free response rate was similar between groups (p = 0.726): MS-TN initial GR 89.6%; MS-TN repeat GR 91.9%; ITN initial GR 89.6%; ITN repeat GR 87.0%. The median time to recurrence after initial GR was similar between MS-TN (2.7 ± 1.3 years) and ITN (2.1 ± 0.6 years) patients (p = 0.87). However, there was a statistically significant difference in the time to recurrence after repeat GR between MS-TN (2.3 ± 0.5 years) and ITN patients (1.2 ± 0.2 years; p < 0.05). The presence of periprocedural hypesthesia was highly predictive of pain-free survival (p < 0.01). CONCLUSIONS Patients with MS-TN achieve meaningful pain relief following GR, with an efficacy comparable to that following GR in patients with ITN. Initial and subsequent GR procedures are equally efficacious.
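The recurrence comparison above is a time-to-event analysis. Below is a minimal sketch of how such a comparison is typically run (Kaplan-Meier median pain-free survival plus a log-rank test) using the lifelines library; the durations and event flags are invented, not the study's data.

```python
# Sketch of a median time-to-recurrence comparison between two groups,
# using lifelines. All data below are invented for illustration.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(0)

# years to pain recurrence (invented); event=True means recurrence observed
t_mstn = rng.exponential(2.7, size=60)
t_itn = rng.exponential(2.1, size=60)
e_mstn = rng.random(60) < 0.8
e_itn = rng.random(60) < 0.8

kmf = KaplanMeierFitter()
kmf.fit(t_mstn, event_observed=e_mstn, label="MS-TN")
print("MS-TN median pain-free survival:", kmf.median_survival_time_)
kmf.fit(t_itn, event_observed=e_itn, label="ITN")
print("ITN median pain-free survival:", kmf.median_survival_time_)

res = logrank_test(t_mstn, t_itn, event_observed_A=e_mstn, event_observed_B=e_itn)
print("log-rank p =", res.p_value)
```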


2021
Vol 4 (Supplement_1)
pp. 234-236
Author(s):  
P Willems ◽  
J Hercun ◽  
C Vincent ◽  
F Alvarez

Abstract Background The natural history of primary sclerosing cholangitis (PSC) in children seems to differ from PSC in adults. However, studies on this matter have been limited by short follow-up periods and inconsistent classification of patients with autoimmune cholangitis (AIC) (or overlap syndrome). Consequently, it remains unclear if long-term outcomes are affected by the clinical phenotype. Aims The aims of this study are to describe the long-term evolution of PSC and AIC in a pediatric cohort with extension of follow-up into adulthood and to evaluate the influence of phenotype on clinical outcomes. Methods This is a retrospective study of patients with AIC or PSC followed at CHU-Sainte-Justine, a pediatric referral center in Montreal. All charts between January 1998 and December 2019 were reviewed. Patients were classified as either AIC (duct disease on cholangiography with histological features of autoimmune hepatitis) or PSC (large or small duct disease on cholangiography and/or histology). Extension of follow-up after the age of 18 was done for patients followed at the Centre hospitalier de l’Université de Montréal. Clinical features at diagnosis, response to treatment at one year and liver-related outcomes were compared. Results 40 patients (27 PSC and 13 AIC) were followed for a median time of 71 months (range 2 to 347), with 52.5% followed into adulthood. 70% (28/40) had associated inflammatory bowel disease (IBD) (78% PSC vs 54% AIC; p=0.15). A similar proportion of patients had biopsy-proven significant fibrosis at diagnosis (45% PSC vs 67% AIC; p=0.23). Baseline liver tests were similar in both groups. At diagnosis, all patients were treated with ursodeoxycholic acid. Significantly more patients with AIC (77% AIC vs 30% PSC; p=0.005) were initially treated with immunosuppressive drugs, without a significant difference in the use of anti-TNF agents (0% AIC vs 15% PSC; p=0.12). At one year, 55% (15/27) of patients in the PSC group had normal liver tests versus only 15% (2/13) in the AIC group (p=0.02). During follow-up, more liver-related events (cholangitis, liver transplant and cirrhosis) were reported in the AIC group (HR=3.7 (95% CI: 1.4–10), p=0.01). Abnormal liver tests at one year were a strong predictor of liver-related events during follow-up (HR=8.9 (95% CI: 1.2–67.4), p=0.03), while having IBD was not (HR=0.48 (95% CI: 0.15–1.5), p=0.22). Five patients required liver transplantation, with no difference between the groups (8% AIC vs 15% PSC; p=0.53). Conclusions Pediatric patients with AIC and PSC show, at onset, similar stages of liver disease with comparable clinical and biochemical characteristics. However, patients with AIC more often receive immunosuppressive therapy, and treatment response is less frequent. AIC is associated with more liver-related events, and abnormal liver tests at one year are a predictor of poor outcomes. Funding Agencies: None
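The hazard ratios reported above come from a time-to-event model. A hedged sketch of how such HRs are commonly estimated with a Cox proportional-hazards model (lifelines), on an invented dataframe; none of these values are the study's data, and the column names are hypothetical.

```python
# Sketch of a Cox proportional-hazards fit producing hazard ratios for
# liver-related events by phenotype and one-year liver-test status.
# The dataframe is invented for illustration; lifelines is assumed.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 40
df = pd.DataFrame({
    "months_followed": rng.exponential(70, size=n),
    "liver_event": (rng.random(n) < 0.4).astype(int),        # 1 = event observed
    "aic_phenotype": (rng.random(n) < 0.33).astype(int),      # 1 = AIC, 0 = PSC
    "abnormal_lfts_1yr": (rng.random(n) < 0.5).astype(int),   # abnormal tests at 1 year
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months_followed", event_col="liver_event")
cph.print_summary()  # the exp(coef) column gives the hazard ratios
```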


2021
Vol 108 (Supplement_2)
Author(s):  
A Fung ◽  
A Ward ◽  
K Patel ◽  
M Krkovic

Abstract Introduction Infection is a major complication of open fractures. Antibiotic-impregnated calcium sulfate (AICS) beads are widely used as an adjuvant to systemic antibiotics. Whilst their efficacy in the secondary prevention of infection is established, we present the first retrospective study evaluating AICS beads in the primary prevention of infection in open fractures. Method 214 open femur and tibia fractures in 207 patients were reviewed over a seven-year period. 148 fractures received only systemic antibiotic prophylaxis. 66 fractures also received AICS beads. The occurrence of acute infection (wound infection and acute osteomyelitis) was recorded, as well as that of long-term complications (chronic osteomyelitis, non-union and death). Results Fractures that received AICS with systemic antibiotics had an overall acute infection rate of 42% (28/66), compared to 43% (63/148) in fractures that received only systemic antibiotics (p > 0.05). There was no significant difference in infection rate even when fractures were stratified by Gustilo-Anderson grade. There was also no significant difference in the rate of long-term complications. Conclusions Our results indicate that the adjuvant use of AICS beads is not effective for the primary prevention of acute infection or long-term complications in open leg fractures. Further research is needed to elucidate the factors influencing the outcomes of AICS use.
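The headline comparison (28/66 vs 63/148 acute infections) is a 2x2 proportion test. A small scipy sketch follows; the counts come from the abstract, but the abstract does not state which test the authors used, so the chi-square and Fisher exact tests shown here are illustrative choices.

```python
# 2x2 contingency comparison of acute infection rates, using the counts
# reported in the abstract. scipy is assumed.
from scipy.stats import chi2_contingency, fisher_exact

#                 infected  not infected
table = [[28, 66 - 28],    # AICS beads + systemic antibiotics
         [63, 148 - 63]]   # systemic antibiotics only

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square p = {p:.3f}")          # consistent with the reported p > 0.05

odds_ratio, p_exact = fisher_exact(table)
print(f"Fisher exact p = {p_exact:.3f}, OR = {odds_ratio:.2f}")
```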

