Effects of variation in fire intensity on regeneration of co-occurring species of small trees in the Sydney region

2000 ◽  
Vol 48 (1) ◽  
pp. 71 ◽  
Author(s):  
David A. Morrison ◽  
John A. Renwick

Fire is a common source of change for the plant species of Mediterranean-type ecosystems, but little is known about the comparative effects of different fire intensities. Accordingly, nine species of small tree (Acacia binervia, Acacia implexa, Acacia parramattensis, Casuarina littoralis, Casuarina torulosa, Hakea sericea, Jacksonia scoparia, Leptospermum trinervium, Persoonia linearis) were studied 1 year after each of two low-intensity prescribed fires and a high-intensity wildfire at a site in the outer western region of the Sydney metropolitan area, south-eastern Australia. All of the species except H. sericea proved to be at least partly tolerant of the low-intensity fires (40–80% of their stems surviving the fires), but only C. torulosa, L. trinervium and P. linearis were tolerant of the high-intensity fire (20–30% stem survival). All of the fire-tolerant species had more of their smaller stems killed by the fires, and the high-intensity fire killed larger stems than did the low-intensity fires. The size of surviving stems was related to the fire-tolerance characteristics of these species, specifically the presence or absence of insulating bark and epicormic or lignotuberous buds, as well as stem height (preventing 100% leaf-scorch). Those species with post-fire shoots at the stem base produced them when the upper part of the stem had been killed, with the number of shoots produced varying with fire intensity. Those species with post-fire epicormic shoots produced them if the stem was alive post-fire, usually with fewer shoots produced after the high-intensity fire than after the low-intensity fires. The number of shoots produced was positively related to the size of the stem for both fire intensities. These different sets of responses to fire intensity have important implications for the ability to predict community responses to fire from the study of only a few species, as well as for the long-term effects of prescribing a particular fire regime.

Fire Ecology ◽  
2019 ◽  
Vol 15 (1) ◽  
Author(s):  
Valerie S. Densmore ◽  
Emma S. Clingan

Abstract Background Prescribed burning is used to reduce fire hazard in highly flammable vegetation types, including Banksia L.f. woodland that occurs on the Swan Coastal Plain (SCP), Western Australia, Australia. The 2016 census recorded well over 1.9 million people living on the SCP, which also encompasses Perth, the fourth-largest city in Australia. Banksia woodland is prone to frequent ignitions that can cause extensive bushfires that consume canopy-stored banksia seeds, a critical food resource for an endangered bird, the Carnaby's cockatoo (Calyptorhynchus latirostris, Carnaby 1948). The time needed for banksias to reach maturity and maximum seed production is several years longer than the typical interval between prescribed burns. We compared prescribed burns to bushfires and unburned sites at three locations in banksia woodland to determine whether low-intensity prescribed burns affect the number of adult banksias and their seed production. Study sites were matched to the same vegetation complex, fire regime, and time-since-fire to isolate fire intensity as a variable. Results Headfire rates of spread and differenced normalized burn ratios indicated that prescribed burning was generally of a much lower intensity than bushfire. The percentage survival of adult banksias and their production of cones and follicles (seeds) did not decrease during the first three years following a prescribed burn. However, survival and seed production were significantly diminished following high-intensity bushfire. Thus, carrying capacity for Carnaby's cockatoo was unchanged by prescribed burning but decreased markedly following bushfire in banksia woodland. Conclusions These results suggest that prescribed burning is markedly different from bushfire when considering appropriate fire intervals to conserve canopy habitats in fire-resilient vegetation communities. Therefore, low-intensity prescribed burning represents a viable management tool to reduce the frequency and extent of bushfire impacts on banksia woodland and Carnaby's cockatoo.
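
The differenced normalized burn ratio (dNBR) used above to index fire intensity is a standard satellite-derived severity metric. Below is a minimal sketch in Python, assuming pre- and post-fire near-infrared (NIR) and shortwave-infrared (SWIR) reflectance arrays; the band choice and the example values are illustrative, not the authors' exact processing chain:

```python
import numpy as np

def nbr(nir: np.ndarray, swir: np.ndarray) -> np.ndarray:
    """Normalized Burn Ratio from NIR and SWIR reflectance bands."""
    return (nir - swir) / (nir + swir + 1e-9)  # epsilon guards against division by zero

def dnbr(nir_pre, swir_pre, nir_post, swir_post):
    """Differenced NBR: pre-fire NBR minus post-fire NBR.
    Larger positive values indicate more severe burning."""
    return nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)

# Toy 2x2 scene: a low-intensity prescribed burn typically yields
# smaller dNBR values than a high-intensity bushfire.
nir_pre = np.array([[0.40, 0.42], [0.38, 0.41]])
swir_pre = np.array([[0.20, 0.21], [0.19, 0.22]])
nir_post = np.array([[0.25, 0.24], [0.30, 0.29]])
swir_post = np.array([[0.28, 0.30], [0.24, 0.25]])
print(dnbr(nir_pre, swir_pre, nir_post, swir_post))
```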


Fire ◽  
2021 ◽  
Vol 4 (3) ◽  
pp. 56 ◽
Author(s):  
Filippe L.M. Santos ◽  
Joana Nogueira ◽  
Rodrigo A. F. de Souza ◽  
Rodrigo M. Falleiro ◽  
Isabel B. Schmidt ◽  
...  

Brazil has recently (2014) changed from a zero-fire policy to an Integrated Fire Management (IFM) program with the active use of prescribed burning (PB) in federal Protected Areas (PA) and Indigenous Territories (IT) of the Brazilian savanna (Cerrado). PB is commonly applied in the management of fire-prone ecosystems to mitigate large, high-intensity wildfires, the associated emissions, and high fire-suppression costs. However, the effectiveness of such fire management in reducing large wildfires and emissions over Brazil remains mostly unevaluated. Here, we aim to fill the gap in the scientific evidence of the benefits of PB by relying on the most up-to-date, satellite-derived fire datasets of burned area (BA), fire size, duration, emissions, and intensity from 2003 to 2018. We focused on two Cerrado ITs with different sizes and hydrological regimes, Xerente and Araguaia, where IFM has been in place since 2015. To understand fire regime dynamics, we divided the study period into three phases according to the prevalent fire policy, and the individual fire scars into four size classes. We considered two fire seasons: the management fire season (MFS, which runs from the rainy to the mid-dry season, when PBs are undertaken) and the wildfire season (WFS, when PBs are not performed and fires tend to grow out of control). Our results show that the implementation of the IFM program was responsible for a decrease in the areas affected by high fire recurrence in Xerente and Araguaia, compared with the Zero Fire Phase (2008–2013). In both regions, PB effectively reduced the occurrence of large wildfires, the number of medium and large scars, fire intensity, and emissions, shifting the prevalent fire season from the WFS to the MFS. Such reductions are significant, since the WFS causes greater negative impacts on biodiversity conservation and higher greenhouse gas emissions. We conclude that wildfire impacts can be further reduced if effective fire management policies, including PB, continue to be implemented during the coming decades.
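
As a rough illustration of the scar classification described above, the sketch below bins individual fire scars into four size classes and assigns each to the management (MFS) or wildfire (WFS) season by ignition month. The class breaks and month boundaries are hypothetical placeholders, not the thresholds used in the study:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical size-class breaks (hectares) and season months; the study's
# actual four class thresholds and season boundaries are not reproduced here.
SIZE_CLASSES = [(100.0, "small"), (1_000.0, "medium"), (10_000.0, "large")]
MFS_MONTHS = set(range(1, 7))  # assume management burns run rainy to mid-dry season

@dataclass
class FireScar:
    burned_area_ha: float
    ignition: date

def size_class(scar: FireScar) -> str:
    """Bin a scar into one of four size classes."""
    for upper_bound, label in SIZE_CLASSES:
        if scar.burned_area_ha < upper_bound:
            return label
    return "very large"

def fire_season(scar: FireScar) -> str:
    """Assign a scar to the management (MFS) or wildfire (WFS) season."""
    return "MFS" if scar.ignition.month in MFS_MONTHS else "WFS"

scar = FireScar(burned_area_ha=2_500.0, ignition=date(2016, 9, 3))
print(size_class(scar), fire_season(scar))  # -> large WFS
```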


2006 ◽  
Vol 15 (2) ◽  
pp. 261 ◽  
Author(s):  
Mark K. J. Ooi ◽  
Robert J. Whelan ◽  
Tony D. Auld

Understanding how a species persists under a particular fire regime requires knowledge of the response to fire of individual plants. However, categorising the fire response of a species solely on the basis of the known responses of individual plants can be misleading when predicting a population response. In the present study, we sought to determine the fire responses of several Leucopogon species at the population level, including the threatened L. exolasius. We found that, whilst all species studied were obligate seeders, the population responses of species to fire were dependent upon fire intensity and patchiness. Results showed, first, that low-intensity fires were significantly patchier than higher-intensity fires. Second, the proportion of plants killed within a population decreased with increased fire patchiness. We also assessed how populations were structured and found that stands were multi-aged at most sites, rather than having the single-aged structure often assumed for obligate seeders. Both spatial complexity within the fire regime, leading to adult plant persistence, and inter-fire recruitment contributed to the multi-aged structure. It is possible that these Leucopogon species are gap recruiters, and may tolerate fire rather than being specifically adapted to it. Inter-fire recruitment may enable L. exolasius populations to persist for a much longer fire-free period than many other species in the region.


1997 ◽  
Vol 77 (04) ◽  
pp. 685-689 ◽  
Author(s):  
Paul A Kyrle ◽  
Johannes Brockmeier ◽  
Ansgar Weltermann ◽  
Sabine Eichinger ◽  
Wolfgang Speiser ◽  
...  

Summary Coumarin-induced skin necrosis is believed to be due to a transient hypercoagulable state resulting from a more rapid decline of protein C activity relative to that of coagulation factors (F) II, IX and X during initiation of oral anticoagulant therapy. We studied hemostatic system activation during early oral anticoagulant treatment with a technique that investigates coagulation activation in the microcirculation. We determined, in 10 healthy volunteers, the concentrations of prothrombin fragment F1+2 (f1.2) and thrombin-antithrombin complex (TAT) in blood emerging from an injury of the microvasculature (bleeding time incision) before and after initiation of both high-intensity and low-intensity coumarin therapy. In addition, f1.2, TAT, activated F VII (F VIIa) and the activities of F II, F VII, F X and protein C were measured in venous blood. A rapid decline of F VII and protein C was observed in venous blood, with activities at 24 h of 7 ± 1% and 43 ± 2%, respectively, during the high-intensity regimen. A 20 to 30% reduction of f1.2 and TAT was seen in venous blood at 72 h, with no major difference between the high- and the low-intensity regimen. F VIIa levels were substantially affected by anticoagulation, with a >90% reduction at 48 h during the high-intensity regimen. Following high-intensity coumarin, a >50% decrease in the f1.2 and TAT levels was found in shed blood at 48 h, suggesting substantial inhibition of thrombin generation during early oral anticoagulation. No increase in the f1.2 and TAT levels was seen in either shed blood or venous blood. Our data do not support the concept of a transient imbalance between generation and inhibition of thrombin as the underlying pathomechanism of coumarin-induced skin necrosis.


Author(s):  
Goncalo V. Mendonca ◽  
Carolina Vila-Chã ◽  
Carolina Teodósio ◽  
André D. Goncalves ◽  
Sandro R. Freitas ◽  
...  

2020 ◽  
Vol 47 (1) ◽  
Author(s):  
Rabab S. Zaghlol ◽  
Sahar S. Khalil ◽  
Ahmed M. Attia ◽  
Ghada A. Dawa

Abstract Background Total knee replacement (TKR) is the treatment of choice in severe knee osteoarthritis (OA). Rehabilitation post-TKR is still not well studied. The aim of this study was to compare a high-intensity (HI) rehabilitation program with a low-intensity (LI) rehabilitation program following TKR. Results At 1 month following the TKR operations, significant improvements were found in the HI group compared with the LI group in all the measured parameters except for knee range of motion (ROM). At the 3- and 12-month follow-up periods, there were statistically significant differences between the groups in all the evaluated parameters except for the numeric pain rating scale and knee ROM. Conclusions Both high-intensity and low-intensity rehabilitation programs are effective; however, the HI program yielded superior functional gain and patient-reported outcomes compared with the LI program. Moreover, the HI group maintained its functional gain in the long term.


2020 ◽  
pp. 108201322097379
Author(s):  
Jahir Antonio Barajas-Ramírez ◽  
Ana Luisa Gutiérrez-Salomón ◽  
Sonia Guadalupe Sáyago-Ayerdi

The calyces of roselle (Hibiscus sabdariffa L.) are used to make a refreshing drink with a high content of anthocyanins and other phenolic compounds, although the process for obtaining the beverage is not standardized. In this research, physicochemical characteristics, total soluble polyphenol content, antioxidant activity and acceptance were determined for beverages prepared by decoction at four concentrations of calyces in water (1.0, 2.5, 5.0 and 10.0%) and two concentrations of sucrose (11 and 16%). Color parameters allowed the beverages to be described as red, turning darker as the hibiscus content increased. Total soluble polyphenol content and antioxidant activity were directly correlated with the content of calyces in the beverages and inversely correlated with pH, meaning that beverages with a higher content of calyces could be perceived as more acidic and more intense in characteristics associated with hibiscus, such as acidity, astringency and an intense dark red color, although the higher concentration of sucrose might have slightly masked the sourness and astringency. Acceptance ratings revealed two consumer segments, high-intensity and low-intensity likers, but both converged in overall liking for beverages prepared with 2.5% calyces and 16% sucrose.


2014 ◽  
Vol 42 (6) ◽  
pp. 760-764 ◽  
Author(s):  
Katharine A. Rimes ◽  
Janet Wingrove ◽  
Rona Moss-Morris ◽  
Trudie Chalder

Background: Cognitive behavioural interventions are effective in the treatment of chronic fatigue (CF), chronic fatigue syndrome (sometimes known as ME or CFS/ME) and irritable bowel syndrome (IBS). Such interventions are increasingly being provided not only in specialist settings but also in primary care settings such as Improving Access to Psychological Therapies (IAPT) services. There are no existing competences for the delivery of “low-intensity” or “high-intensity” cognitive behavioural interventions for these conditions. Aims: To develop “high-intensity” and “low-intensity” competences for cognitive behavioural interventions for CF, CFS/ME and IBS. Method: The initial draft drew on a variety of sources, including treatment manuals and other information from randomized controlled trials. Therapists with experience in providing cognitive behavioural interventions for CF, CFS/ME and IBS in research and clinical settings were consulted on the initial draft competences, and their suggestions for minor amendments were incorporated into the final versions. Results: Feedback from experienced therapists was positive. Therapists providing low-intensity interventions reported that the competences were also helpful in highlighting training needs. Conclusions: These sets of competences should facilitate the training and supervision of therapists providing cognitive behavioural interventions for CF, CFS/ME and IBS. The competences are available online (see table of contents for this issue: http://journals.cambridge.org/jid_BCP) or on request from the first author.


1988 ◽  
Vol 78 (4) ◽  
pp. 673-682 ◽  
Author(s):  
Garrick McDonald ◽  
A. Mark Smith

Abstract Populations of Nysius vinitor Bergroth were studied from 1979 to 1982 on two weed hosts, Arctotheca calendula and Polygonum aviculare, and in eight irrigated sunflower crops in a summer cropping area of northern Victoria, Australia. The spring generation began with adults colonizing flowering A. calendula plants in September and concluded with the rapid development of late-stage nymphs and an exodus of adults from these plants from mid-November to December. Gradual invasion of sunflowers occurred mostly in late December and reached a peak at flowering, after which nymphs appeared. P. aviculare attracted adults from February and hosted a number of overlapping generations until winter. The weed sustained diminishing numbers of adults through the winter, except in 1982, when a further generation produced an early spring peak. Immigrant populations were regarded as a common source of adults for initiating the spring and summer generations. The rate of development of N. vinitor in spring was more rapid than that predicted by phenological simulation based on ambient temperatures and laboratory-derived day-degree estimates. This was attributed to increased body temperatures through absorption of solar radiation, and the simulation model was adjusted by increasing daily minimum and maximum temperatures by 1.3 and 5.5°C for young and older instars, respectively. This suggested that older nymphs have lower developmental thresholds or are better able to optimize body temperatures.
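
The day-degree adjustment described above lends itself to a short sketch. Assuming the simple averaging method of heat accumulation (the authors' exact simulation may differ), the body-temperature correction amounts to adding a constant offset (+1.3°C for young instars, +5.5°C for older instars) to the daily minimum and maximum temperatures before accumulating day-degrees above the developmental threshold; the threshold and day-degree requirement below are hypothetical:

```python
def day_degrees(t_min: float, t_max: float, t_base: float, offset: float = 0.0) -> float:
    """One day's heat accumulation by the simple averaging method.
    `offset` models the solar-radiation body-temperature correction
    (+1.3 for young instars, +5.5 for older instars, per the abstract)."""
    t_mean = (t_min + offset + t_max + offset) / 2.0
    return max(0.0, t_mean - t_base)

def days_to_requirement(daily_temps, t_base, dd_required, offset=0.0):
    """Count days until a (hypothetical) day-degree requirement is met."""
    total = 0.0
    for day, (t_min, t_max) in enumerate(daily_temps, start=1):
        total += day_degrees(t_min, t_max, t_base, offset)
        if total >= dd_required:
            return day
    return None  # requirement not reached within the temperature record

# Toy comparison: the offset shortens the predicted development time,
# mirroring the faster-than-predicted spring development reported above.
temps = [(10.0, 24.0)] * 60
print(days_to_requirement(temps, t_base=12.0, dd_required=150.0))              # ambient only
print(days_to_requirement(temps, t_base=12.0, dd_required=150.0, offset=5.5))  # older instars
```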

