Estimating burn severity at the regional level using optically based indices

2011 ◽  
Vol 41 (4) ◽  
pp. 863-872 ◽  
Author(s):  
Mihai Tanase ◽  
Juan de la Riva ◽  
Fernando Pérez-Cabello

Over recent decades, the average number of fires per year has increased significantly: a twofold increase was observed in the Mediterranean Basin, and a fourfold increase in the western United States. Regional models for burn severity estimation are necessary to avoid time-consuming and costly fieldwork at each individual site. Furthermore, estimation errors should be assessed by burn severity class to avoid overestimating model accuracy. To develop such models, this study assessed the relationship between the composite burn index (CBI) and several spectral indices across five burned sites in northeastern Spain. Nonlinear models coupled with spectral indices containing information from the short-wave infrared provided the best statistical fit of the data at most individual sites and for the pooled data set. Estimation errors for highly burned sites were well below 10%, but for sites of low and moderate severity the errors increased significantly. A strong linear relation was found between burn severity at the plot level and the understory and overstory composites. This study demonstrates (i) the consistency of the models at the regional level and (ii) the need for new estimation methods in areas affected by low to moderate burn severities, even in relatively homogeneous forests.
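The short-wave-infrared indices the abstract refers to include the Normalized Burn Ratio (NBR) and its pre-/post-fire difference (dNBR), which is commonly regressed against CBI. A minimal sketch; the reflectance values below are hypothetical:

```python
def nbr(nir: float, swir: float) -> float:
    """Normalized Burn Ratio from near-infrared and short-wave
    infrared reflectances (both in [0, 1])."""
    return (nir - swir) / (nir + swir)

def dnbr(pre_fire_nbr: float, post_fire_nbr: float) -> float:
    """Differenced NBR: larger positive values indicate higher burn severity."""
    return pre_fire_nbr - post_fire_nbr

# Hypothetical reflectances for one plot: healthy canopy before the
# fire, charred surface after it.
pre = nbr(nir=0.45, swir=0.15)     # ~0.5
post = nbr(nir=0.15, swir=0.35)    # ~-0.4
severity_signal = dnbr(pre, post)  # ~0.9
```

A regional model of the kind the paper develops would then fit a (possibly nonlinear) curve of CBI against such index values pooled across sites.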

Author(s):  
Parisa Torkaman

The generalized inverted exponential distribution is introduced as a lifetime model with good statistical properties. In this paper, the estimation of the probability density function and the cumulative distribution function of this distribution is considered using five estimation methods: uniformly minimum variance unbiased (UMVU), maximum likelihood (ML), least squares (LS), weighted least squares (WLS), and percentile (PC) estimators. The performance of these estimation procedures is compared on the basis of the mean squared error (MSE) through numerical simulations. The simulation studies show that the UMVU estimator performs better than the others, and that when the sample size is large enough the ML and UMVU estimators are almost equivalent and more efficient than the LS, WLS, and PC estimators. Finally, results for a real data set are analyzed.
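For reference, the generalized inverted exponential distribution has CDF F(x) = 1 − (1 − e^(−λ/x))^α for x > 0. A short sketch of its CDF, PDF, and quantile function, the latter being the basis for the inverse-transform sampling that underlies simulation studies of this kind:

```python
import math

def gie_cdf(x, alpha, lam):
    """CDF of the generalized inverted exponential distribution:
    F(x) = 1 - (1 - exp(-lam/x))**alpha, for x > 0."""
    return 1.0 - (1.0 - math.exp(-lam / x)) ** alpha

def gie_pdf(x, alpha, lam):
    """PDF obtained by differentiating the CDF."""
    e = math.exp(-lam / x)
    return (alpha * lam / x ** 2) * e * (1.0 - e) ** (alpha - 1.0)

def gie_quantile(u, alpha, lam):
    """Inverse CDF; feeding it Uniform(0, 1) variates gives
    inverse-transform samples from the distribution."""
    return -lam / math.log(1.0 - (1.0 - u) ** (1.0 / alpha))
```

The quantile function follows by solving F(x) = u for x; the round trip gie_cdf(gie_quantile(u)) recovers u, which is a quick sanity check on the algebra.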


2021 ◽  
Vol 13 (10) ◽  
pp. 1966
Author(s):  
Christopher W Smith ◽  
Santosh K Panda ◽  
Uma S Bhatt ◽  
Franz J Meyer ◽  
Anushree Badola ◽  
...  

In recent years, there have been rapid improvements in both remote sensing methods and satellite image availability that have the potential to massively improve burn severity assessments of the Alaskan boreal forest. In this study, we utilized recent pre- and post-fire Sentinel-2 satellite imagery of the 2019 Nugget Creek and Shovel Creek burn scars located in Interior Alaska to both assess burn severity across the burn scars and test the effectiveness of several remote sensing methods for generating accurate map products: Normalized Difference Vegetation Index (NDVI), Normalized Burn Ratio (NBR), and Random Forest (RF) and Support Vector Machine (SVM) supervised classification. We used 52 Composite Burn Index (CBI) plots from the Shovel Creek burn scar and 28 from the Nugget Creek burn scar for training the classifiers and validating the products. For the Shovel Creek burn scar, the RF and SVM machine learning (ML) classification methods outperformed the traditional spectral indices, which use linear regression to separate burn severity classes (RF and SVM accuracy, 83.33%, versus NBR accuracy, 73.08%). However, for the Nugget Creek burn scar, the NDVI product (accuracy: 96%) outperformed the other indices and the ML classifiers. In this study, we demonstrated that the ML classifiers can be very effective for reliable mapping of burn severity in the Alaskan boreal forest when sufficient ground truth data are available. Since the performance of ML classifiers depends on the quantity of ground truth data, the ML classification methods are better suited when sufficient ground truth data are available, whereas the traditional spectral indices are better suited when ground truth data are limited. We also examined the relationship between burn severity, fuel type, and topography (aspect and slope) and found that this relationship is site-dependent.
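The "traditional" index-based workflow referred to above amounts to cutting an index such as dNBR into severity classes and scoring overall accuracy against CBI plots. A minimal sketch using commonly cited dNBR class breaks; the plot values and CBI labels below are hypothetical:

```python
def classify_dnbr(dnbr_value, thresholds=(0.1, 0.27, 0.44, 0.66)):
    """Map a dNBR value to a severity class (0 = unburned ... 4 = high
    severity) using hypothetical class breaks."""
    cls = 0
    for t in thresholds:
        if dnbr_value >= t:
            cls += 1
    return cls

def accuracy(predicted, observed):
    """Overall accuracy: fraction of plots whose predicted class
    matches the CBI-derived class."""
    return sum(p == o for p, o in zip(predicted, observed)) / len(observed)

# Hypothetical field plots: dNBR values and CBI-derived severity classes.
dnbr_values = [0.05, 0.20, 0.50, 0.70, 0.30]
cbi_classes = [0, 1, 3, 4, 2]
preds = [classify_dnbr(v) for v in dnbr_values]
acc = accuracy(preds, cbi_classes)
```

An RF or SVM classifier would replace `classify_dnbr` with a model trained on the CBI plots, which is why its accuracy depends on how much ground truth is available.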


2020 ◽  
Vol 70 (5) ◽  
pp. 1211-1230
Author(s):  
Abdus Saboor ◽  
Hassan S. Bakouch ◽  
Fernando A. Moala ◽  
Sheraz Hussain

In this paper, a bivariate extension of the exponentiated Fréchet distribution is introduced, namely the bivariate exponentiated Fréchet (BvEF) distribution, whose marginals are univariate exponentiated Fréchet distributions. Several properties of the proposed distribution are discussed, such as the joint survival function, joint probability density function, marginal probability density functions, conditional probability density function, moments, and marginal and bivariate moment generating functions. Moreover, the proposed distribution is obtained via the Marshall-Olkin survival copula. Estimation of the parameters is investigated by maximum likelihood with the observed information matrix. In addition to maximum likelihood estimation, we consider Bayesian inference and least squares estimation, and compare these three methodologies for the BvEF. A simulation study is carried out to compare the performance of the estimators under the presented estimation methods. The proposed bivariate distribution, along with other related bivariate distributions, is fitted to a real-life paired data set. It is shown that the BvEF distribution has superior performance among the compared distributions according to several goodness-of-fit tests.
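As a point of reference, the univariate exponentiated Fréchet survival function is S(x) = [1 − e^(−(σ/x)^λ)]^α, and applying a Marshall-Olkin copula to two such marginal survival functions yields a joint survival function. A sketch under the simplifying assumption of identical marginal parameters; the paper's exact parameterisation may differ:

```python
import math

def ef_cdf(x, alpha, lam, sigma):
    """CDF of the exponentiated Fréchet distribution:
    F(x) = 1 - [1 - exp(-(sigma/x)**lam)]**alpha, for x > 0."""
    return 1.0 - (1.0 - math.exp(-(sigma / x) ** lam)) ** alpha

def ef_sf(x, alpha, lam, sigma):
    """Survival function S(x) = 1 - F(x)."""
    return (1.0 - math.exp(-(sigma / x) ** lam)) ** alpha

def mo_copula(u, v, a, b):
    """Marshall-Olkin copula: C(u, v) = min(u**(1-a) * v, u * v**(1-b)),
    with a, b in (0, 1)."""
    return min(u ** (1.0 - a) * v, u * v ** (1.0 - b))

def joint_sf(x, y, alpha, lam, sigma, a, b):
    """Joint survival of the bivariate sketch: the Marshall-Olkin copula
    applied to the two marginal survival functions."""
    return mo_copula(ef_sf(x, alpha, lam, sigma),
                     ef_sf(y, alpha, lam, sigma), a, b)
```

The copula construction guarantees that both marginals of the joint model remain exponentiated Fréchet, which matches the abstract's description.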


Risks ◽  
2021 ◽  
Vol 9 (11) ◽  
pp. 204
Author(s):  
Chamay Kruger ◽  
Willem Daniel Schutte ◽  
Tanja Verster

This paper proposes a methodology that utilises model performance as a metric to assess the representativeness of external or pooled data when it is used by banks in regulatory model development and calibration. There is currently no formal methodology to assess representativeness. The paper provides a review of existing regulatory literature on the requirements of assessing representativeness and emphasises that both qualitative and quantitative aspects need to be considered. We present a novel methodology, apply it to two case studies, and compare it with the Multivariate Prediction Accuracy Index. The first case study investigates whether a pooled data source from Global Credit Data (GCD) is representative when considering the enrichment of internal data with pooled data in the development of a regulatory loss given default (LGD) model. The second case study differs from the first by illustrating which other countries in the pooled data set could be representative when enriching internal data during the development of an LGD model. Using these case studies as examples, our proposed methodology provides users with a generalised framework to identify subsets of the external data that are representative of their country's or bank's data, making the results general and widely applicable.


Blood ◽  
2020 ◽  
Vol 136 (Supplement 1) ◽  
pp. 35-36
Author(s):  
Sandra Tong ◽  
Robert P. Numerof ◽  
Jane Datangel ◽  
Esteban Masuda

Introduction: Fostamatinib is an oral, potent inhibitor of spleen tyrosine kinase (SYK) with proven efficacy and a manageable safety profile for the treatment of ITP. SYK is situated in an intracellular signaling pathway upstream of Bruton's tyrosine kinase (BTK). Long-term safety data on fostamatinib at various dosing regimens (up to 150 mg BID) has been collected in >4000 patients with ITP, rheumatoid arthritis (RA), and other autoimmune, allergic and neoplastic disorders. The safety and tolerability of fostamatinib were consistent across different patient populations (apart from disease specific events). We present a summary analysis of the fostamatinib safety data from the ITP and RA studies. Methods: Fostamatinib safety data from 2 randomized, double-blind, placebo-controlled, phase 3 studies and the long-term, open-label, extension (OLE) study in ITP were pooled and are based on a starting dose of 200 mg/day, which was increased to 300 mg/day after 4 weeks in 88% of patients. Fostamatinib safety data from 13 phase 2/3 studies in RA were pooled and are based on a dosing regimen of 100-150 mg/day (n=1232) or 200-300 mg/day (n=2205). Results: The pooled data set for ITP included 146 patients; 60% were female, and the median age was 53 years (range 20-88). The mean duration of fostamatinib treatment was 19 months (range <1-62 months), representing 229 patient exposure years. Adverse events (AEs) were reported in 87% of patients, and 63% were mild to moderate. Serious AEs were reported in 31% of patients. The incidence of diarrhea, hypertension, alanine aminotransferase increase (ALT), and aspartate aminotransferase (AST) increase was evaluated in 58 patients who received fostamatinib for ≥1 year. This enabled a comparison of the incidence of these AEs in quartiles over the first year to assess the cumulative effects of fostamatinib. 
The AEs were reported with decreasing frequency during the second, third, and fourth quarters of fostamatinib treatment compared with the first quarter of the initial year of treatment in the 58 patients (see Figure 1). In the same 58 patients, the use of rescue therapy decreased while median platelet counts increased each quarter of the first year. The pooled data set for RA included 3437 patients who received fostamatinib; 83% were female, and the median age was 54 years (range 18-87). The mean duration of treatment was 18 months (range <1-81 months), representing 5134 patient exposure years. AEs were reported in 86% of RA patients and were mild to moderate in 73% of RA patients. Serious AEs occurred in 14%. In the placebo-controlled RA studies, 2414 patients received fostamatinib with 823 patient exposure years and 1169 received placebo with 367 patient exposure years. Despite a two-fold (125%) increase in exposure with fostamatinib vs placebo (823 vs 367 patient exposure years), there was only a 26% increase in AEs with fostamatinib vs placebo (68% vs 54%). The most common events in the ITP and RA studies were diarrhea (36% and 24%), hypertension (22% and 19%) and nausea (19% and 8%), apart from disease-related AEs. Epistaxis (19% and 0.5%), petechiae (15% and 0.3%), contusion (12% and 2%), and fatigue (10% and 2%) are associated with ITP and were uncommon in the RA population. Rheumatoid arthritis was reported as an AE in 9% of patients with RA and in none with ITP. Some AEs may be dose-related, and one-third of the RA patients were on lower dosages (100-150 mg/day) than were generally given in the ITP trials (200-300 mg/day). Conclusions: Fostamatinib has been evaluated in >4000 patients across different disease populations. Fostamatinib has a consistent and manageable safety profile. No new safety signals and no cumulative toxicity were observed with up to 81 months (6.8 years) of continuous treatment.
Figure 1. Disclosures: Tong: Rigel (current employment, current equity holder in publicly traded company). Numerof: Rigel (current employment, current equity holder in publicly traded company). Datangel: Rigel (current employment, current equity holder in publicly traded company). Masuda: Rigel (current employment, current equity holder in publicly traded company).


In this paper, we define a new two-parameter Lindley half-Cauchy (NLHC) distribution using the Lindley-G family of distributions, which accommodates increasing, decreasing and a variety of monotone failure rates. Statistical properties of the proposed distribution such as the probability density function, cumulative distribution function, quantiles, and measures of skewness and kurtosis are presented. We briefly describe three well-known estimation methods, namely the maximum likelihood (MLE), least-squares (LSE) and Cramér-von Mises (CVM) methods. All computations are performed in the R software. Using the maximum likelihood method, we construct asymptotic confidence intervals for the model parameters. We verify empirically the potential of the new distribution for modeling a real data set.
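The asymptotic confidence intervals mentioned above follow the usual Wald construction: MLE ± z · (standard error from the inverse Fisher information). Since the NLHC likelihood is not reproduced here, the sketch below uses the exponential rate parameter as a hypothetical stand-in, where the MLE and its standard error have closed forms:

```python
import math

def wald_ci_exponential_rate(data, z=1.96):
    """Asymptotic (Wald) 95% confidence interval for the rate of an
    exponential model: the MLE is n / sum(data), and its standard
    error is mle / sqrt(n) from the inverse Fisher information."""
    n = len(data)
    mle = n / sum(data)
    se = mle / math.sqrt(n)
    return mle - z * se, mle + z * se
```

For the NLHC parameters the same recipe applies, except that the MLE and observed information are obtained numerically from the NLHC log-likelihood rather than in closed form.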


2021 ◽  
Vol 20 ◽  
pp. 288-299
Author(s):  
Refah Mohammed Alotaibi ◽  
Yogesh Mani Tripathi ◽  
Sanku Dey ◽  
Hoda Ragab Rezk

In this paper, inference on stress-strength reliability is considered for unit-Weibull distributions with a common parameter, under the assumption that data are observed under progressive type-II censoring. We obtain different estimators of system reliability using classical and Bayesian procedures. An asymptotic interval is constructed based on the Fisher information matrix. Besides, boot-p and boot-t intervals are also obtained. We evaluate Bayes estimates using Lindley's technique and the Metropolis-Hastings (MH) algorithm. The Bayes credible interval is evaluated using the MH method. An unbiased estimator of this parametric function is also obtained for the known common-parameter case. Numerical simulations are performed to compare the estimation methods. Finally, a data set is studied for illustration purposes.
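The boot-p (percentile bootstrap) interval mentioned above can be sketched for a nonparametric estimate of R = P(X > Y); this simplified version assumes complete samples rather than progressively type-II censored data, and the sample values below are hypothetical:

```python
import random

def reliability_estimate(xs, ys):
    """Nonparametric estimate of R = P(X > Y): the fraction of
    (x, y) pairs with x > y."""
    return sum(x > y for x in xs for y in ys) / (len(xs) * len(ys))

def boot_p_interval(xs, ys, n_boot=1000, alpha=0.05, seed=42):
    """Percentile (boot-p) bootstrap interval for R: resample each
    sample with replacement, recompute R, and take the empirical
    alpha/2 and 1 - alpha/2 quantiles of the bootstrap replicates."""
    rng = random.Random(seed)
    stats = sorted(
        reliability_estimate(rng.choices(xs, k=len(xs)),
                             rng.choices(ys, k=len(ys)))
        for _ in range(n_boot)
    )
    lo = stats[int(n_boot * alpha / 2)]
    hi = stats[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

strength = [3.0, 4.0, 5.0, 6.0]  # hypothetical strength sample (X)
stress = [1.0, 2.0, 3.0, 4.0]    # hypothetical stress sample (Y)
r_hat = reliability_estimate(strength, stress)
ci = boot_p_interval(strength, stress)
```

A boot-t interval would additionally studentize each replicate by its estimated standard error before taking quantiles.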


Geophysics ◽  
2018 ◽  
Vol 83 (4) ◽  
pp. M41-M48 ◽  
Author(s):  
Hongwei Liu ◽  
Mustafa Naser Al-Ali

The ideal approach for continuous reservoir monitoring allows generation of fast and accurate images to cope with the massive data sets acquired for such a task. Conventionally, rigorous depth-oriented velocity-estimation methods are performed to produce sufficiently accurate velocity models. Unlike the traditional way, the target-oriented imaging technology based on the common-focus point (CFP) theory can be an alternative for continuous reservoir monitoring. The solution is based on a robust data-driven iterative operator updating strategy without deriving a detailed velocity model. The same focusing operator is applied on successive 3D seismic data sets for the first time to generate efficient and accurate 4D target-oriented seismic stacked images from time-lapse field seismic data sets acquired in a [Formula: see text] injection project in Saudi Arabia. Using the focusing operator, target-oriented prestack angle domain common-image gathers (ADCIGs) could be derived to perform amplitude-versus-angle analysis. To preserve the amplitude information in the ADCIGs, an amplitude-balancing factor is applied by embedding a synthetic data set using the real acquisition geometry to remove the geometry imprint artifact. Applying the CFP-based target-oriented imaging to time-lapse data sets revealed changes at the reservoir level in the poststack and prestack time-lapse signals, which is consistent with the [Formula: see text] injection history and rock physics.


Author(s):  
Mekides Assefa Abebe ◽  
Jon Yngve Hardeberg ◽  
Gunnar Vartdal

In recent years, smartphone-based colour imaging systems have been increasingly used for neonatal jaundice detection. These systems estimate bilirubin concentration levels from newborns' skin colour images, calibrated against total serum bilirubin (TSB) and transcutaneous bilirubinometry (TcB) measurements. However, the colour reproduction capacity of smartphone cameras is known to be influenced by various factors, including technological and acquisition-process variability. To make an accurate bilirubin estimation, irrespective of the type of smartphone and the illumination conditions used to capture the newborns' skin images, an inclusive and complete model, or data set, representing all possible real-world acquisition scenarios needs to be utilized. Owing to the challenges of generating such a model or data set, some solutions rely on a reduced data set (designed for reference conditions and devices only) combined with colour correction systems (for transforming other smartphones' skin images to the reference space). Such approaches make the bilirubin estimation methods highly dependent on the accuracy of the employed colour correction systems and on their capability to reduce device-to-device colour reproduction variability. However, state-of-the-art methods with similar methodologies have only been evaluated and validated on a single smartphone camera. The vulnerability of these systems to incorrect jaundice diagnosis can only be shown through a thorough investigation of colour reproduction variability across an extended number of smartphones and illumination conditions. Accordingly, this work presents and discusses the results of such a broad investigation, covering seven smartphone cameras, ten light sources, and three different colour correction approaches.
The overall results show statistically significant colour differences among devices, even after colour correction, and indicate that further analysis of the clinical significance of such differences is required for skin-colour-based jaundice diagnosis.
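Device-to-device colour differences of the kind reported here are typically quantified in CIELAB space; the simplest such metric is the CIE76 colour difference, i.e. the Euclidean distance between two Lab coordinates. A minimal sketch with hypothetical measurements:

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance between two
    (L*, a*, b*) coordinates."""
    return math.dist(lab1, lab2)

# Hypothetical CIELAB measurements of the same skin patch by two
# smartphone cameras after colour correction.
device_a = (62.0, 14.0, 18.0)
device_b = (60.0, 17.0, 12.0)
diff = delta_e_76(device_a, device_b)
```

Differences above roughly 2-3 ΔE units are generally considered perceptible; more refined metrics such as CIEDE2000 weight the lightness, chroma, and hue components differently but follow the same pairwise-comparison pattern.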


2019 ◽  
Vol 71 ◽  
pp. 04004
Author(s):  
T. Krasnova ◽  
T. Plotnikova ◽  
A. Pozdnyakov ◽  
A. Vilgelm

This paper proposes a new approach for monitoring the modernisation of regional economies. A model built on the proposed methodology makes it possible to smooth out the influence of non-urban areas on the unevenness of economic activity in spatial development. This paper has two goals. The first is to provide a new compilation of data on the spatial distribution of economic activity at the sub-regional level; this data set allows us to monitor different indicators within macroregions such as Siberia. The second is to construct an instrument that helps to overcome the endogeneity problem, using the new economic geography hypothesis about the mechanisms of distribution of economic activity. Section 2 describes the data and the proposed method and discusses the construction of Theil indexes from these data at the sub-federal and sub-regional levels. Section 3 presents the correlations between the spatial distribution of economic activity and local market potential and discusses the robustness of the results; the last section concludes.
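The Theil index used in Section 2 is, for positive values x_i with mean μ, T = (1/N) Σ (x_i/μ) ln(x_i/μ); it is zero for a perfectly even distribution and decomposes additively into between-group and within-group components, which is what makes it suitable for sub-federal and sub-regional monitoring. A minimal sketch with hypothetical activity values:

```python
import math

def theil_index(values):
    """Theil T index of inequality for positive values:
    T = (1/N) * sum((x/mu) * ln(x/mu)), where mu is the mean.
    Zero means a perfectly even distribution."""
    n = len(values)
    mu = sum(values) / n
    return sum((x / mu) * math.log(x / mu) for x in values) / n

# Hypothetical economic activity across four sub-regional units.
even = theil_index([5.0, 5.0, 5.0, 5.0])       # perfectly even -> 0
uneven = theil_index([1.0, 2.0, 3.0, 14.0])    # concentrated -> positive
```

Grouping the units into macroregions and summing the within-group Theil indexes (weighted by group income shares) plus the between-group index recovers the total, which is the decomposition property exploited in such monitoring.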

