Biochemical characterization of cartilages by a novel approach of blunt impact testing

2007 ◽  
pp. S61-S68
Author(s):  
F Varga ◽  
M Držík ◽  
M Handl ◽  
J Chlpík ◽  
P Kos ◽  
...  

The present article introduces a novel method of characterizing macromechanical cartilage properties based on dynamic testing. The proposed instrumented impact testing approach enables more detailed investigation of the acting dynamic forces and corresponding deformations over a wide range of strain rates and loads, including the unloading part of stress-strain curves and hysteresis loops. The presented results of unconfined compression testing of both native joint cartilage tissues and potential substitute materials show that the dissipation energy can be measured, making it possible to identify the initial symptoms of mechanical deterioration and to define material damage more precisely. Based on the analysis of measured specimen deformation, intact and pathologically changed cartilage tissue can be distinguished and the differences revealed.

Author(s):  
Denis Tikhomirov

The purpose of the article is to typologize terminological definitions of security, to identify what they have in common, and to show how their interpretations vary depending on the subject of legal regulation. The methodological basis of the study comprises the methods that made valid conclusions possible: the method of comparison, through which different interpretations of the term "security" could be correlated; the method of hermeneutics, which allowed the texts of normative legal acts of Ukraine to be elaborated; and the method of typologization, which made it possible to group the variants of understanding of the term "security". Scientific novelty. The article analyzes the understanding of the term "security" in various regulatory acts in force in Ukraine and forms typological groups of its interpretations. Conclusions. The analysis of the legal material confirms that security issues fall within the scope of both legislative regulation and various specialized by-laws. However, there is currently no single conception of how to interpret security terminology. This is due both to the wide range of social relations that are the subject of legal regulation and to the relativity of the notion of security itself, as well as the lack of coherence between its definitions in legal acts and in the scientific literature. The multiplicity of definitions is explained by combinations of material and procedural understandings, static and dynamic ones, and is conditioned by the peculiarities of a particular branch of legal regulation, the limited ability to use the methods of one or another branch, the inter-branch nature of some variations of security, and so on. Identifying what is common and what is distinct in the definitions of "security" can be used to further standardize the regulatory legal understanding of security and thus implement legal regulation in the security field more effectively.


Author(s):  
Tim Rutherford-Johnson

By the start of the 21st century many of the foundations of postwar culture had disappeared: Europe had been rebuilt and, as the EU, had become one of the world’s largest economies; the United States’ claim to global dominance was threatened; and the postwar social democratic consensus was being replaced by market-led neoliberalism. Most importantly of all, the Cold War was over, and the World Wide Web had been born. Music After The Fall considers contemporary musical composition against this changed backdrop, placing it in the context of globalization, digitization, and new media. Drawing on theories from the other arts, in particular art and architecture, it expands the definition of Western art music to include forms of composition, experimental music, sound art, and crossover work from across the spectrum, inside and beyond the concert hall. Each chapter critically considers a wide range of composers, performers, works, and institutions to build up a broad and rich picture of the new music ecosystem, from North American string quartets to Lebanese improvisers, from South American electroacoustic studios to pianos in the Australian outback. A new approach to the study of contemporary music is developed that relies less on taxonomies of style and technique, and more on the comparison of different responses to common themes, among them permission, fluidity, excess, and loss.


2021 ◽  
Vol 28 ◽  
Author(s):  
Ersin Karataş ◽  
Ahmet Tülek ◽  
Mehmet Mervan Çakar ◽  
Faruk Tamtürk ◽  
Fatih Aktaş ◽  
...  

Background: Polygalacturonases are a group of pectinolytic enzymes that hydrolyse pectic substances. Polygalacturonases have been used in various industrial applications such as fruit juice clarification, retting of plant fibers, wastewater treatment, drink fermentation, and oil extraction. Objectives: The study evaluated the heterologous expression, purification, biochemical characterization, computational modeling, and apple juice clarification performance of a new exo-polygalacturonase from Sporothrix schenckii 1099-18 (SsExo-PG) in Pichia pastoris. Methods: Recombinant DNA technology was used in this study. Two different pPIC9K plasmids were constructed, one carrying the native signal sequence-ssexo-pg and one the alpha signal sequence-ssexo-pg. Protein expression and purification were performed after the plasmids were transformed into Pichia pastoris. Biochemical and structural analyses were performed using pure SsExo-PG. Results: SsExo-PG was purified using a Ni-NTA chromatography system. The enzyme was found to have a molecular mass of approximately 52 kDa. SsExo-PG was stable over a wide range of temperature and pH values and was more storage-stable than other commercial pectinolytic enzyme mixtures. Structural analysis revealed that the catalytic residues of SsExo-PG are similar to those of other exo-PGs. The KM and kcat values for the degradation of polygalacturonic acid (PGA) by the purified enzyme were found to be 0.5868 µM and 179 s-1, respectively. Cu2+ was found to enhance SsExo-PG activity, while Ag2+ and Fe2+ almost completely inhibited it. The enzyme reduced turbidity by up to 80%, thus enhancing the clarification of apple juice. SsExo-PG showed promising performance compared with other commercial pectinolytic enzyme mixtures. Conclusion: The clarification potential of SsExo-PG was revealed by comparing it with commercial pectinolytic enzymes. The parameters of the apple juice clarification process showed that SsExo-PG is highly stable and performs remarkably well.
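The reported kinetic constants can be read through the standard Michaelis-Menten model. A minimal sketch using the KM and kcat values from the abstract (the enzyme concentration below is a hypothetical placeholder, not a value from the study):

```python
def michaelis_menten_rate(s_um, km_um=0.5868, kcat_per_s=179.0, e_um=1.0):
    """Initial velocity v = kcat * [E] * [S] / (KM + [S]).
    KM and kcat are the values reported for SsExo-PG; the enzyme
    concentration e_um is a hypothetical placeholder."""
    return kcat_per_s * e_um * s_um / (km_um + s_um)

# At [S] = KM the velocity is exactly half of Vmax = kcat * [E].
half_vmax = michaelis_menten_rate(0.5868)
```

At substrate concentrations well above KM the rate saturates toward Vmax, which is why kcat is the natural per-enzyme turnover figure to report.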


Author(s):  
Branka Vulesevic ◽  
Naozumi Kubota ◽  
Ian G Burwash ◽  
Claire Cimadevilla ◽  
Sarah Tubiana ◽  
...  

Abstract Aims Severe aortic valve stenosis (AS) is defined by an aortic valve area (AVA) <1 cm2 or an AVA indexed to body surface area (BSA) <0.6 cm2/m2, despite little evidence supporting the latter approach and important intrinsic limitations of BSA indexation. We hypothesized that AVA indexed to height (H) might be more applicable to a wide range of populations and body morphologies and might provide a better predictive accuracy. Methods and results In 1298 patients with degenerative AS and preserved ejection fraction from three different countries and continents (derivation cohort), we aimed to establish an AVA/H threshold that would be equivalent to 1.0 cm2 for defining severe AS. In a distinct prospective validation cohort of 395 patients, we compared the predictive accuracy of AVA/BSA and AVA/H. Correlations between AVA and AVA/BSA or AVA/H were excellent (all R2 > 0.79) but greater with AVA/H. Regression lines were markedly different in obese and non-obese patients with AVA/BSA (P < 0.0001) but almost identical with AVA/H (P = 0.16). AVA/BSA values that corresponded to an AVA of 1.0 cm2 were markedly different in obese and non-obese patients (0.48 and 0.59 cm2/m2) but not with AVA/H (0.61 cm2/m for both). Agreement for the diagnosis of severe AS (AVA < 1 cm2) was significantly higher with AVA/H than with AVA/BSA (P < 0.05). Similar results were observed across the three countries. An AVA/H cut-off value of 0.6 cm2/m [HR = 8.2(5.6–12.1)] provided the best predictive value for the occurrence of AS-related events [absolute AVA of 1 cm2: HR = 7.3(5.0–10.7); AVA/BSA of 0.6 cm2/m2: HR = 6.7(4.4–10.0)]. Conclusion In a large multinational/multiracial cohort, AVA/H was better correlated with AVA than AVA/BSA and a cut-off value of 0.6 cm2/m provided a better diagnostic and prognostic value than 0.6 cm2/m2. Our results suggest that severe AS should be defined as an AVA < 1 cm2 or an AVA/H < 0.6 cm2/m rather than a BSA-indexed value of 0.6 cm2/m2.
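The proposed criterion is a simple division and threshold check. A minimal sketch of the decision rule (function name and example values are illustrative, not patient data from the study):

```python
def is_severe_as(ava_cm2, height_m):
    """Severe aortic stenosis under the criterion proposed in the
    abstract: AVA < 1 cm^2, or AVA indexed to height < 0.6 cm^2/m."""
    return ava_cm2 < 1.0 or (ava_cm2 / height_m) < 0.6

# A 1.1 cm^2 valve in a 1.90 m tall patient indexes to ~0.58 cm^2/m,
# i.e. severe by AVA/H even though the absolute AVA exceeds 1 cm^2.
tall_patient_severe = is_severe_as(1.1, 1.90)
```

Height indexation scales the threshold with body size while avoiding the obesity-driven distortions that BSA indexation introduces.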


2021 ◽  
Vol 5 (EICS) ◽  
pp. 1-23
Author(s):  
Markku Laine ◽  
Yu Zhang ◽  
Simo Santala ◽  
Jussi P. P. Jokinen ◽  
Antti Oulasvirta

Over the past decade, responsive web design (RWD) has become the de facto standard for adapting web pages to the wide range of devices used for browsing. While RWD has improved the usability of web pages, it is not without drawbacks and limitations: designers and developers must manually design the web layouts for multiple screen sizes and implement associated adaptation rules, and its "one responsive design fits all" approach lacks support for personalization. This paper presents a novel approach for automated generation of responsive and personalized web layouts. Given an existing web page design and preferences related to design objectives, our integer programming-based optimizer generates a consistent set of web designs. Where relevant data is available, these can be further automatically personalized for the user and browsing device. The paper also presents techniques for runtime adaptation of the generated designs into a fully responsive grid layout for web browsing. Results from our ratings-based online studies with end users (N = 86) and designers (N = 64) show that the proposed approach can automatically create high-quality responsive web layouts for a variety of real-world websites.
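The idea of scoring candidate layouts against weighted design objectives can be illustrated with a much-simplified toy sketch; the paper's actual optimizer is integer-programming based and handles far richer constraints, and every name, weight, and cost term below is a hypothetical stand-in:

```python
def best_layout(preferred_widths, column_options=(1, 2, 3, 4),
                screen_width=360, w_pref=1.0, w_fill=1.0):
    """Pick the column count whose grid cells best satisfy two weighted
    objectives: closeness of each cell to the elements' preferred
    widths, and few empty cells left in the last grid row."""
    best_cols, best_cost = None, float("inf")
    for cols in column_options:
        cell = screen_width / cols
        # Average deviation of the cell width from preferred widths.
        pref_cost = sum(abs(cell - w) for w in preferred_widths) / len(preferred_widths)
        rows = -(-len(preferred_widths) // cols)        # ceiling division
        # Wasted pixels from empty cells in the final row.
        fill_cost = (rows * cols - len(preferred_widths)) * cell
        cost = w_pref * pref_cost + w_fill * fill_cost
        if cost < best_cost:
            best_cols, best_cost = cols, cost
    return best_cols
```

Changing the objective weights per user or device is the hook for personalization: the same search yields different layouts under different preference profiles.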


2021 ◽  
Vol 15 (5) ◽  
pp. 1-32
Author(s):  
Quang-huy Duong ◽  
Heri Ramampiaro ◽  
Kjetil Nørvåg ◽  
Thu-lan Dam

Dense subregion (subgraph & subtensor) detection is a well-studied area, with a wide range of applications, and numerous efficient approaches and algorithms have been proposed. Approximation approaches are commonly used for detecting dense subregions due to the complexity of the exact methods. Existing algorithms are generally efficient for dense subtensor and subgraph detection, and can perform well in many applications. However, most existing works rely on the state-of-the-art greedy 2-approximation algorithm, which provides solutions with only a loose theoretical density guarantee. The main drawback of most of these algorithms is that they can estimate only one subtensor, or subgraph, at a time, with a low guarantee on its density. While some methods can, on the other hand, estimate multiple subtensors, they can give a guarantee on the density with respect to the input tensor for the first estimated subtensor only. We address these drawbacks by providing both theoretical and practical solutions for estimating multiple dense subtensors in tensor data with a higher lower bound on the density. In particular, we guarantee and prove a higher bound on the lower-bound density of the estimated subgraphs and subtensors. We also propose a novel approach showing that there are multiple dense subtensors whose guaranteed density is greater than the lower bound used in the state-of-the-art algorithms. We evaluate our approach with extensive experiments on several real-world datasets, which demonstrates its efficiency and feasibility.
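The greedy 2-approximation the abstract refers to is, in the graph case, the classic peeling algorithm: repeatedly delete a minimum-degree vertex and keep the densest intermediate subgraph seen. A self-contained sketch:

```python
from collections import defaultdict

def densest_subgraph_greedy(edges):
    """Classic greedy peeling (the 2-approximation scheme the abstract
    refers to): repeatedly remove a minimum-degree vertex and return
    the vertex set with the highest density |E|/|V| observed."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    nodes, m = set(adj), len(edges)
    best_set, best_density = set(nodes), 0.0
    while nodes:
        density = m / len(nodes)
        if density > best_density:
            best_set, best_density = set(nodes), density
        u = min(nodes, key=lambda x: len(adj[x]))   # min-degree vertex
        for w in adj[u]:
            adj[w].discard(u)                       # drop u's incident edges
        m -= len(adj[u])
        nodes.discard(u)
        adj[u].clear()
    return best_set, best_density

# K4 plus a pendant vertex: peeling recovers the K4 (density 6/4 = 1.5).
k4_plus_tail = [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4), (4, 5)]
dense_set, density = densest_subgraph_greedy(k4_plus_tail)
```

Note the single-shot nature of the procedure: it returns one subgraph with a density at least half the optimum, which is exactly the "one subregion at a time, loose guarantee" limitation the paper sets out to improve on.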


2021 ◽  
Vol 31 ◽  
Author(s):  
ANDREA VEZZOSI ◽  
ANDERS MÖRTBERG ◽  
ANDREAS ABEL

Abstract Proof assistants based on dependent type theory provide expressive languages for both programming and proving within the same system. However, all of the major implementations lack powerful extensionality principles for reasoning about equality, such as function and propositional extensionality. These principles are typically added axiomatically which disrupts the constructive properties of these systems. Cubical type theory provides a solution by giving computational meaning to Homotopy Type Theory and Univalent Foundations, in particular to the univalence axiom and higher inductive types (HITs). This paper describes an extension of the dependently typed functional programming language Agda with cubical primitives, making it into a full-blown proof assistant with native support for univalence and a general schema of HITs. These new primitives allow the direct definition of function and propositional extensionality as well as quotient types, all with computational content. Additionally, thanks also to copatterns, bisimilarity is equivalent to equality for coinductive types. The adoption of cubical type theory extends Agda with support for a wide range of extensionality principles, without sacrificing type checking and constructivity.


2021 ◽  
Vol 5 (1) ◽  
pp. 38
Author(s):  
Chiara Giola ◽  
Piero Danti ◽  
Sandro Magnani

In the age of AI, companies strive to extract benefits from data. In the first steps of data analysis, an arduous dilemma scientists have to cope with is defining the 'right' quantity of data needed for a certain task. In energy management in particular, one of the most thriving applications of AI is optimizing the consumption of energy plant generators. When designing a strategy to improve the generators' schedule, an essential piece of information is the future energy load requested by the plant. This topic, referred to in the literature as load forecasting, has lately gained great popularity; in this paper the authors address the problem of estimating the correct size of the dataset needed to train prediction algorithms and propose a suitable methodology. The central tool of this methodology is the learning curve, a powerful means of tracking algorithm performance as the training-set size varies. At first, a brief review of the state of the art and a short analysis of eligible machine learning techniques are offered. Furthermore, the hypotheses and constraints of the work are explained, presenting the dataset and the goal of the analysis. Finally, the methodology is elucidated and the results are discussed.
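A learning curve of this kind can be sketched in a few lines: fit a simple model on growing training subsets and record the validation error at each size. The linear model and synthetic data below are illustrative placeholders, not the paper's forecasting models or dataset:

```python
import random

def learning_curve(train, val, sizes=(10, 20, 40, 80)):
    """Fit a 1-D least-squares line on growing training subsets and
    record validation mean-squared error at each size.  Plotting MSE
    against training size gives the learning curve used to judge how
    much data a task needs."""
    curve = []
    for n in sizes:
        xs = [x for x, _ in train[:n]]
        ys = [y for _, y in train[:n]]
        xm, ym = sum(xs) / n, sum(ys) / n
        slope = (sum((x - xm) * (y - ym) for x, y in zip(xs, ys))
                 / sum((x - xm) ** 2 for x in xs))
        intercept = ym - slope * xm
        mse = sum((slope * x + intercept - y) ** 2 for x, y in val) / len(val)
        curve.append((n, mse))
    return curve

# Synthetic "load" data: y = 2x + 1 plus noise, shuffled before splitting
# into a held-out validation set and a training pool.
random.seed(0)
data = [(x / 10, 2 * (x / 10) + 1 + random.gauss(0, 0.1)) for x in range(240)]
random.shuffle(data)
curve = learning_curve(train=data[60:], val=data[:60])
```

Where the curve flattens, additional data stops paying off; that plateau is the "right" training-set size the methodology aims to identify.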


Author(s):  
Ying Pin Chua ◽  
Ying Xie ◽  
Poay Sian Sabrina Lee ◽  
Eng Sing Lee

Background: Multimorbidity presents a key challenge to healthcare systems globally. However, heterogeneity in the definition of multimorbidity and in the design of epidemiological studies makes multimorbidity studies difficult to compare. This scoping review aimed to describe multimorbidity prevalence in studies using large datasets and to report the differences in multimorbidity definition and study design. Methods: We conducted a systematic search of the MEDLINE, EMBASE, and CINAHL databases to identify large epidemiological studies on multimorbidity. We used the Preferred Reporting Items for Systematic Reviews and Meta-Analyses Extension for Scoping Reviews (PRISMA-ScR) protocol for reporting the results. Results: Twenty articles were identified. We found two key definitions of multimorbidity: at least two (MM2+) or at least three (MM3+) chronic conditions. The prevalence of multimorbidity ranged from 15.3% to 93.1% under MM2+ and from 11.8% to 89.7% under MM3+. The number of chronic conditions used by the articles ranged from 15 to 147, organized into 21 body system categories. There were seventeen cross-sectional studies and three retrospective cohort studies, and four diagnosis coding systems were used. Conclusions: We found a wide range in the reported prevalence, definition, and conduct of multimorbidity studies. Obtaining consensus in these areas will facilitate better understanding of the magnitude and epidemiology of multimorbidity.
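The two counting definitions are straightforward to operationalize, which makes the sensitivity of prevalence to the chosen threshold easy to see. A minimal sketch (the condition labels are hypothetical examples):

```python
def is_multimorbid(conditions, threshold=2):
    """MM2+ (threshold=2) or MM3+ (threshold=3): a patient counts as
    multimorbid if they have at least `threshold` distinct chronic
    conditions."""
    return len(set(conditions)) >= threshold

# Hypothetical patients with illustrative condition labels.
patients = [
    {"diabetes", "hypertension"},
    {"asthma"},
    {"diabetes", "hypertension", "copd"},
]
mm2_prevalence = sum(is_multimorbid(p, 2) for p in patients) / len(patients)
mm3_prevalence = sum(is_multimorbid(p, 3) for p in patients) / len(patients)
```

Even in this toy cohort the prevalence differs between definitions, which mirrors the wide MM2+ versus MM3+ ranges the review reports.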

