Once again about the hapax grammar: Epigenetic Linguistics

2020 ◽  
Vol 3 (1) ◽  
pp. 23-27
Author(s):  
Dan Faltýnek ◽  
Ľudmila Lacková ◽  
Hana Owsianková

Abstract In this article, we deal with the similarity between epigenetic marks in DNA and hapax legomena in language; based on these so-called hapaxes, a grammar description is designed. We build on the hapax analysis of Czech provided by Novotná (2013) and avoid random selection of the corpus. For this reason, we analyze a corpus of 12 authentic books by 12 authors who elaborated the theme “What’s new in…” concerning their field of science, commissioned by the Nová beseda publishing house. By analyzing a middle-sized corpus, we expected results similar to those of a large-scale national corpus (see Novotná 2013). We classified hapaxes into categories different from Novotná's, yet the results show similarly productive language categories. This kind of language potentiality appears analogous to epigenetic processes in biology, which we briefly introduce.
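The hapax extraction the abstract relies on is straightforward to reproduce. Below is a minimal sketch, not from the paper and using an invented toy corpus, of identifying hapax legomena with a simple frequency count:

```python
from collections import Counter

def hapax_legomena(tokens):
    """Return, sorted, the words that occur exactly once in the corpus."""
    counts = Counter(tokens)
    return sorted(w for w, n in counts.items() if n == 1)

corpus = "the cat sat on the mat and the dog barked".split()
hapaxes = hapax_legomena(corpus)  # every word except 'the', which occurs 3 times
```

In a real corpus study, tokenization and lemmatization (especially for an inflected language like Czech) would precede the count.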

2007 ◽  
Vol 135 (6) ◽  
pp. 2135-2154 ◽  
Author(s):  
Young-Hwa Byun ◽  
Song-You Hong

Abstract This study describes a revised approach for the subgrid-scale convective properties of a moist convection scheme in a global model and evaluates its effects on a simulated model climate. The subgrid-scale convective processes tested in this study comprise three components: 1) the random selection of cloud top, 2) the inclusion of convective momentum transport, and 3) a revised large-scale destabilization effect considering synoptic-scale forcing in the cumulus convection scheme of the National Centers for Environmental Prediction medium-range forecast model. Each component in the scheme has been evaluated within a single-column model (SCM) framework forced by the Tropical Ocean Global Atmosphere Coupled Ocean–Atmosphere Response Experiment data. The impact of the changes in the scheme on seasonal predictions has been examined for the boreal summers of 1996, 1997, and 1999. In the SCM simulations, an experiment that includes all the modifications reproduces the typical convective heating and drying feature. The simulated surface rainfall is in good agreement with the observed precipitation. Random selection of the cloud top effectively moistens and cools the upper troposphere, and it induces drying and warming below the cloud-top level due to the cloud–radiation feedback. However, the two other components in the revised scheme do not play a significant role in the SCM simulations. On the other hand, the role of each modification component in the scheme is significant in the ensemble seasonal simulations. The random selection process of the cloud top preferentially plays an important role in the adjustment of the thermodynamic profile in a manner similar to that in the SCM framework. The inclusion of convective momentum transport in the scheme weakens the meridional circulation. 
The revised large-scale destabilization process plays an important role in modulating the meridional circulation when combined with the other processes; by itself, however, it does not induce significant changes in large-scale fields. Consequently, the experiment that involves all the modifications shows a significant improvement in the seasonal precipitation, highlighting the importance of nonlinear interaction between the physical processes in the model and the simulated climate.


1996 ◽  
Vol 76 (06) ◽  
pp. 0939-0943 ◽  
Author(s):  
B Boneu ◽  
G Destelle ◽  

Summary The anti-aggregating activity of five rising doses of clopidogrel has been compared to that of ticlopidine in atherosclerotic patients. The aim of this study was to determine the dose of clopidogrel to be tested in a large-scale clinical trial of secondary prevention of ischemic events in patients suffering from vascular manifestations of atherosclerosis [the CAPRIE (Clopidogrel vs Aspirin in Patients at Risk of Ischemic Events) trial]. A multicenter study involving 9 haematological laboratories and 29 clinical centers was set up. One hundred and fifty ambulatory patients were randomized into one of seven groups: clopidogrel at doses of 10, 25, 50, 75 or 100 mg OD, ticlopidine 250 mg BID, or placebo. ADP- and collagen-induced platelet aggregation tests were performed before starting treatment and after 7 and 28 days. Bleeding time was measured on days 0 and 28. Patients were seen on days 0, 7 and 28 to check the clinical and biological tolerability of the treatment. Clopidogrel exerted a dose-related inhibition of ADP-induced platelet aggregation and a dose-related prolongation of bleeding time. In the presence of ADP (5 µM), this inhibition ranged between 29% and 44% in comparison to pretreatment values. Bleeding times were prolonged by 1.5 to 1.7 times. These effects were not significantly different from those produced by ticlopidine. Clinical tolerability was good or fair in 97.5% of the patients. No haematological adverse events were recorded. These results allowed the selection of 75 mg once a day as the dose at which to evaluate and compare the antithrombotic activity of clopidogrel to that of aspirin in the CAPRIE trial.


2003 ◽  
Vol 17 (1) ◽  
pp. 1-14 ◽  
Author(s):  
Peggy A. Hite ◽  
John Hasseldine

This study analyzes a random selection of Internal Revenue Service (IRS) office audits from October 1997 to July 1998, the type of audit that concerns most taxpayers. Taxpayers engage paid preparers in order to avoid this type of audit and any resulting tax adjustments. The study examines whether there are more audit adjustments and penalty assessments on tax returns prepared with paid-preparer assistance than on those prepared without it. By comparing the frequency of adjustments on IRS office audits, the study finds significantly fewer tax adjustments on paid-preparer returns than on self-prepared returns. Moreover, CPA-prepared returns resulted in fewer audit adjustments than non-CPA-prepared returns.


2021 ◽  
Vol 13 (6) ◽  
pp. 3571
Author(s):  
Bogusz Wiśnicki ◽  
Dorota Dybkowska-Stefek ◽  
Justyna Relisko-Rybak ◽  
Łukasz Kolanda

The paper responds to research problems related to the implementation of large-scale investment projects in waterways in Europe. As part of design and construction works, it is necessary to indicate river ports that play a major role within the European transport network as intermodal nodes. This entails a number of challenges, the cardinal one being the optimal selection of port locations, taking into account the new transport, economic, and geopolitical situation that will be brought about by modernized waterways. The aim of the paper was to present an original methodology for determining port locations for modernized waterways based on non-cost criteria, as an extended multicriteria decision-making method (MCDM) and employing GIS (Geographic Information System)-based tools for spatial analysis. The methodology was designed to be applicable to the varying conditions of a river’s hydroengineering structures (free-flowing river, canalized river, and canals) and adjustable to the requirements posed by intermodal supply chains. The method was applied to study the Odra River Waterway, which allowed the formulation of recommendations regarding the application of the method in the case of different river sections at every stage of the research process.
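The abstract does not disclose the exact aggregation used in the extended MCDM method; as an illustration only, a plain weighted-sum scoring over hypothetical non-cost criteria and invented port names might look like:

```python
def rank_locations(scores, weights):
    """Weighted-sum MCDM: aggregate per-criterion scores; higher total ranks first."""
    total = lambda crit: sum(weights[k] * v for k, v in crit.items())
    return sorted(scores, key=lambda name: total(scores[name]), reverse=True)

# Hypothetical criteria weights and normalized scores (not from the paper).
weights = {"intermodal_access": 0.5, "hinterland_demand": 0.3, "navigability": 0.2}
scores = {
    "Port A": {"intermodal_access": 0.9, "hinterland_demand": 0.4, "navigability": 0.7},
    "Port B": {"intermodal_access": 0.6, "hinterland_demand": 0.8, "navigability": 0.9},
}
ranking = rank_locations(scores, weights)  # Port B edges out Port A (0.72 vs 0.71)
```

In the paper's setting, the per-criterion scores would come from GIS-based spatial analysis rather than hand-entered values.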


2021 ◽  
Vol 22 (15) ◽  
pp. 7773
Author(s):  
Neann Mathai ◽  
Conrad Stork ◽  
Johannes Kirchmair

Experimental screening of large sets of compounds against macromolecular targets is a key strategy to identify novel bioactivities. However, large-scale screening requires substantial experimental resources and is time-consuming and challenging. Therefore, small to medium-sized compound libraries with a high chance of producing genuine hits on an arbitrary protein of interest would be of great value to fields related to early drug discovery, in particular biochemical and cell research. Here, we present a computational approach that incorporates drug-likeness, predicted bioactivities, biological space coverage, and target novelty, to generate optimized compound libraries with maximized chances of producing genuine hits for a wide range of proteins. The computational approach evaluates drug-likeness with a set of established rules, predicts bioactivities with a validated, similarity-based approach, and optimizes the composition of small sets of compounds towards maximum target coverage and novelty. We found that, in comparison to the random selection of compounds for a library, our approach generates substantially improved compound sets. Quantified as the “fitness” of compound libraries, the calculated improvements ranged from +60% (for a library of 15,000 compounds) to +184% (for a library of 1000 compounds). The best of the optimized compound libraries prepared in this work are available for download as a dataset bundle (“BonMOLière”).
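The abstract does not spell out the optimization procedure behind the library "fitness". One common way to maximize target coverage, shown here purely as an assumed sketch with invented compound and target names, is greedy set-cover selection:

```python
def greedy_select(candidates, library_size):
    """Greedy set-cover: repeatedly pick the compound that adds the most
    not-yet-covered predicted targets to the library."""
    selected, covered = [], set()
    pool = list(candidates)  # (compound_name, set_of_predicted_targets) pairs
    while len(selected) < library_size and pool:
        best = max(pool, key=lambda c: len(c[1] - covered))
        if not (best[1] - covered):  # no remaining compound adds new targets
            break
        selected.append(best[0])
        covered |= best[1]
        pool.remove(best)
    return selected, covered

compounds = [("c1", {"t1", "t2"}), ("c2", {"t2"}), ("c3", {"t3"})]
picked, targets = greedy_select(compounds, 2)  # c1 covers 2 targets, then c3 adds t3
```

The paper's approach additionally weighs drug-likeness and target novelty; those terms would enter the selection key alongside coverage.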


Land ◽  
2021 ◽  
Vol 10 (3) ◽  
pp. 295
Author(s):  
Yuan Gao ◽  
Anyu Zhang ◽  
Yaojie Yue ◽  
Jing’ai Wang ◽  
Peng Su

Suitable land is an important prerequisite for crop cultivation and, given the prospect of climate change, it is essential to assess such suitability to minimize crop production risks and to ensure food security. Although a variety of methods to assess the suitability are available, a comprehensive, objective, and large-scale screening of environmental variables that influence the results—and therefore their accuracy—of these methods has rarely been explored. An approach to the selection of such variables is proposed and the criteria established for large-scale assessment of land, based on big data, for its suitability to maize (Zea mays L.) cultivation as a case study. The predicted suitability matched the past distribution of maize with an overall accuracy of 79% and a Kappa coefficient of 0.72. The land suitability for maize is likely to decrease markedly at low latitudes and even at mid latitudes. The total area suitable for maize globally and in most major maize-producing countries will decrease, the decrease being particularly steep in those regions optimally suited for maize at present. Compared with earlier research, the method proposed in the present paper is simple yet objective, comprehensive, and reliable for large-scale assessment. The findings of the study highlight the necessity of adopting relevant strategies to cope with the adverse impacts of climate change.
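The two agreement statistics reported above are linked by Cohen's kappa, which corrects raw accuracy for agreement expected by chance. The chance-agreement value of 0.25 below is an illustrative assumption (e.g. four equally likely suitability classes), not a figure from the paper:

```python
def cohens_kappa(p_observed, p_expected):
    """Chance-corrected agreement: kappa = (p_o - p_e) / (1 - p_e)."""
    return (p_observed - p_expected) / (1.0 - p_expected)

# With the abstract's 79% overall accuracy and an assumed chance agreement
# of 0.25, kappa comes out at the reported 0.72.
kappa = cohens_kappa(0.79, 0.25)
```

In practice p_expected is computed from the marginal class frequencies of the confusion matrix, not assumed.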


1988 ◽  
Vol 32 (17) ◽  
pp. 1179-1182 ◽  
Author(s):  
P. Jay Merkle ◽  
Douglas B. Beaudet ◽  
Robert C. Williges ◽  
David W. Herlong ◽  
Beverly H. Williges

This paper describes a systematic methodology for selecting independent variables to be considered in large-scale research problems. Five specific procedures including brainstorming, prototype interface representation, feasibility/relevance analyses, structured literature reviews, and user subjective ratings are evaluated and incorporated into an integrated strategy. This methodology is demonstrated in the context of designing the user interface for a telephone-based information inquiry system. The procedure was successful in reducing an initial set of 95 independent variables to a subset of 19 factors that warrant subsequent detailed analysis. These results are discussed in terms of a comprehensive sequential research methodology useful for investigating human factors problems.

