A virtual “Werkstatt” for digitization in the sciences

2020 ◽  
Vol 6 ◽  
Author(s):  
Sheeba Samuel ◽  
Maha Shadaydeh ◽  
Sebastian Böcker ◽  
Bernd Brügmann ◽  
Solveig Franziska Bucher ◽  
...  

Data is central in almost all scientific disciplines nowadays. Intelligent systems have also developed rapidly in recent years, raising the expectation in many disciplines that they will help overcome significant challenges and enable science to be done in completely new ways. For this to succeed, however, fundamental research in computer science is still required, and generic tools must be developed on which specialized solutions can be built. In this paper, we introduce a recently started collaborative project funded by the Carl Zeiss Foundation: a virtual manufactory for digitization in the sciences, the “Werkstatt”, which is being established at the Michael Stifel Center Jena (MSCJ) for data-driven and simulation science to address fundamental questions in computer science and applications. The Werkstatt focuses on three key areas: generic tools for machine learning, knowledge generation using machine learning processes, and semantic methods for the data life cycle, as well as the application of these topics in different disciplines. Core and pilot projects address the key aspects of these topics and form the basis for sustainable work in the Werkstatt.

Proceedings ◽  
2021 ◽  
Vol 74 (1) ◽  
pp. 24
Author(s):  
Eduard Alexandru Stoica ◽  
Daria Maria Sitea

Society today is being profoundly changed by technology, velocity, and productivity. While individuals are not yet prepared for holographic connection with banks or financial institutions, other innovative technologies have been adopted. A new, personalized digital world has lately emerged and begun to govern almost all daily activities, built on five key technological foundations: machine-to-machine communication (M2M), the internet of things (IoT), big data, machine learning, and artificial intelligence (AI). Competitive innovations, notably crowdfunding and peer-to-peer lending, are now on the market, helping to connect investors and borrowers. Blockchain technology is also enjoying great popularity; a large part of this research paper therefore focuses on Elrond. The outcomes highlight the relevance of technology in digital finance.


2021 ◽  
Vol 7 (19) ◽  
pp. eabf8441
Author(s):  
Sarah Klassen ◽  
Alison K. Carter ◽  
Damian H. Evans ◽  
Scott Ortman ◽  
Miriam T. Stark ◽  
...  

Angkor is one of the world’s largest premodern settlement complexes (9th to 15th centuries CE), but to date, no comprehensive demographic study has been completed, and key aspects of its population and demographic history remain unknown. Here, we combine lidar, archaeological excavation data, radiocarbon dates, and machine learning algorithms to create maps that model the development of the city and its population growth through time. We conclude that the Greater Angkor Region was home to approximately 700,000 to 900,000 inhabitants at its apogee in the 13th century CE. This granular, diachronic, paleodemographic model of the Angkor complex can be applied to any ancient civilization.


2018 ◽  
Vol 211 ◽  
pp. 17009
Author(s):  
Natalia Espinoza Sepulveda ◽  
Jyoti Sinha

The development of technologies for the maintenance industry has taken on an important role in meeting demanding challenges. One important challenge is to predict defects in machines as early as possible in order to manage machine downtime. Vibration-based condition monitoring (VCM) is well known for this purpose but requires human experience and expertise. Machine learning models using intelligent systems and pattern recognition appear to be the future avenue for machine fault detection without human expertise, and several such studies have been published in the literature. This paper likewise presents a machine learning model for the classification and detection of different machine faults. Here, time-domain and frequency-domain features derived from measured machine vibration data are used separately in the development of machine learning models based on the artificial neural network method. The effectiveness of the time- and frequency-domain feature-based models is compared when they are applied to an experimental rig. The paper presents the proposed machine learning models and their performance in terms of the observations and results.
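The general approach described, extracting time-domain and frequency-domain features from vibration signals and feeding them to a neural network classifier, can be illustrated with a minimal sketch. Everything here is assumed for illustration: the signals are synthetic, the fault is modeled as an extra harmonic, and the network size and training details are not the authors' setup.

```python
# Minimal sketch (not the paper's model): classify synthetic "healthy" vs
# "faulty" vibration signals from time- and frequency-domain features
# using a tiny one-hidden-layer neural network.
import numpy as np

rng = np.random.default_rng(0)
fs = 1000                                   # sampling rate in Hz (assumed)
t = np.arange(0, 1, 1 / fs)

def signal(faulty):
    x = np.sin(2 * np.pi * 50 * t)          # shaft-speed component
    if faulty:
        x = x + 0.5 * np.sin(2 * np.pi * 150 * t)  # assumed fault harmonic
    return x + 0.2 * rng.standard_normal(t.size)

def features(x):
    rms = np.sqrt(np.mean(x ** 2))                      # time domain
    kurt = np.mean((x - x.mean()) ** 4) / x.var() ** 2  # time domain
    spec = np.abs(np.fft.rfft(x))
    bands = [np.sum(b ** 2) for b in np.array_split(spec, 4)]  # freq. domain
    return np.array([rms, kurt, *bands])

X = np.array([features(signal(i % 2 == 1)) for i in range(200)])
y = np.array([i % 2 for i in range(200)])
X = (X - X.mean(0)) / X.std(0)              # standardize features

# one-hidden-layer network trained by plain gradient descent
W1 = rng.standard_normal((X.shape[1], 8)) * 0.1
b1 = np.zeros(8)
w2 = rng.standard_normal(8) * 0.1
b2 = 0.0
lr = 0.2
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)
    p = 1 / (1 + np.exp(-(h @ w2 + b2)))    # sigmoid output
    g = (p - y) / len(y)                    # cross-entropy gradient
    w2 -= lr * h.T @ g
    b2 -= lr * g.sum()
    gh = np.outer(g, w2) * (1 - h ** 2)     # backprop into hidden layer
    W1 -= lr * X.T @ gh
    b1 -= lr * gh.sum(0)

h = np.tanh(X @ W1 + b1)
p = 1 / (1 + np.exp(-(h @ w2 + b2)))
acc = float(np.mean((p > 0.5) == (y == 1)))
print(round(acc, 2))
```

The fault harmonic concentrates energy in one frequency band, so the frequency-domain features make the two classes easily separable; in practice the comparison of time- versus frequency-domain features would use measured rig data, as in the paper.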


2020 ◽  
Vol 287 (1920) ◽  
pp. 20192882 ◽  
Author(s):  
Maya Wardeh ◽  
Kieran J. Sharkey ◽  
Matthew Baylis

Diseases that spread to humans from animals, zoonoses, pose major threats to human health. Identifying animal reservoirs of zoonoses and predicting future outbreaks are increasingly important to human health, well-being, and economic stability, particularly where research and resources are limited. Here, we integrate complex-network and machine learning methods to develop a new approach to identifying reservoirs. An exhaustive dataset of mammal–pathogen interactions was transformed into networks where hosts are linked via their shared pathogens. We present a methodology for identifying important and influential hosts in these networks. Ensemble models linking network characteristics with phylogeny and life-history traits are then employed to predict those key hosts and quantify the roles they undertake in pathogen transmission. Our models reveal drivers explaining host importance and demonstrate how these drivers vary by pathogen taxa. Host importance is further integrated into ensemble models to predict reservoirs of zoonoses of various pathogen taxa and quantify the extent of pathogen sharing between humans and mammals. We establish predictors of reservoirs of zoonoses, showcasing host influence to be a key factor in determining these reservoirs. Finally, we provide new insight into the determinants of zoonosis-sharing and contrast these determinants across major pathogen taxa.
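The first step described, projecting host–pathogen interactions into a host network and ranking influential hosts, can be sketched in a few lines. The interaction records below are invented, and weighted degree is used as a simple stand-in for the paper's (unspecified here) influence measures.

```python
# Illustrative sketch (not the authors' pipeline): link hosts through shared
# pathogens, then rank them by how many pathogens they share with others.
from collections import defaultdict
from itertools import combinations

# hypothetical mammal–pathogen interaction records
records = [
    ("rat", "leptospira"), ("rat", "hantavirus"), ("rat", "toxoplasma"),
    ("bat", "rabies"), ("bat", "hantavirus"),
    ("fox", "rabies"), ("fox", "toxoplasma"),
    ("deer", "toxoplasma"),
]

hosts_of = defaultdict(set)          # pathogen -> hosts carrying it
for host, pathogen in records:
    hosts_of[pathogen].add(host)

# edge weight = number of pathogens two hosts share
weight = defaultdict(int)
for hosts in hosts_of.values():
    for a, b in combinations(sorted(hosts), 2):
        weight[(a, b)] += 1

# weighted degree as a simple proxy for host influence in the network
degree = defaultdict(int)
for (a, b), w in weight.items():
    degree[a] += w
    degree[b] += w

ranking = sorted(degree.items(), key=lambda kv: -kv[1])
print(ranking)
```

In the study itself, such network characteristics are combined with phylogeny and life-history traits in ensemble models; this sketch only shows the shared-pathogen projection step.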


2009 ◽  
Vol 14 (4) ◽  
pp. 524-548 ◽  
Author(s):  
Elke Teich ◽  
Mônica Holtz

We report on a project investigating the lexico-grammatical properties of English scientific texts. The goal of this project is to gain insight into the linguistic effects of two scientific disciplines coming into contact with one another (e.g. computer science and linguistics) and possibly forming a merged, new discipline (i.e. computational linguistics). The crucial question to be addressed is how such merged disciplines construe their own, distinctive identity and which kinds of linguistic means they employ to this end. To approach this question, we apply the notion of register, i.e. functional variation or variation according to context of use. On the basis of a corpus of scientific research articles from nine scientific domains, we explore selected lexico-grammatical patterns and assess their contribution to register formation.


2020 ◽  
Author(s):  
Daniela De Souza Gomes ◽  
Marcos Henrique Fonseca Ribeiro ◽  
Giovanni Ventorim Comarela ◽  
Gabriel Philippe Pereira

High failure rates are a worrying and relevant problem in Brazilian universities. Using a dataset of student transcripts, we performed a case study in both general and Computer Science contexts, in which data mining techniques were used to find patterns concerning failures. The knowledge acquired can be used for better educational administration and to build intelligent systems that support students’ decision making.
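One common pattern-mining step on transcript data is counting which failed courses co-occur across students (a frequent-itemset count). The abstract does not name its exact algorithms, so this is only a hedged sketch on made-up transcripts, with hypothetical course names and an assumed support threshold.

```python
# Hedged sketch: count pairs of failed courses that co-occur across students
# and keep the "frequent" pairs above a minimum support threshold.
from collections import Counter
from itertools import combinations

# hypothetical transcripts: the set of failed courses per student
failures = [
    {"Calculus I", "Programming I"},
    {"Calculus I", "Programming I", "Physics I"},
    {"Calculus I", "Physics I"},
    {"Programming I"},
]

pair_counts = Counter()
for failed in failures:
    pair_counts.update(combinations(sorted(failed), 2))

min_support = 2  # keep pairs failed together by at least 2 students (assumed)
frequent = {pair: c for pair, c in pair_counts.items() if c >= min_support}
print(frequent)
```

Frequent pairs like these could then feed early-warning rules ("students failing A often also fail B") for the kind of decision-support systems the abstract envisions.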


Author(s):  
V. V. Voevodin

Supercomputing technologies are used in almost all fields of science today and determine the competitiveness of a nation’s science and industry. The USA, Europe, China, and Japan are investing billions in the development of supercomputing technologies and promoting national programs for this field, and these programs will determine the success of future developments. Underestimating the value of supercomputing technologies has caused Russia to fall behind global trends and squanders the enormous potential of Russian specialists in computer science. Russia needs a comprehensive supercomputer program and cannot delay its enactment.

