Deep Multiphysics and Particle–Neuron Duality: A Computational Framework Coupling (Discrete) Multiphysics and Deep Learning

2019 ◽  
Vol 9 (24) ◽  
pp. 5369
Author(s):  
Alessio Alexiadis

There are two common ways of coupling first-principles modelling and machine learning. In one case, data are transferred from the machine-learning algorithm to the first-principles model; in the other, from the first-principles model to the machine-learning algorithm. In both cases, the coupling is in series: the two components remain distinct, and data generated by one model are subsequently fed into the other. Several modelling problems, however, require in-parallel coupling, where the first-principles model and the machine-learning algorithm work together at the same time rather than one after the other. This study introduces deep multiphysics: a computational framework that couples first-principles modelling and machine learning in parallel rather than in series. Deep multiphysics works with particle-based first-principles modelling techniques. It is shown that the mathematical algorithms behind several particle methods and artificial neural networks are similar to the point that they can be unified under the notion of particle–neuron duality. This study explains the particle–neuron duality in detail and shows how deep multiphysics works both in theory and in practice. To achieve this aim, a case study is discussed: the design of a microfluidic device for separating cell populations with different levels of stiffness.
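The intuition behind the duality can be illustrated with a minimal sketch (a toy Gaussian kernel and illustrative names, not the paper's actual equations): a particle method computes a property at particle i as a kernel-weighted sum over neighbouring particles, while a neuron computes an activated weighted sum of its inputs, so both updates share the same weighted-sum structure.

```python
import math

def kernel(r, h=1.0):
    # Toy Gaussian smoothing kernel, in the style of SPH-like particle methods
    return math.exp(-(r / h) ** 2)

def particle_update(x_i, neighbour_positions, neighbour_values):
    # Particle methods: a property at particle i is a kernel-weighted
    # sum over neighbouring particles j
    return sum(kernel(abs(x_i - x_j)) * v_j
               for x_j, v_j in zip(neighbour_positions, neighbour_values))

def neuron_output(inputs, weights, activation=math.tanh):
    # Neural networks: a neuron applies an activation function to a
    # weighted sum of its inputs
    return activation(sum(w * x for w, x in zip(weights, inputs)))
```

In both routines the core operation is a weighted sum over a set of contributing elements; the paper develops this structural correspondence rigorously.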

Author(s):  
Petr Berka ◽  
Ivan Bruha

Genuine symbolic machine learning (ML) algorithms are capable of processing only symbolic, categorical data. However, real-world problems, e.g. in medicine or finance, involve both symbolic and numerical attributes. An important issue in ML is therefore the discretization (categorization) of numerical attributes. Quite a few discretization procedures exist in the ML field. This paper describes two newer algorithms for the categorization (discretization) of numerical attributes. The first is implemented in KEX (Knowledge EXplorer) as its preprocessing procedure. Its idea is to discretize the numerical attributes in such a way that the resulting categorization corresponds to the KEX knowledge acquisition algorithm. Since the categorization for KEX is done "off-line" before the KEX machine learning algorithm is run, it can also be used as a preprocessing step for other machine learning algorithms. The other discretization procedure is implemented in CN4, a substantial extension of the well-known CN2 machine learning algorithm. The range of a numerical attribute is divided into intervals that may form a complex generated by the algorithm as part of the class description. Experimental results compare the performance of KEX and CN4 on some well-known ML databases. To make the comparison more illustrative, we also used the discretization procedure of the MLC++ library. Other ML algorithms, such as ID3 and C4.5, were also run in our experiments; the results are compared and discussed.
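As a rough sketch of class-driven discretization in the spirit of these procedures (a single entropy-minimizing cut point on one numerical attribute; not the actual KEX or CN4 code):

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy of a list of class labels
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_split(values, labels):
    # Choose the cut point on a numerical attribute that minimizes the
    # weighted class-label entropy of the two resulting intervals.
    pairs = sorted(zip(values, labels))
    best_cut, best_score = None, float("inf")
    for k in range(1, len(pairs)):
        cut = (pairs[k - 1][0] + pairs[k][0]) / 2
        left = [l for v, l in pairs if v <= cut]
        right = [l for v, l in pairs if v > cut]
        score = (len(left) * entropy(left) + len(right) * entropy(right)) / len(pairs)
        if score < best_score:
            best_cut, best_score = cut, score
    return best_cut
```

Applied recursively to each interval, this kind of class-aware splitting yields a categorization tied to the class structure, which is what distinguishes these procedures from class-blind equal-width binning.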


Author(s):  
Dharmendra Sharma

In this chapter, we propose a multi-agent-based information technology (IT) security approach (MAITS) as a holistic solution to the increasing need to secure computer systems. Each specialist security task is modeled as a specialist agent. MAITS has five groups of working agents: administration assistant agents, authentication and authorization agents, system log monitoring agents, intrusion detection agents, and pre-mortem-based computer forensics agents. An assessment center, which comprises yet another special group of agents, plays a key role in coordinating the interaction of the other agents. Each agent has an agent engine built on an appropriate machine-learning algorithm; the engine gives the agent learning, reasoning, and decision-making abilities. Each agent also has an agent interface, through which it interacts with other agents and with the environment.
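A minimal sketch of the engine-plus-interface agent structure described above, with illustrative names and a trivial majority-vote stand-in for the machine-learning engine (not the chapter's actual design):

```python
class AgentEngine:
    # Illustrative stand-in for a per-agent machine-learning engine:
    # here, a trivial majority-vote "learner" over observed labels.
    def __init__(self):
        self.observations = []

    def learn(self, label):
        self.observations.append(label)

    def decide(self):
        if not self.observations:
            return "unknown"
        return max(set(self.observations), key=self.observations.count)

class Agent:
    # Each agent pairs an engine (learning, reasoning, decision-making)
    # with an interface through which it exchanges messages with other
    # agents and the environment.
    def __init__(self, role):
        self.role = role
        self.engine = AgentEngine()
        self.inbox = []

    def receive(self, message):
        # Agent interface: accept a message and pass it to the engine
        self.inbox.append(message)
        self.engine.learn(message)

    def act(self):
        # Report this agent's role and its current decision
        return (self.role, self.engine.decide())
```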


2021 ◽  
Vol 11 (9) ◽  
pp. 3880
Author(s):  
Gemma Bel-Enguix ◽  
Helena Gómez-Adorno ◽  
Alejandro Pimentel ◽  
Sergio-Luis Ojeda-Trueba ◽  
Brian Aguilar-Vizuet

In this paper, we introduce the T-MexNeg corpus of Tweets written in Mexican Spanish. It consists of 13,704 Tweets, of which 4895 contain negation structures. We performed an analysis of negation statements embedded in the language employed on social media. This paper presents the annotation guidelines along with a novel resource targeted at the negation detection task. The corpus was manually annotated with labels for negation cue, scope, and event. We report the inter-annotator agreement for all components of the negation structure. This resource is freely available. Furthermore, we performed various experiments to automatically identify negation, using the T-MexNeg corpus and the SFU ReviewSP-NEG corpus for training a machine learning algorithm. Comparing two different methodologies, one based on a dictionary and the other on the Conditional Random Fields algorithm, we found that the results of negation identification on Twitter are lower when the model is trained on the SFU ReviewSP-NEG corpus. This paper therefore shows the importance of having resources built specifically to deal with social media language.
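A dictionary-based baseline of the kind compared against the CRF model might look as follows (the cue list is illustrative, not the paper's):

```python
# Common Spanish negation cues; this short list is illustrative only.
NEGATION_CUES = {"no", "nunca", "nadie", "nada", "ni", "sin", "tampoco"}

def find_negation_cues(tweet):
    # Tag every token that matches a known negation cue,
    # returning (position, token) pairs.
    tokens = tweet.lower().split()
    return [(i, tok) for i, tok in enumerate(tokens) if tok in NEGATION_CUES]
```

A CRF model, by contrast, labels tokens from contextual features and can also delimit the scope and event of a negation, which a plain dictionary lookup cannot.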


Sensors ◽  
2019 ◽  
Vol 19 (23) ◽  
pp. 5207 ◽  
Author(s):  
Anton Gradišek ◽  
Marion van Midden ◽  
Matija Koterle ◽  
Vid Prezelj ◽  
Drago Strle ◽  
...  

We used a 16-channel e-nose demonstrator based on micro-capacitive sensors with functionalized surfaces to measure the response of 30 different sensors to the vapours from 11 different substances, including the explosives 1,3,5-trinitro-1,3,5-triazinane (RDX), 1-methyl-2,4-dinitrobenzene (DNT) and 2-methyl-1,3,5-trinitrobenzene (TNT). Classification models were developed using the Random Forest machine-learning algorithm and trained on a set of signals in which the concentration and flow of a selected single vapour were varied independently. We demonstrate that our classification models successfully recognize the signal patterns of different sets of substances. An excellent accuracy of 96% was achieved in identifying the explosives among the other substances. These experiments clearly demonstrate that the silane monolayers used as receptor layers in our sensors are particularly well suited to selecting and recognizing TNT and similar explosives among other substances.
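A hedged sketch of such a Random Forest pipeline with scikit-learn, using synthetic 30-feature sensor responses in place of the actual e-nose data (all values and class names are made up for illustration):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_sensors = 30

# Two synthetic substance classes with distinct mean sensor responses,
# standing in for the measured vapour signal patterns.
explosive = rng.normal(1.0, 0.1, size=(40, n_sensors))
benign = rng.normal(0.0, 0.1, size=(40, n_sensors))
X = np.vstack([explosive, benign])
y = np.array(["explosive"] * 40 + ["benign"] * 40)

# Train a Random Forest on the labelled signal patterns
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Classify an unseen response resembling the explosive class
pred = clf.predict(rng.normal(1.0, 0.1, size=(1, n_sensors)))
```

In the paper's setting, each row would be the multi-sensor response at a given concentration and flow, and class accuracy would be estimated on held-out measurements.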


Complexity ◽  
2021 ◽  
Vol 2021 ◽  
pp. 1-16
Author(s):  
M. Safdar Munir ◽  
Imran Sarwar Bajwa ◽  
Amna Ashraf ◽  
Waheed Anwar ◽  
Rubina Rashid

Smart, parsimonious, and economical methods of irrigation have been developed to meet the fresh-water requirements of the world's inhabitants. In other words, water consumption should be frugal enough to conserve limited fresh-water resources; a major portion of water is wasted through inefficient irrigation. We propose a smart approach in which an ontology contributes 50% of the decision and sensor data values contribute the other 50%. The ontology decision and the sensor values together form the input to the final decision, which is produced by a machine learning algorithm (KNN). Moreover, an edge server is introduced between the main IoT server and the GSM module. This not only avoids overburdening the IoT server with data processing but also reduces latency. The approach connects the Internet of Things with a network of sensors to trace all the data efficiently, analyze the data at the edge server, transfer only selected data to the main IoT server to predict the watering requirements for a field of crops, and display the result through an Android application.
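The 50/50 fusion can be sketched as feeding the ontology's decision score and the sensor values into a single KNN feature vector (scikit-learn shown; all feature names and numbers are illustrative, not from the paper):

```python
from sklearn.neighbors import KNeighborsClassifier

# Each example combines an ontology-derived decision score with raw
# sensor readings: [ontology_score, soil_moisture, temperature_C]
X = [
    [1.0, 0.10, 35.0],  # ontology says irrigate; soil dry, hot
    [1.0, 0.15, 33.0],
    [0.0, 0.80, 22.0],  # ontology says don't irrigate; soil wet, cool
    [0.0, 0.75, 20.0],
]
y = ["irrigate", "no-irrigation", ]  # labels follow below
y = ["irrigate", "irrigate", "no-irrigation", "no-irrigation"]

# KNN makes the final irrigation decision from the fused features
knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)
decision = knn.predict([[1.0, 0.12, 34.0]])
```

In the described system, this prediction step would run on the main IoT server, with the edge server filtering and pre-processing the raw sensor stream.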


2018 ◽  
Author(s):  
C.H.B. van Niftrik ◽  
F. van der Wouden ◽  
V. Staartjes ◽  
J. Fierstra ◽  
M. Stienen ◽  
...  

Author(s):  
Kunal Parikh ◽  
Tanvi Makadia ◽  
Harshil Patel

Dengue is unquestionably one of the biggest health concerns in India and many other developing countries, and it has unfortunately cost many people their lives. Every year, approximately 390 million dengue infections occur around the world, of which roughly 500,000 are severe and about 25,000 result in death. Many factors contribute to dengue transmission, such as temperature, humidity, precipitation, and inadequate public health infrastructure. In this paper, we propose a method for predictive analytics on a dengue dataset using the k-nearest neighbours (KNN) machine-learning algorithm. This analysis would help predict future cases and could thereby help save many lives.
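A minimal from-scratch KNN of the kind applied here, with illustrative dengue-related feature values (not the paper's data):

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, query, k=3):
    # KNN: find the k training cases nearest to the query point
    # and take a majority vote over their labels.
    dists = sorted(
        (math.dist(x, query), label) for x, label in zip(train_X, train_y)
    )
    votes = [label for _, label in dists[:k]]
    return Counter(votes).most_common(1)[0][0]

# Illustrative features: [temperature_C, humidity_%, precipitation_mm]
cases = [[30, 85, 120], [31, 90, 150], [22, 40, 10], [21, 35, 5]]
labels = ["outbreak", "outbreak", "no-outbreak", "no-outbreak"]
```

In practice the features would be standardized first, since KNN's distance computation is sensitive to differing feature scales.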

