Neuronless Knowledge Processing in Forests

2020 ◽  
Vol 10 (7) ◽  
pp. 2509
Author(s):  
Aviv Segev ◽  
Dorothy Curtis ◽  
Christine Balili ◽  
Sukhwan Jung

Neurons are viewed as the basic cells that process and transmit information. Trees and neurons share a similar structure and neurotransmitter-like substances, yet no evidence of structures such as neurons, synapses, or a brain has been found inside plants. Consequently, the ability of a network of trees to process information in a manner similar to a neural network, and to make decisions about the use of resources, has not been recognized. We show that the network between trees is used for knowledge processing that implements decisions prioritizing the forest over the single tree in forest use and resource optimization, similar to the processes of a biological neural network. We found that when a network of trees in a forest is resected, for example by a trail, each part of the network independently tries to optimize its overall access to light resources, represented by canopy tree coverage. This was analyzed in 323 forests in different locations across the US where forest resection is performed by trails. Our results demonstrate that neuron-like relations can occur in a forest knowledge processing system. We anticipate that other systems exist in nature where basic knowledge processing for resource usage is performed by components other than neurons.

Author(s):  
Yuko Osana ◽  
Masafumi Hagiwara

In this paper, we propose a knowledge processing system using chaotic associative memory (KPCAM). KPCAM is based on a chaotic associative memory (CAM) composed of chaotic neurons. In a conventional chaotic neural network, when a stored pattern is given continuously to the network as an external input, the vicinity of the input pattern is searched. The CAM makes use of this property to separate superimposed patterns and to deal with many-to-many associations. In this research, the CAM is applied to knowledge processing in which knowledge is represented in the form of a semantic network. The proposed KPCAM has the following features: (1) it can deal with knowledge represented in the form of a semantic network; (2) it can deal with characteristic inheritance; (3) it is robust to noisy input. A series of computer simulations shows the effectiveness of the proposed system.
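The search property described above can be sketched with an Aihara-style chaotic neuron layer on top of Hebbian auto-associative weights; the class name, parameter values, and update form below are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

# Minimal sketch of a chaotic associative memory, assuming Aihara-style
# chaotic neurons (k: feedback decay, alpha: refractory strength, a: bias).
# All parameter values here are illustrative, not the paper's settings.
class ChaoticAssociativeMemory:
    def __init__(self, patterns, k=0.5, alpha=1.0, a=0.0, beta=10.0):
        self.P = np.array(patterns, dtype=float)   # bipolar (+1/-1) stored patterns
        n = self.P.shape[1]
        self.W = (self.P.T @ self.P) / n           # Hebbian auto-associative weights
        np.fill_diagonal(self.W, 0.0)
        self.k, self.alpha, self.a, self.beta = k, alpha, a, beta
        self.y = np.zeros(n)                       # internal (feedback) state
        self.x = np.zeros(n)                       # neuron outputs

    def step(self, external):
        # decayed feedback + recurrent input + external input - refractoriness
        self.y = (self.k * self.y + self.W @ self.x
                  + external - self.alpha * self.x + self.a)
        self.x = np.tanh(self.beta * self.y)
        return self.x

    def run(self, external, steps=50):
        # With a stored pattern applied continuously as the external input,
        # the refractory term keeps the state wandering around that
        # pattern's neighborhood instead of locking onto a fixed point.
        return [self.step(external).copy() for _ in range(steps)]
```

In a CAM-based system, the trajectory's visits to different stored patterns are what allow superimposed inputs to be separated over time.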


2020 ◽  
Vol 24 (5 Part B) ◽  
pp. 3059-3068
Author(s):  
Qinghong Wu

The paper uses flame image processing technology to diagnose furnace flame combustion and thereby measure boiler heat energy. A flame image processing system acquires the combustion images, extracts the flame image characteristics relevant to boiler thermal energy diagnosis, constructs a neural network model for that diagnosis, and uses the extracted flame image feature values as the network's input for training and testing. A rough diagnosis of the boiler's thermal energy is obtained while predicting the state of combustion. Based on these results, a boiler thermal energy diagnosis system was designed and tested on the boiler of a 200 MW unit. The experimental results confirmed the applicability of the system, which can realize on-line monitoring of boiler heat energy and evaluation of the combustion situation.
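The feature-extraction-plus-network pipeline can be sketched as follows; the specific features (area fraction, brightness, centroid height) and the one-hidden-layer scorer are assumptions for illustration, not the paper's actual parameters.

```python
import numpy as np

# Hedged sketch: extract a few flame features from a grayscale combustion
# image and score them with a small neural network. Feature choices and
# network sizes are illustrative assumptions.
def flame_features(img, threshold=0.5):
    """img: 2-D array of pixel intensities in [0, 1]."""
    flame = img > threshold                              # segment the flame region
    area = flame.mean()                                  # flame area fraction
    brightness = img[flame].mean() if flame.any() else 0.0
    ys, _ = np.nonzero(flame)
    cy = ys.mean() / img.shape[0] if ys.size else 0.0    # normalized centroid height
    return np.array([area, brightness, cy])

def mlp_forward(x, W1, b1, W2, b2):
    """One tanh hidden layer; sigmoid output in (0, 1) as a heat-level score."""
    h = np.tanh(W1 @ x + b1)
    return 1.0 / (1.0 + np.exp(-(W2 @ h + b2)))
```

In practice the weights would come from training on flame images labeled with measured boiler heat output; here they are free parameters.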


2021 ◽  
Vol 3 (1) ◽  
Author(s):  
Navid Moshtaghi Yazdani

In the present paper, a method for reliable estimation of defect profiles in CK45 steel structures is presented, using an eddy current testing based measurement system and a post-processing system based on a deep learning technique. A deep learning method is used to determine defect characteristics in metallic structures from magnetic-field C-scan images obtained by an anisotropic magneto-resistive sensor. Having designed and tuned a deep convolutional neural network and applied it to C-scan images obtained from the measurement system, the performance of the proposed deep learning method is compared with conventional artificial neural network methods, such as the multilayer perceptron and radial basis function networks, on a number of metallic specimens with different defects. The results confirm the superiority of the proposed method for characterizing defects compared to other classical training-oriented methods.
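The convolutional stage of such a network can be sketched in plain numpy; the kernel, layer sizes, and the two-class defect output below are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

# Minimal numpy sketch of a conv -> ReLU -> pool -> softmax pipeline of the
# kind used to classify defects from magnetic-field C-scan images.
# Kernel values, sizes, and class count are illustrative assumptions.
def conv2d(img, kernel):
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i+kh, j:j+kw] * kernel)
    return out

def max_pool(x, size=2):
    H, W = x.shape
    H2, W2 = H // size, W // size
    return x[:H2*size, :W2*size].reshape(H2, size, W2, size).max(axis=(1, 3))

def cnn_defect_score(cscan, kernel, w_out):
    feat = max_pool(np.maximum(conv2d(cscan, kernel), 0.0))  # conv -> ReLU -> pool
    logits = w_out @ feat.ravel()
    e = np.exp(logits - logits.max())
    return e / e.sum()                                       # softmax over classes
```

A real deep network stacks several such conv/pool layers and learns the kernels; this sketch only shows the forward data flow from C-scan to class probabilities.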


Author(s):  
Karl-Friedrich Müller-Reißmann ◽  
Hartmut Bossel ◽  
Bernd R. Hornung

2015 ◽  
Vol 24 (05) ◽  
pp. 1550020 ◽  
Author(s):  
Avelino J. Gonzalez ◽  
Brian Sherwell ◽  
Johann Nguyen ◽  
Brian C. Becker ◽  
Víctor Hung ◽  
...  

This article describes a knowledge preservation and re-use tool designed to capture the knowledge of a specific individual at the US National Science Foundation for later retrieval by successors after his retirement. The system operates in a Q&A format and is sufficiently intelligent to ask clarifying questions. The primary objective was to create a system that users would accept. The domain of interest to be preserved and re-used was programmatic knowledge about the NSF Industry/University Collaborative Research Centers (I/UCRC) Program, and more specifically, the knowledge of its long-time director, Dr. Alex Schwarzkopf. The system is called AskAlex, and it uses a trio of techniques to accomplish its objectives. Contextual graphs (CxGs) are used as the basic knowledge representation structure. The CxGs are assisted by a search engine and an ontology of terms that help find the contextual graph best able to answer the question being asked. Evaluations with users and potential users generally confirmed our selections and provided some guidance for improving the system.
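The contextual-graph idea, including the system's ability to ask back for missing context, can be illustrated with a toy traversal; the node layout, field names, and the I/UCRC-flavored example content are all invented for illustration, not AskAlex's actual representation.

```python
# Toy sketch of contextual-graph-style Q&A: a CxG is modeled as a nested
# branching structure whose path depends on context values, with string
# leaves as recorded answers. Structure and content are invented examples.
def walk_cxg(node, context):
    """Follow context-dependent branches until an answer (leaf) is reached."""
    while isinstance(node, dict):
        key = node["ask"]                                    # contextual element to test
        if key not in context:
            return f"Need clarification: what is '{key}'?"   # the system asks back
        node = node["branches"][context[key]]
    return node                                              # leaf: the recorded answer

cxg = {"ask": "center_type",
       "branches": {
           "new":     {"ask": "has_industry_partners",
                       "branches": {True: "Proceed with a planning grant.",
                                    False: "Recruit industry members first."}},
           "renewal": "Review prior-phase metrics before renewal."}}
```

In the described system, a search engine and ontology would first select which contextual graph to walk; this sketch covers only the walk itself.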


Information ◽  
2019 ◽  
Vol 10 (3) ◽  
pp. 113 ◽  
Author(s):  
Joao Ferreira ◽  
Gustavo Callou ◽  
Albert Josua ◽  
Dietmar Tutsch ◽  
Paulo Maciel

Due to the high demands of new technologies such as social networks, e-commerce and cloud computing, more energy is being consumed in order to store all the data produced and provide the high availability required. Over the years, this increase in energy consumption has brought about a rise in both environmental impacts and operational costs. Some companies have adopted the concept of a green data center, which relates electricity consumption and CO2 emissions to the utility power source adopted. In Brazil, almost 70% of electrical power is derived from clean electricity generation, whereas in China 65% of generated electricity comes from coal. In addition, the price per kWh in the US is much lower than in the other countries surveyed. In the present work, we conducted an integrated evaluation of the costs and CO2 emissions of the electrical infrastructure in data centers, considering the different energy sources adopted by each country. We used a multi-layered artificial neural network, which could forecast consumption over the following months based on the energy consumption history of the data center. All these features were supported by a tool, whose applicability was demonstrated through a case study that computed the CO2 emissions and operational costs of a data center using the energy mix adopted in Brazil, China, Germany and the US. China presented the highest CO2 emissions, with 41,445 tons per year in 2014, followed by the US and Germany, with 37,177 and 35,883 tons, respectively. Brazil, with 8459 tons, proved to be the cleanest. Additionally, this study estimated the operational costs assuming that the same data center consumed energy as if it were located in China, Germany or Brazil. China presented the highest cost per kWh per year. Therefore, the best choice according to operational costs, considering the price of energy per kWh, is the US, and the worst is China. Considering both operational costs and CO2 emissions, Brazil would be the best option.
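The forecasting step can be sketched as a sliding-window regression over the monthly consumption history; a linear least-squares fit stands in here for the paper's multi-layered artificial neural network, and the window length is an assumption.

```python
import numpy as np

# Hedged sketch of the forecasting idea: turn the data center's monthly
# energy-consumption history (kWh) into sliding windows and fit a model
# that predicts the next month. Linear least squares stands in for the
# paper's multi-layered ANN; the window length is an assumption.
def make_windows(history, window=3):
    X = np.array([history[i:i+window] for i in range(len(history) - window)])
    y = np.array(history[window:])
    return X, y

def fit_forecaster(history, window=3):
    X, y = make_windows(history, window)
    Xb = np.hstack([X, np.ones((len(X), 1))])          # add a bias column
    coef, *_ = np.linalg.lstsq(Xb, y, rcond=None)      # least-squares fit

    def predict_next(recent):
        """Predict next month's kWh from the most recent `window` months."""
        return float(np.append(np.asarray(recent, dtype=float)[-window:], 1.0) @ coef)

    return predict_next
```

The forecast kWh would then be multiplied by each country's price per kWh and grid emission factor to obtain the cost and CO2 figures the tool reports.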

