Convolutional Embedding of Attributed Molecular Graphs for Physical Property Prediction

2017 ◽  
Vol 57 (8) ◽  
pp. 1757-1772 ◽  
Author(s):  
Connor W. Coley ◽  
Regina Barzilay ◽  
William H. Green ◽  
Tommi S. Jaakkola ◽  
Klavs F. Jensen
2014 ◽  
Vol 7 (1) ◽  
pp. 62-68 ◽  
Author(s):  
Deng-Fang Ruan ◽  
Zhi-Hao Chen ◽  
Kui-Fang Wang ◽  
Yuan Chen ◽  
Fan Yang

This paper focuses on the physical property prediction of waste cooking oil biodiesel and examines the accuracy of different methods for estimating reliable basic physical properties, including density, viscosity, and surface tension, over a wide temperature range based on the biodiesel's fatty acid methyl ester composition. A program for predicting the physical properties of the biodiesel was developed, and experimental measurements of the density, viscosity, and surface tension were performed to validate the chosen methods. The results show that the modified Rackett equation and the Orrick-Erbar method yield high estimation accuracy for the density and the viscosity, respectively, while the Sastri-Rao and Pitzer methods are sufficiently accurate to predict the surface tension.
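The modified Rackett (Spencer-Danner) equation referenced above estimates saturated liquid molar volume from critical properties, from which density follows. A minimal sketch is below; the critical constants `Tc`, `Pc`, `Zra`, and molar mass `M` are illustrative placeholders, not measured values for any specific fatty acid methyl ester.

```python
# Modified Rackett (Spencer-Danner) equation for saturated liquid density.
R = 8.314  # universal gas constant, J/(mol*K)

def rackett_density(T, Tc, Pc, Zra, M):
    """Return liquid density in kg/m^3.

    T, Tc in K; Pc in Pa; Zra is the Rackett compressibility
    factor; M is molar mass in kg/mol.
    """
    Tr = T / Tc                                               # reduced temperature
    Vs = (R * Tc / Pc) * Zra ** (1.0 + (1.0 - Tr) ** (2.0 / 7.0))  # molar volume, m^3/mol
    return M / Vs

# Placeholder constants loosely in the range of long-chain methyl esters
rho_300 = rackett_density(300.0, Tc=764.0, Pc=1.28e6, Zra=0.23, M=0.2965)
rho_350 = rackett_density(350.0, Tc=764.0, Pc=1.28e6, Zra=0.23, M=0.2965)
```

As expected for a liquid, the predicted density decreases with temperature, since the exponent on `Zra` shrinks as `T` approaches `Tc`.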


2019 ◽  
Author(s):  
Benson Chen ◽  
Regina Barzilay ◽  
Tommi S Jaakkola

<div>Much of the recent work on learning molecular representations has been based on Graph Convolution Networks (GCN). These models rely on local aggregation operations and can therefore miss higher-order graph properties. To remedy this, we propose Path-Augmented Graph Transformer Networks (PAGTN) that are explicitly built on longer-range dependencies in graph-structured data. Specifically, we use path features in molecular graphs to create global attention layers. We compare our PAGTN model against the GCN model and show that our model consistently outperforms GCNs on molecular property prediction datasets including quantum chemistry (QM7, QM8, QM9), physical chemistry (ESOL, Lipophilicity) and biochemistry (BACE, BBBP).</div>
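The core idea of the abstract above — pairwise path features biasing a global attention layer so every atom can attend to every other atom — can be sketched as follows. This is a schematic reading of the approach, not the authors' exact architecture; the weight shapes and the scalar path-bias projection `wp` are assumptions for illustration.

```python
import numpy as np

def path_augmented_attention(node_feats, path_feats):
    """One global attention layer where pairwise path features bias the
    attention logits.

    node_feats: (n, d) array of per-atom features.
    path_feats: (n, n, p) array of features of the path between each atom pair.
    Returns updated node features of shape (n, d).
    """
    n, d = node_feats.shape
    p = path_feats.shape[-1]
    rng = np.random.default_rng(0)
    # Randomly initialised projections stand in for learned parameters.
    Wq = rng.standard_normal((d, d)) / np.sqrt(d)
    Wk = rng.standard_normal((d, d)) / np.sqrt(d)
    Wv = rng.standard_normal((d, d)) / np.sqrt(d)
    wp = rng.standard_normal(p) / np.sqrt(p)   # projects path features to a scalar bias

    q, k, v = node_feats @ Wq, node_feats @ Wk, node_feats @ Wv
    # Path features shift the logits, so attention is informed by graph structure
    # beyond immediate neighbours.
    logits = (q @ k.T) / np.sqrt(d) + path_feats @ wp      # (n, n)
    weights = np.exp(logits - logits.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)          # row-wise softmax
    return weights @ v  # each atom aggregates over ALL atoms, unlike a GCN

# Toy molecule: 5 atoms, 8-dim node features, 3-dim path features
out = path_augmented_attention(np.ones((5, 8)), np.zeros((5, 5, 3)))
```

Because the softmax runs over all atom pairs rather than a neighbourhood, a single layer can capture the longer-range dependencies that local GCN aggregation misses.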



1980 ◽  
Vol 32 (06) ◽  
pp. 968-970 ◽  
Author(s):  
M. Vasquez ◽  
H.D. Beggs
