On the Empirical Determination of Some Harmonic and Anharmonic Force Constants in Benzene

2001 · Vol 105 (26) · pp. 6499-6505
Author(s): S. Rashev

2004 · Vol 82 (6) · pp. 998-1005
Author(s): Rafael Escribano, Ismael K. Ortega, Rafael G. Mosteo, Pedro C. Gómez

A number of bromine oxides and mixed chlorine–bromine oxides for which spectroscopic information is available have been chosen to investigate the nature and characteristics of the Br—O bond. The study consists of the empirical determination of stretching force constants for these bonds from observed vibrational spectroscopic data and the analysis of the topological characteristics of the bonds via ab initio calculations. The latter have been performed at the MP2 level with a 6-311+G(2df) basis set, to provide a uniform and systematic framework for comparing these species. Three types of Br—O bonds have been found, with different characteristics of strength and electron density. The results are compared with those recently found for the Cl—O bond in chlorine oxides.

Key words: bromine oxides, bond electronic structure.
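For a single stretch treated in the isolated-diatomic approximation, the empirical relation between an observed fundamental wavenumber and the harmonic force constant is k = μω² = μ(2πc·ν̃)². The paper's actual analysis uses a full force-field treatment; the sketch below only illustrates the basic relation, with a hypothetical wavenumber that is an assumption, not a value from the paper:

```python
import math

# Physical constants (CODATA values)
C_CM_S = 2.99792458e10      # speed of light, cm/s
AMU_KG = 1.66053906660e-27  # atomic mass unit, kg

def stretch_force_constant(wavenumber_cm1, m1_amu, m2_amu):
    """Harmonic stretching force constant (N/m) from an observed
    fundamental wavenumber, in the isolated-diatomic approximation:
    k = mu * (2*pi*c*nu_tilde)**2."""
    mu = m1_amu * m2_amu / (m1_amu + m2_amu) * AMU_KG  # reduced mass, kg
    omega = 2.0 * math.pi * C_CM_S * wavenumber_cm1    # angular frequency, rad/s
    return mu * omega**2

# Illustrative only: a hypothetical Br-O stretch near 725 cm^-1,
# with 79Br and 16O isotopic masses (amu).
k = stretch_force_constant(725.0, 78.9183, 15.9949)
print(f"k = {k:.0f} N/m = {k / 100:.2f} mdyn/Angstrom")
```

Force constants are conventionally quoted in mdyn/Å (1 mdyn/Å = 100 N/m), hence the conversion in the last line.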


2006 · Vol 110 (51) · pp. 13769-13774
Author(s): Svetoslav Rashev, David C. Moule, Svetlana T. Djambova

1973 · Vol 16 (1) · pp. 149-157
Author(s): T.R. Ananthakrishnan, C.P. Girijavallabhan, G. Aruldhas

Information · 2021 · Vol 12 (7) · pp. 264
Author(s): Jinghan Wang, Guangyue Li, Wenzhao Zhang

The powerful performance of deep learning is evident, but as research has deepened, neural networks have become more complex and do not generalize easily to resource-constrained devices. The emergence of a series of model compression algorithms makes artificial intelligence on the edge possible. Among them, structured model pruning is widely used because of its versatility: it prunes the neural network itself, discarding relatively unimportant structures to compress the model's size. However, previous pruning work suffers from problems such as network evaluation errors, empirically determined pruning rates, and low retraining efficiency. We therefore propose an accurate, objective, and efficient pruning algorithm, Combine-Net, which introduces Adaptive BN to eliminate evaluation errors, the Kneedle algorithm to determine the pruning rate objectively, and knowledge distillation to improve retraining efficiency. Results show that, without precision loss, Combine-Net achieves 95% parameter compression and 83% computation compression with VGG16 on CIFAR10, and 71% parameter compression and 41% computation compression with ResNet50 on CIFAR100. Experiments on different datasets and models show that Combine-Net can efficiently compress a neural network's parameters and computation.
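The objective pruning-rate selection mentioned above amounts to finding the knee of an accuracy-vs-pruning-rate curve. A minimal sketch of the knee-finding idea, using a common simplification of Kneedle (the knee is the point farthest from the chord joining the curve's endpoints); the curve data here is synthetic, not from the paper:

```python
import numpy as np

def find_knee(x, y):
    """Index of the knee of a curve, taken as the point farthest from
    the straight line joining its endpoints (a simplified Kneedle)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Normalize both axes to [0, 1] so distances are comparable.
    xn = (x - x.min()) / (x.max() - x.min())
    yn = (y - y.min()) / (y.max() - y.min())
    # Perpendicular-style distance of each point from the endpoint chord
    # (cross-product form; the constant chord length cancels for argmax).
    dist = np.abs((xn - xn[0]) * (yn[-1] - yn[0])
                  - (yn - yn[0]) * (xn[-1] - xn[0]))
    return int(np.argmax(dist))

# Synthetic accuracy-vs-pruning-rate curve (illustrative data only):
# accuracy stays nearly flat, then drops sharply past some pruning rate.
rates = np.linspace(0.0, 0.9, 10)
accuracy = np.array([0.93, 0.93, 0.92, 0.92, 0.91,
                     0.90, 0.85, 0.70, 0.50, 0.30])
knee = find_knee(rates, accuracy)
print(f"suggested pruning rate: {rates[knee]:.1f}")
```

The knee picks the largest pruning rate before accuracy collapses, which is exactly the trade-off an objectively chosen pruning rate is meant to capture.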

