Utilizing high performance computing for chemistry: parallel computational chemistry

2010 ◽  
Vol 12 (26) ◽  
pp. 6896 ◽  
Author(s):  
Wibe A. de Jong ◽  
Eric Bylaska ◽  
Niranjan Govind ◽  
Curtis L. Janssen ◽  
Karol Kowalski ◽  
...  
2012 ◽  
Vol 23 (07) ◽  
pp. 1230001 ◽  
Author(s):  
PABLO GARCÍA-RISUEÑO ◽  
PABLO E. IBÁÑEZ

The increase in available computational power has made simulation emerge as a third discipline of science, lying midway between the experimental and purely theoretical branches [G. Makov, C. Gattinoni and A. D. Vita, Model. Simul. Mater. Sci. Eng. 17, 084008 (2009); C. J. Cramer, Essentials of Computational Chemistry: Theories and Models, 2nd edn. (John Wiley & Sons, Chichester, 2002)]. Simulation enables the evaluation of quantities that would otherwise be inaccessible, helps to improve experiments, and provides new insight into the systems being analyzed [T. C. Germann, K. Kadau and S. Swaminarayan, Concurrency Comput. Pract. Exp. 21, 2143 (2009); M. A. L. Marques, X. Lopez, D. Varsano, A. Castro and A. Rubio, Phys. Rev. Lett. 90, 258101 (2003); D. E. Shaw, P. Maragakis, K. Lindorff-Larsen, S. Piana, R. O. Dror, M. P. Eastwood, J. A. Bank, J. M. Jumper, J. K. Salmon, Y. Shan and W. Wriggers, Science 330, 341 (2010); D. Marx, ChemPhysChem 7, 1848 (2006)]. Knowing the fundamentals of computation can be very useful for scientists, since it can help them improve the performance of their theoretical models and simulations. This review presents some technical essentials to that end, and is devised as a complement for researchers whose education has focused on scientific rather than technological matters. We attempt to present the fundamentals of high performance computing (HPC) [G. Hager and G. Wellein, Introduction to High Performance Computing for Scientists and Engineers, 1st edn. (CRC Press, Taylor & Francis Group, 2011)] in a way that is easy to understand without much prior background. We sketch how standard computers and supercomputers work, describe distributed computing, and discuss essential aspects to take into account when running scientific calculations on computers.
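As a minimal sketch of the parallelism idea this review addresses (not taken from the review itself), the following Python fragment splits an embarrassingly parallel scientific workload across worker processes and then sums the partial results. The `energy` function here is a hypothetical stand-in for an expensive per-configuration calculation:

```python
from multiprocessing import Pool

def energy(x: float) -> float:
    # Hypothetical stand-in for a costly per-point scientific evaluation.
    return sum((x + i) ** 2 for i in range(1000))

def total_energy_serial(points):
    # Baseline: evaluate every point on a single core.
    return sum(energy(x) for x in points)

def total_energy_parallel(points, workers=4):
    # Distribute the independent evaluations across worker processes,
    # then reduce (sum) the partial results: the map-reduce pattern
    # underlying much distributed scientific computing.
    with Pool(workers) as pool:
        return sum(pool.map(energy, points))

if __name__ == "__main__":
    pts = [0.1 * i for i in range(100)]
    # Both strategies must agree; only the wall-clock time differs.
    assert abs(total_energy_serial(pts) - total_energy_parallel(pts)) < 1e-6
```

Because each evaluation is independent, the ideal speedup scales with the number of workers; in practice, process start-up and communication costs set a lower bound on how fine-grained the work units can usefully be, a trade-off the review discusses.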


2007 ◽  
pp. 209-316 ◽  
Author(s):  
Rick A. Kendall ◽  
Robert J. Harrison ◽  
Rik J. Littlefield ◽  
Martyn F. Guest

MRS Bulletin ◽  
1997 ◽  
Vol 22 (10) ◽  
pp. 5-6
Author(s):  
Horst D. Simon

Recent events in the high-performance computing industry have left scientists and the general public concerned about a crisis, or a lack of leadership, in the field. That concern is understandable given the industry's history from 1993 to 1996. Cray Research, the historic leader in supercomputing technology, was unable to survive financially as an independent company and was acquired by Silicon Graphics. Two ambitious new companies that introduced new technologies in the late 1980s and early 1990s, Thinking Machines and Kendall Square Research, were commercial failures and went out of business. And Intel, which introduced its Paragon supercomputer in 1994, discontinued production only two years later.

During the same time frame, scientists who had finished the laborious task of writing scientific codes to run on vector parallel supercomputers learned that those codes would have to be rewritten to run on the next-generation, highly parallel architectures. Scientists not yet involved in high-performance computing are understandably hesitant about committing their time and energy to such an apparently unstable enterprise.

However, beneath the commercial chaos of the last several years, a technological revolution has been occurring. The good news is that the revolution is over, opening five to ten years of predictable stability, steady improvements in system performance, and increased productivity for scientific applications. It is time for scientists who were sitting on the fence to jump in and reap the benefits of the new technology.

