I/O-Aware Batch Scheduling for Petascale Computing Systems

Author(s): Zhou Zhou, Xu Yang, Dongfang Zhao, Paul Rich, Wei Tang, ...
Author(s): Al Geist, Daniel A. Reed

Commodity clusters revolutionized high-performance computing when they first appeared two decades ago. As scale and complexity have grown, new challenges have emerged in reliability and systemic resilience, energy efficiency and optimization, and software complexity, suggesting the need for a re-evaluation of current approaches. This paper reviews the state of the art and reflects on some of the challenges likely to be faced when building trans-petascale computing systems, using insights and perspectives drawn from operational experience and community debates.


2016, Vol 58, pp. 107-116

Author(s): Douglas L. Dorset, Barbara Moss

A number of computing systems devoted to the averaging of electron images of two-dimensional macromolecular crystalline arrays have facilitated the visualization of negatively-stained biological structures. Either by simulation of optical filtering techniques or, in more refined treatments, by cross-correlation averaging, an idealized representation of the repeating asymmetric structure unit is constructed, eliminating image distortions due to radiation damage, stain irregularities and, in the latter approach, imperfections and distortions in the unit cell repeat. In these analyses it is generally assumed that the electron scattering from the thin negatively-stained object is well approximated by a phase object model. Even when absorption effects are considered (i.e. "amplitude contrast"), the expansion of the transmission function, q(x,y) = exp(iσφ(x,y)), does not exceed the first (kinematical) term. Furthermore, in the reconstruction of electron images, kinematical phases are applied to diffraction amplitudes and obey the constraints of the plane group symmetry.
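The cross-correlation averaging step described above can be sketched in a few lines: each noisy copy of the repeating motif is registered against a reference by the peak of their cross-correlation, then the aligned copies are averaged so that random distortions cancel. This is an illustrative Python/NumPy sketch, not any of the systems the abstract refers to; the function and variable names are hypothetical.

```python
import numpy as np

def cross_correlation_average(patches, reference):
    """Align each patch to `reference` by the peak of their FFT
    cross-correlation, then average the registered patches."""
    acc = np.zeros_like(reference, dtype=float)
    for p in patches:
        # Cross-correlation theorem: CC = IFFT( FFT(ref) * conj(FFT(p)) )
        cc = np.fft.ifft2(np.fft.fft2(reference) * np.conj(np.fft.fft2(p))).real
        # The peak position gives the circular shift that registers p
        # with the reference.
        dy, dx = np.unravel_index(np.argmax(cc), cc.shape)
        acc += np.roll(np.roll(p, dy, axis=0), dx, axis=1)
    return acc / len(patches)

# Toy demonstration: the same square motif, shifted and corrupted by
# noise ("stain irregularities") in each copy of the crystalline repeat.
rng = np.random.default_rng(0)
motif = np.zeros((16, 16))
motif[6:10, 6:10] = 1.0
patches = [
    np.roll(motif, rng.integers(-3, 4), axis=0)
    + 0.1 * rng.standard_normal((16, 16))
    for _ in range(20)
]
avg = cross_correlation_average(patches, motif)
# Averaging N aligned copies suppresses the random noise by roughly
# a factor of 1/sqrt(N) while the motif is reinforced.
```

In practice the reference itself is refined iteratively (an initial average serves as the next reference), and lattice distortions are corrected patch by patch rather than by a single global shift, but the registration-then-average idea is the same.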


TAPPI Journal, 2015, Vol 14 (1), pp. 51-60
Author(s): HONGHI TRAN, DANNY TANDRA

Sootblowing technology used in recovery boilers originated from that used in coal-fired boilers. It started with manual cleaning by hand lancing and hand blowing, and evolved slowly into online sootblowing using retractable sootblowers. Since 1991, intensive research and development has focused on sootblowing jet fundamentals and deposit removal in recovery boilers. The results have provided much insight into sootblower jet hydrodynamics, how a sootblower jet interacts with tubes and deposits, and the factors influencing its deposit removal efficiency, and have led to two important innovations: fully-expanded sootblower nozzles, which are used in virtually all recovery boilers today, and low-pressure sootblowing technology, which has been implemented in several new recovery boilers. The availability of powerful computing systems, superfast microprocessors and data acquisition systems, and versatile computational fluid dynamics (CFD) modeling capability over the past two decades has also contributed greatly to the advancement of sootblowing technology. High-quality infrared inspection cameras have enabled mills to inspect deposit buildup conditions in the boiler during operation, and have helped identify problems with sootblower lance swinging and with superheater platen and boiler bank tube vibrations. As recovery boiler firing capacity and steam parameters have increased markedly in recent years, sootblowers have become larger and longer, and this can present a challenge in terms of both sootblower design and operation.

