The Problem-Based Scientific Workflow Design and Performance in Grid Environments

Author(s):  
Liu Xiping ◽  
Dou Wanchun ◽  
Fan Shaokun ◽  
Cai Shijie
2006 ◽  
Vol 14 (3-4) ◽  
pp. 217-230 ◽  
Author(s):  
Jia Yu ◽  
Rajkumar Buyya

Grid technologies have progressed towards a service-oriented paradigm that enables a new way of service provisioning based on utility computing models, which are capable of supporting diverse computing services. This enables scientific applications to exploit computing resources distributed worldwide to enhance capability and performance. Many scientific applications in areas such as bioinformatics and astronomy require workflow processing, in which tasks are executed based on their control or data dependencies. Scheduling such interdependent tasks in utility Grid environments needs to consider users' QoS requirements. In this paper, we present a genetic algorithm approach to address scheduling optimization problems in workflow applications, based on two QoS constraints: deadline and budget.
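A genetic algorithm for this kind of problem typically encodes a task-to-resource mapping as the chromosome and scores each candidate against the two QoS constraints. A minimal fitness sketch follows; the additive penalty form and the weight are illustrative assumptions, not the authors' formulation:

```python
def fitness(makespan, cost, deadline, budget, penalty_weight=10.0):
    """Score a candidate schedule (smaller is better).

    makespan, cost     -- execution time and monetary cost of the schedule
    deadline, budget   -- the two QoS constraints from the user
    penalty_weight     -- hypothetical weight on constraint violations
    """
    # Penalize only the amount by which each constraint is exceeded.
    violation = max(0.0, makespan - deadline) + max(0.0, cost - budget)
    return makespan + cost + penalty_weight * violation

# A feasible schedule scores lower than one that overshoots the deadline.
feasible = fitness(makespan=100.0, cost=50.0, deadline=120.0, budget=60.0)
late = fitness(makespan=130.0, cost=50.0, deadline=120.0, budget=60.0)
```

Selection, crossover, and mutation then operate on the mappings as usual, with this fitness steering the population toward schedules that satisfy both constraints.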


Author(s):  
V. Pouli ◽  
C. Marinos ◽  
M. Grammatikou ◽  
S. Papavassiliou ◽  
V. Maglaris

Traditionally, network Service Providers specify Service Level Agreements (SLAs) to guarantee service availability and performance to their customers. However, these SLAs are rather static and span a single provider domain. Thus, they are not applicable to a multi-domain environment. In this paper, the authors present a framework for automatic creation and management of SLAs in a multi-domain environment. The framework is based on Service Oriented Computing (SOC) and contains a collection of web service calls and modules that allow for the automatic creation, configuration, and delivery of an end-to-end SLA, created by merging the per-domain SLAs. This paper also presents a procedure for monitoring the QoS guarantees stipulated in the SLA. The SLA establishment and monitoring procedures are tested through a Grid application scenario designed to perform remote control and monitoring of instrument elements distributed across the Grid.
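One way to picture the end-to-end merge is as a fold over per-domain guarantees for domains in series: availabilities multiply, while delay bounds add. A hypothetical sketch (the field names and merge rules are illustrative assumptions, not the authors' framework API):

```python
from dataclasses import dataclass

@dataclass
class SLA:
    availability: float   # fraction, e.g. 0.999
    max_delay_ms: float   # one-way delay bound in milliseconds

def merge_end_to_end(per_domain_slas):
    """Merge per-domain SLAs into a single end-to-end SLA.

    Assumes independent domains traversed in series: availabilities
    multiply, delay bounds accumulate along the path.
    """
    availability = 1.0
    delay = 0.0
    for sla in per_domain_slas:
        availability *= sla.availability
        delay += sla.max_delay_ms
    return SLA(availability=availability, max_delay_ms=delay)

# Two domains along the path yield a weaker combined guarantee.
e2e = merge_end_to_end([SLA(0.999, 10.0), SLA(0.995, 25.0)])
```

The monitoring side would then check measured availability and delay against the merged bounds rather than against any single domain's SLA.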


Author(s):  
Radu Prodan

Grid computing promises to enable a scalable, reliable, and easy-to-use computational infrastructure for e-Science. To materialize this promise, Grids need to provide full automation of the entire development and execution cycle, starting with application modeling and specification, continuing with experiment design and management, and ending with the collection and analysis of results. Often, this automation relies on the execution of workflow processes. Little is known about Grid workflow characteristics, scalability, and workload, which hampers the development of new techniques and algorithms and slows the tuning of existing ones. This chapter describes techniques developed in the ASKALON project for modeling and analyzing the executions of scientific workflows in Grid environments. The authors first outline the architecture, services, and tools developed by ASKALON and then introduce a new systematic scalability analysis technique to help scientists understand the most severe sources of performance loss that occur when executing scientific workflows in heterogeneous Grid environments. A method for analyzing workload traces is presented, focusing on the intrinsic and environment-related characteristics of scientific workflows. The authors illustrate concrete results that validate the methods for a variety of real-world applications modeled as scientific workflows and executed in the Austrian Grid environment.


Author(s):  
H. M. Thieringer

It has repeatedly been shown that conventional electron microscopes can produce very fine electron probes, thus allowing various micro-techniques such as micro recording, X-ray microanalysis, and convergent beam diffraction. In this paper the function and performance of a SIEMENS ELMISKOP 101 used as a scanning transmission electron microscope (STEM) is described. This mode of operation has some advantages over conventional transmission electron microscopy (CTEM), especially for the observation of thick specimens, in spite of somewhat longer image recording times. Fig. 1 shows schematically the ray path and the additional electronics of an ELMISKOP 101 working as a STEM. With a point cathode, and using condenser I and the objective lens as a demagnifying system, an electron probe with a half-width of about 25 Å and a typical current of 5×10⁻¹¹ A at 100 kV can be obtained in the back focal plane of the objective lens.


Author(s):  
Huang Min ◽  
P.S. Flora ◽  
C.J. Harland ◽  
J.A. Venables

A cylindrical mirror analyser (CMA) has been built with a parallel recording detection system. It is being used for angular resolved electron spectroscopy (ARES) within an SEM. The CMA has been optimised for imaging applications; the inner cylinder contains a magnetically focused and scanned, 30 kV, SEM electron-optical column. The CMA has a large inner radius (50.8 mm) and a large collection solid angle (Ω > 1 sterad). An energy resolution (ΔE/E) of 1-2% has been achieved. The design and performance of the combined SEM/CMA instrument have been described previously, and the CMA and detector system have been used for low voltage electron spectroscopy. Here we discuss the use of the CMA for ARES and present some preliminary results. The CMA has been designed for an axis-to-ring focus and uses an annular type detector. This detector consists of a channel-plate/YAG/mirror assembly which is optically coupled to either a photomultiplier for spectroscopy or a TV camera for parallel detection.


Author(s):  
Joe A. Mascorro ◽  
Gerald S. Kirby

Embedding media based upon an epoxy resin of choice and the acid anhydrides dodecenyl succinic anhydride (DDSA) and nadic methyl anhydride (NMA), catalyzed by the tertiary amine 2,4,6-tri(dimethylaminomethyl)phenol (DMP-30), are widely used in biological electron microscopy. These media possess a viscosity character that can impair tissue infiltration, particularly if original Epon 812 is utilized as the base resin. Other resins that are considerably less viscous than Epon 812 are now available as replacements. Likewise, nonenyl succinic anhydride (NSA) and dimethylaminoethanol (DMAE) are more fluid than their counterparts DDSA and DMP-30 commonly used in earlier formulations. This work utilizes novel epoxy and anhydride combinations in order to produce embedding media with desirable flow rate and viscosity parameters that, in turn, allow the medium to optimally infiltrate tissues. Specifically, embedding media based on EmBed 812 or LX 112 with NSA (in place of DDSA) and DMAE (replacing DMP-30), with NMA remaining constant, are formulated and offered as alternatives for routine biological work. Individual epoxy resins (Table I) or complete embedding media (Tables II-III) were tested for flow rate and viscosity. The novel media were further examined for their ability to infiltrate tissues and polymerize, for their sectioning and staining character, and for their strength and stability under the electron beam and column vacuum. For physical comparisons, a volume (9 ml) of either resin or media was aspirated into a capillary viscometer oriented vertically. The material was then allowed to flow out freely under the influence of gravity, and the flow time necessary for the volume to exit was recorded (Col B, C; Tables). In addition, the volume flow rate (ml flowing/second; Col D, Tables) was measured.
Viscosity (n) could then be determined by using the Hagen-Poiseuille relation for laminar flow, n = c·p/Q, where c = a geometric constant from an instrument calibration with water, p = mass density, and Q = volume flow rate. Mass weight and density of the materials were determined as well (Col F, G; Tables). Infiltration schedules utilized were short (1/2 hr 1:1, 3 hrs full resin), intermediate (1/2 hr 1:1, 6 hrs full resin), or long (1/2 hr 1:1, 6 hrs full resin) in total time. Polymerization schedules ranging from 15 hrs (overnight) through 24, 36, or 48 hrs were tested. Sections demonstrating gold interference colors were collected on unsupported 200-300 mesh grids and stained sequentially with uranyl acetate and lead citrate.
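The Hagen-Poiseuille relation above reduces the viscosity determination to three measured quantities. A minimal sketch in Python of that computation; the calibration constant, density, and drain time are hypothetical sample values, not figures from the tables:

```python
def viscosity(c, rho, Q):
    """Apparent viscosity via the Hagen-Poiseuille relation n = c * rho / Q.

    c   -- geometric constant from calibrating the viscometer with water
    rho -- mass density of the resin or medium (g/ml)
    Q   -- volume flow rate (ml/s)
    """
    return c * rho / Q

# Hypothetical run: the 9 ml aliquot drains under gravity in 120 s.
volume = 9.0              # ml aspirated into the capillary viscometer
flow_time = 120.0         # s for the full volume to exit
Q = volume / flow_time    # volume flow rate, ml/s
n = viscosity(c=0.04, rho=1.1, Q=Q)
```

A more viscous medium drains more slowly, so a longer flow time gives a smaller Q and a proportionally larger n, which is why flow time alone (Col B, C) already ranks the formulations.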

