Are global computing systems useful? Comparison of client-server global computing systems Ninf, NetSolve versus CORBA

Author(s):  
T. Suzumura ◽  
T. Nakagawa ◽  
S. Matsuoka ◽  
H. Nakada ◽  
S. Sekiguchi

The Server ◽
2018 ◽  
pp. 297-338
Author(s):  
Markus Krajewski

This chapter considers the forms subalterns assume under the conditions of advanced technology. As servers, demons, or other virtual creatures, they work without being seen or ever taking a break. By means of a comparative analysis of early computing systems, the mainframes, and the first machines from the age of personal computers, the discussion traces the conceptual and historical transfer from servant to server. The analysis is based on fieldwork conducted in California in the 1970s, when researchers at the legendary Xerox PARC center took a closer look at the formative conditions of electronic services. What came to define the communicative structure of the Internet was a specific informational architecture, the so-called client-server principle, developed in Silicon Valley after 1973.


2009 ◽  
Vol 236 ◽  
pp. 117-130 ◽  
Author(s):  
Antonio Bucchiarone ◽  
Greg Dennis ◽  
Stefania Gnesi

Author(s):  
Colin English ◽  
Waleed Wagealla ◽  
Paddy Nixon ◽  
Sotirios Terzis ◽  
Helen Lowe ◽  
...  

Author(s):  
Patricia Logan ◽  
Charles Lutz

What is the nature of the process of implementing a new technology? How should the dynamics of implementing a new technology be studied? What research methods are best suited to the study of complex issues of social and organizational impact arising from the implementation of a new technology? Client-server computing represents a significant new technology that has not been a focus of research investigations. As companies pursue client-server technology as a replacement for legacy computing systems, there is a need to provide practitioners with grounded research that discovers patterns of organizational and social dynamics influencing the successful outcome of a transition to this new technology. This article suggests that naturalistic research studies can formulate realistic business foundations for the successful implementation of client-server computing.


Author(s):  
Grzegorz Chmaj ◽  
Krzysztof Walkowiak ◽  
Michał Tarnawski ◽  
Michał Kucharzak

Abstract Recently, distributed computing systems have been gaining much attention due to a growing demand for various kinds of effective computations in both industry and academia. In this paper, we focus on Peer-to-Peer (P2P) computing systems, also called public-resource computing systems or global computing systems. P2P computing systems, contrary to grids, use personal computers and other relatively simple electronic equipment (e.g., the PlayStation console) to process sophisticated computational projects. A significant example of the P2P computing idea is the BOINC (Berkeley Open Infrastructure for Network Computing) project. To improve the performance of the computing system, we propose to use the P2P approach to distribute results of computational projects, i.e., results are transmitted in the system as in P2P file-sharing systems (e.g., BitTorrent). In this work, we concentrate on offline optimization of the P2P computing system, including two elements: scheduling of computations and data distribution. The objective is to minimize the system OPEX cost related to data processing and data transmission. We formulate an Integer Linear Programming (ILP) model of the system and apply this formulation to obtain optimal results using the CPLEX solver. Next, we propose two heuristic algorithms that provide results very […]
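The joint scheduling/distribution objective described in this abstract can be illustrated with a toy model. The sketch below is purely hypothetical (the node names, cost values, and uniform all-to-all result distribution are assumptions, not the paper's actual ILP): each computational block is assigned to one node, the assigned node pays a processing cost, and distributing the block's result to the remaining nodes adds a per-link transfer cost. For such a tiny instance, brute-force enumeration stands in for the CPLEX solver.

```python
from itertools import product

# Hypothetical per-node costs (not from the paper).
proc_cost = {"n1": 4, "n2": 6, "n3": 5}   # cost for a node to process one block
xfer_cost = {"n1": 1, "n2": 2, "n3": 1}   # cost to send one result over one link

blocks = ["b1", "b2", "b3"]
nodes = list(proc_cost)

def opex(assignment):
    """Total OPEX: processing each block plus distributing its result
    from the processing node to every other node."""
    total = 0
    for node in assignment:
        total += proc_cost[node]                      # processing cost
        total += xfer_cost[node] * (len(nodes) - 1)   # result distribution
    return total

# Enumerate every block-to-node assignment and pick the cheapest.
best = min(product(nodes, repeat=len(blocks)), key=opex)
print(dict(zip(blocks, best)), opex(best))
```

With these numbers every block is cheapest on n1 (4 + 1·2 = 6 per block), so the optimum places all three blocks there; a real instance would also model link capacities and node limits, which is where an ILP formulation earns its keep.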

