Design and Demonstration of Multi-Domain, Multi-Technology Software Defined Networks for High-Performance Cloud Computing Infrastructure

Author(s):  
M. Channegowda ◽  
D. Simeonidou ◽  
S. Peng ◽  
R. Nejabati ◽  
M. Rashidifard ◽  
...  
Author(s):  
Wolfgang Gentzsch ◽  
Burak Yenier

The adoption of cloud computing for engineering and scientific applications still lags behind, even though many cloud providers today offer powerful computing infrastructure as a service and enterprises already make routine use of it. The reasons for this slow adoption are many: complex access to clouds, inflexible software licensing, time-consuming big-data transfer, loss of control over assets, and service-provider lock-in, to name a few. Recently, however, with the advent of UberCloud's novel high-performance software container technology, many of these roadblocks are being removed. In this paper the authors describe the current status and landscape of clouds for engineers and scientists, the benefits and challenges, and how UberCloud's online solution platform and container technology reduce or even remove many of the current roadblocks, thus offering every engineer and scientist additional compute power on demand in an easily accessible way.
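The on-demand container workflow described in the abstract can be sketched as assembling a docker-style launch command for a packaged HPC solver. This is a minimal illustration, not UberCloud's actual packaging: the image name, mount layout, and environment variables below are assumptions.

```python
def container_run_command(image, job_dir, cpus, env=None):
    """Assemble a docker-style command that runs a containerized HPC job.

    The image name and the /job mount convention are illustrative only;
    real HPC container platforms define their own packaging and layout.
    """
    cmd = [
        "docker", "run", "--rm",
        f"--cpus={cpus}",          # pin the job to a CPU budget
        "-v", f"{job_dir}:/job",   # mount the user's case directory
        "-w", "/job",              # run the solver inside the job dir
    ]
    for key, val in (env or {}).items():
        cmd += ["-e", f"{key}={val}"]
    cmd.append(image)
    return cmd

# Hypothetical solver image and case directory:
print(" ".join(container_run_command(
    "example/openfoam:latest", "/data/case1", 16,
    {"OMP_NUM_THREADS": "16"})))
```

The point of the sketch is that the user supplies only a case directory and a core count; everything else (solver install, licensing hooks, environment) travels inside the container image.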


Author(s):  
Jeremy Cohen ◽  
Ioannis Filippis ◽  
Mark Woodbridge ◽  
Daniela Bauer ◽  
Neil Chue Hong ◽  
...  

Cloud computing infrastructure is now widely used in many domains, but one area where there has been more limited adoption is research computing, in particular for running scientific high-performance computing (HPC) software. The Robust Application Porting for HPC in the Cloud (RAPPORT) project took advantage of existing links between computing researchers and application scientists in the fields of bioinformatics, high-energy physics (HEP) and digital humanities, to investigate running a set of scientific HPC applications from these domains on cloud infrastructure. In this paper, we focus on the bioinformatics and HEP domains, describing the applications and target cloud platforms. We conclude that, while there are many factors that need consideration, there is no fundamental impediment to the use of cloud infrastructure for running many types of HPC applications and, in some cases, there is potential for researchers to benefit significantly from the flexibility offered by cloud platforms.


Author(s):  
Sarah Richmond ◽  
Chantal Huijbers

Recent technologies have enabled consistent and continuous collection of ecological data at high resolutions across large spatial scales. The challenge remains, however, to bring these data together and expose them to methods and tools to analyse the interaction between biodiversity and the environment. These challenges are mostly associated with the accessibility, visibility and interoperability of data, and the technical computation needed to interpret the data. Australia has invested in digital research infrastructures through the National Collaborative Research Infrastructure Strategy (NCRIS). Here we present two platforms that provide easy access to global biodiversity, climate and environmental datasets integrated with a suite of analytical tools and linked to high-performance cloud computing infrastructure. The Biodiversity and Climate Change Virtual Laboratory (BCCVL) is a point-and-click online platform for modelling species responses to environmental conditions, which provides an easy introduction into the scientific concepts of models without the need for the user to understand the underlying code. For ecologists who write their own modelling scripts, we have developed ecocloud: a new online environment that provides access to data connected with command-line analysis tools like RStudio and Jupyter Notebooks as well as a virtual desktop environment using Australia’s national cloud computing infrastructure. ecocloud is built through collaborations among key facilities within the ecosciences domain, establishing a collective long-term vision of creating an ecosystem of infrastructure that provides the capability to enable reliable prediction of future environmental outcomes. 
Underpinning these tools is an innovative training program, ecoEd, which provides cohesive training and skill development to enhance the translation of Australia’s digital research infrastructures to the ecoscience community by educating and upskilling the next generation of environmental scientists and managers. Both of these platforms are built using a best-practice microservice model that allows for complete flexibility, scalability and stability in a cloud environment. Both the BCCVL and ecocloud are open-source developments and provide opportunities for interoperability with other platforms (e.g. Atlas of Living Australia). In Australia, the same technical infrastructure is also used for a platform for the humanities and social science domain, indicating that the underlying technologies are not domain specific. We therefore welcome collaborations with other organisations to further develop these platforms for the wider bio- and ecoinformatics community. This presentation will showcase the tools, services, and underpinning infrastructure alongside our training and engagement framework as an exemplar in building platforms for next generation biodiversity science.
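The kind of species-response modelling the BCCVL exposes through point-and-click can be illustrated with the simplest case, a climate-envelope (BIOCLIM-style) model: a species is predicted present wherever every environmental variable falls inside the range observed at known occurrence points. The occurrence records and variables below are invented for illustration and do not come from the platform.

```python
def envelope_model(occurrences):
    """Fit a minimal climate-envelope model from occurrence records.

    Each record is a tuple of environmental values; the fitted model
    predicts presence when every variable lies within the observed
    min-max envelope. Real platforms offer far richer model families.
    """
    lows = [min(vals) for vals in zip(*occurrences)]
    highs = [max(vals) for vals in zip(*occurrences)]

    def predict(point):
        return all(lo <= x <= hi for x, lo, hi in zip(point, lows, highs))

    return predict

# Invented records: (mean annual temperature in deg C, annual precipitation in mm)
records = [(18.2, 950), (19.5, 1100), (17.8, 1020), (20.1, 980)]
suitable = envelope_model(records)
print(suitable((19.0, 1000)))  # True: inside the envelope
print(suitable((25.0, 400)))   # False: too hot and too dry
```

A user of such a platform never sees this code; the value of the point-and-click interface is exactly that the envelope logic, data wrangling, and compute provisioning happen behind the scenes.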


2016 ◽  
Vol 11 (1) ◽  
pp. 72-80
Author(s):  
O.V. Darintsev ◽  
A.B. Migranov

This article considers one possible approach to synthesizing group control for mobile robots based on the use of cloud computing. A distinctive feature of the proposed techniques is that the architecture of the control and information systems, the methods of organizing information exchange, and related components adequately reflect the specifics of the application area and of the tasks solved by the robot group. The proposed approach increases the reliability and robustness of robot collectives and lowers the requirements on on-board computers while preserving high overall performance.
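One way the cloud can add robustness to a robot collective, as the abstract claims, is by recomputing the group's task allocation centrally whenever a robot fails, so its tasks are redistributed to the survivors. The round-robin allocator below is a deliberately simple sketch of that idea (task and robot names are hypothetical), not the authors' actual synthesis method.

```python
def allocate_tasks(tasks, robots, failed=frozenset()):
    """Round-robin allocation of group tasks to available robots.

    Intended to run in the cloud: when a robot is reported failed,
    the plan is simply recomputed without it, reassigning its tasks.
    This is an illustrative stand-in for a real allocation algorithm.
    """
    active = [r for r in robots if r not in failed]
    if not active:
        raise RuntimeError("no active robots remain")
    plan = {robot: [] for robot in active}
    for i, task in enumerate(tasks):
        plan[active[i % len(active)]].append(task)
    return plan

tasks = ["map_A", "map_B", "map_C", "map_D"]
print(allocate_tasks(tasks, ["r1", "r2", "r3"]))
# After r2 reports a fault, the cloud re-plans without it:
print(allocate_tasks(tasks, ["r1", "r2", "r3"], failed={"r2"}))
```

Because the heavy re-planning runs in the cloud rather than on each robot, the on-board computers only need to execute their assigned task lists, which is the requirement reduction the abstract describes.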


2020 ◽  
Vol 13 (3) ◽  
pp. 313-318 ◽  
Author(s):  
Dhanapal Angamuthu ◽  
Nithyanandam Pandian

Background: Cloud computing is the modern trend in high-performance computing. It has become very popular due to its characteristics of anywhere availability, elasticity, ease of use, and cost-effectiveness. Although the cloud grants various benefits, it has associated issues and challenges that prevent organizations from adopting it.

Objective: The objective of this paper is to cover several perspectives of cloud computing. This includes a basic definition of the cloud and its classification by delivery and deployment model. The broad classes of issues and challenges that organizations face in adopting the cloud computing model are explored; examples of these broad classes are data-related issues and service-availability issues. The detailed sub-classification of each class is discussed: data-related issues, for instance, are further classified into data security, data integrity, data location, and multitenancy issues. This paper also covers the typical problem of vendor lock-in, and analyzes and describes the various possible insider attacks unique to the cloud environment.

Results: Guidelines and recommendations for the different issues and challenges are discussed. Most importantly, potential research areas in the cloud domain are explored.

Conclusion: This paper discusses cloud computing, its classifications, and the several issues and challenges faced in adopting the cloud. Guidelines and recommendations for these issues and challenges are covered, and potential research areas in the cloud domain are captured. This helps researchers, academicians, and industry to focus on and address the current challenges faced by customers.
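The two-level classification the paper describes (broad classes of issues, each with detailed sub-classes) maps naturally onto a nested data structure. The sketch below encodes a fragment of such a taxonomy; the category names follow the abstract, while the individual sub-items are illustrative assumptions rather than the paper's full list.

```python
# Hypothetical encoding of the paper's issue taxonomy as a nested mapping.
# Top-level keys are the broad classes from the abstract; the sub-items
# under "Service availability" and "Vendor lock-in" are illustrative.
CLOUD_ISSUES = {
    "Data": ["security", "integrity", "location", "multitenancy"],
    "Service availability": ["outages", "SLA violations"],
    "Vendor lock-in": ["proprietary APIs", "data egress cost"],
}

def flatten(taxonomy):
    """Yield (broad class, sub-issue) pairs, e.g. for a review checklist."""
    for category, issues in taxonomy.items():
        for issue in issues:
            yield category, issue

for category, issue in flatten(CLOUD_ISSUES):
    print(f"{category}: {issue}")
```

Keeping the taxonomy as data rather than prose makes it straightforward to generate adoption checklists or to cross-tabulate issues against the delivery and deployment models the paper classifies.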


2021 ◽  
pp. 108151
Author(s):  
Arwa Mohamed ◽  
Mosab Hamdan ◽  
Suleman Khan ◽  
Ahmed Abdelaziz ◽  
Sharief F. Babiker ◽  
...  

2014 ◽  
Vol 687-691 ◽  
pp. 3733-3737
Author(s):  
Dan Wu ◽  
Ming Quan Zhou ◽  
Rong Fang Bie

Massive image processing places high demands on processor and memory: it requires a high-performance processor and large-capacity memory, which single-processor or single-core designs with traditional memory cannot satisfy. This paper introduces cloud computing into a massive image processing system. Through the cloud, the system expands its virtual space, conserves computer resources and improves the efficiency of image processing. The system processor uses a multi-core DSP parallel processor, and a visualization parameter-setting window and result output were developed with VC software. Simulation yields the image-processing speed curve and the system's image-adaptation curve, providing a technical reference for the design of large-scale image processing systems.
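The split-apply-combine pattern behind such a parallel image pipeline can be sketched in a few lines: the image is cut into tiles, each tile is filtered by a separate worker, and the results are recombined. The thread pool below merely stands in for the paper's multi-core DSP hardware, and the brighten filter is an arbitrary example operation.

```python
from concurrent.futures import ThreadPoolExecutor

def brighten_tile(tile):
    """Example per-tile filter: add 40 to each pixel, clamped to 255.

    A tile is a list of rows of 8-bit pixel values.
    """
    return [[min(pixel + 40, 255) for pixel in row] for row in tile]

def process_image(tiles, workers=4):
    """Farm tiles out to a worker pool and collect results in order.

    A pool of threads stands in here for the multi-core DSP of the
    real system; the split-apply-combine structure is the same.
    """
    with ThreadPoolExecutor(workers) as pool:
        return list(pool.map(brighten_tile, tiles))

# Two tiny 2x2 tiles of an invented image:
tiles = [[[0, 100], [200, 255]], [[50, 60], [70, 80]]]
print(process_image(tiles, workers=2))
```

Because `pool.map` preserves input order, the processed tiles can be stitched back into the full image without any bookkeeping beyond the original tiling.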
