Remediation, convergence, and big data

Author(s):  
Asta Zelenkauskaite

The era of multiplatform media and big data provides new opportunities to reconsider data access by media companies. Outlined here is the discussion surrounding data access from media-institutional and user-centric perspectives in the contexts of digitalization and big data. The discussion includes technological affordances that can be geared toward users or that merely reinforce media companies’ prominence. The limitation of the information architecture, however, lies in its structure: it does not facilitate user navigation across multiple content streams. Media companies concentrate access around their own cross-platform content. Despite technological feasibility, media companies continue to choose cross-platform architectures that structurally limit users. These conceptual limits of cross-platform design are discussed within the broader socioeconomic landscape of mass media digitalization and big data.

2019, Vol 6 (1)
Author(s):  
Mahdi Torabzadehkashi ◽  
Siavash Rezaei ◽  
Ali HeydariGorji ◽  
Hosein Bobarshad ◽  
Vladimir Alves ◽  
...  

Abstract: In the era of big data applications, the demand for more sophisticated data centers and high-performance data processing mechanisms is increasing drastically. Data originally reside in storage systems; to process them, application servers must fetch them from storage devices, which imposes the cost of moving data through the system. This cost grows with the distance between the processing engines and the data, which is the key motivation for the emergence of distributed processing platforms such as Hadoop that move processing closer to the data. Computational storage devices (CSDs) push the “move processing to data” paradigm to its ultimate boundary by deploying embedded processing engines inside storage devices. In this paper, we introduce Catalina, an efficient and flexible computational storage platform that provides a seamless environment for in-place data processing. Catalina is the first CSD equipped with a dedicated application processor running a full-fledged operating system that provides filesystem-level data access to applications; thus, a vast spectrum of applications can be ported to run on Catalina CSDs. Owing to these unique features, to the best of our knowledge, Catalina is the only in-storage processing platform that can be seamlessly deployed in clusters to run distributed applications such as Hadoop MapReduce and HPC applications in-place, without any modifications to the underlying distributed processing framework. As a proof of concept, we built a fully functional Catalina prototype and a platform equipped with 16 Catalina CSDs, and ran Intel HiBench Hadoop and HPC benchmarks to investigate the benefits of deploying Catalina CSDs in distributed processing environments. The experimental results show up to 2.2× improvement in performance and 4.3× reduction in energy consumption for Hadoop MapReduce benchmarks. Additionally, thanks to the Neon SIMD engines, the performance and energy efficiency of DFT algorithms improve by up to 5.4× and 8.9×, respectively.
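The “move processing to data” idea behind Hadoop MapReduce can be illustrated with a minimal, self-contained sketch (plain Python; this is not the Catalina platform or the actual Hadoop API, only an assumed word-count example of the paradigm): the map phase runs where each data partition resides, and only small intermediate key–value pairs cross the network to be reduced.

```python
from collections import defaultdict

# Minimal MapReduce word-count sketch: each "node" maps over its
# local data partition, and only the small (word, count) pairs are
# shuffled to the reducer -- the raw data itself never moves.

def map_phase(partition):
    """Runs on the node (or storage device) that holds the data."""
    for line in partition:
        for word in line.split():
            yield (word, 1)

def reduce_phase(pairs):
    """Aggregates the small intermediate results that were shuffled."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# Two data partitions standing in for two storage nodes.
partitions = [["big data big", "storage"], ["big storage"]]
intermediate = [kv for p in partitions for kv in map_phase(p)]
print(reduce_phase(intermediate))  # {'big': 3, 'data': 1, 'storage': 2}
```

In-storage processing such as Catalina's takes this one step further: the map phase executes inside the storage device itself rather than on a server that first fetches the data.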


2019
Author(s):  
J-Donald Tournier ◽  
Robert Smith ◽  
David Raffelt ◽  
Rami Tabbara ◽  
Thijs Dhollander ◽  
...  

Abstract: MRtrix3 is an open-source, cross-platform software package for medical image processing, analysis and visualization, with a particular emphasis on the investigation of the brain using diffusion MRI. It is implemented using a fast, modular and flexible general-purpose code framework for image data access and manipulation, enabling efficient development of new applications, whilst retaining high computational performance and a consistent command-line interface between applications. In this article, we provide a high-level overview of the features of the MRtrix3 framework and general-purpose image processing applications provided with the software.


2020, Vol 1, pp. 1-23
Author(s):  
Majid Hojati ◽  
Colin Robertson

Abstract. With new forms of digital spatial data driving new applications for monitoring and understanding environmental change, there are growing demands on traditional GIS tools for spatial data storage, management and processing. Discrete Global Grid Systems (DGGS) are methods to tessellate the globe into multiresolution grids, representing a global spatial fabric capable of storing heterogeneous spatial data and offering improved performance in data access, retrieval, and analysis. While DGGS-based GIS may hold potential for next-generation big data GIS platforms, few studies have tried to implement them as a framework for operational spatial analysis. Cellular Automata (CA) is a classic dynamic modeling framework that has been used with the traditional raster data model for environmental modeling tasks such as wildfire spread and urban expansion. The main objectives of this paper are to (i) investigate the possibility of using DGGS for running dynamic spatial analysis, (ii) evaluate CA as a generic data model for modeling dynamic phenomena within a DGGS data model, and (iii) evaluate an in-database approach to CA modelling. To do so, a case study in wildfire spread modelling is developed. Results demonstrate that a DGGS data model not only provides the ability to integrate different data sources, but also provides a framework for spatial analysis without geometry-based operations. This results in a simplified architecture and a common spatial fabric to support the development of a wide array of spatial algorithms. While considerable work remains to be done, CA modelling within a DGGS-based GIS is a robust and flexible modelling framework for big-data GIS analysis in an environmental monitoring context.
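The cellular-automaton approach described above can be sketched on an ordinary square raster (a simplification: the paper's DGGS cells and in-database execution are replaced here with a plain Python grid, and the three-state deterministic spread rule is an illustrative assumption, not the authors' model):

```python
# Deterministic CA wildfire sketch on a square grid.
# States: 0 = unburned fuel, 1 = burning, 2 = burned out.
FUEL, BURNING, BURNED = 0, 1, 2

def step(grid):
    """One CA update: burning cells burn out, and fuel cells with at
    least one burning 4-neighbour ignite (deterministic spread rule)."""
    rows, cols = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == BURNING:
                new[r][c] = BURNED
            elif grid[r][c] == FUEL:
                neighbours = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
                if any(0 <= i < rows and 0 <= j < cols
                       and grid[i][j] == BURNING for i, j in neighbours):
                    new[r][c] = BURNING
    return new

# Ignite the centre of a 3x3 fuel patch and run two steps.
grid = [[FUEL] * 3 for _ in range(3)]
grid[1][1] = BURNING
for _ in range(2):
    grid = step(grid)
print(grid)  # -> [[1, 2, 1], [2, 2, 2], [1, 2, 1]]
```

In a DGGS setting, the only structural change is the neighbourhood lookup: instead of the four raster offsets, each cell queries its adjacent DGGS cell identifiers, which is what lets the same transition rule run over a global, multiresolution grid.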


2019, Vol 4 (2)
Author(s):  
Putri Maulina

The development of information and communication technology now pervades almost every aspect of people's lives. One effect of this technological progress is a massive change in how mass media companies are managed. A mass media company must be able to increase its capacity in line with technological advances, for example by incorporating media with different platforms into a single company's management. Using many media platforms means integrating a manual management system into various management systems based on modern technology; in mass media companies, this integration of manual systems into modern ones is referred to as media convergence. Tempo Media is a large print media company that has also integrated its management system across various types of media platforms. This change in Tempo's management system naturally affected the policies by which the company operates.

Keywords: Policy, Technology, Media Convergence, Tempo Media


Author(s):  
R. Lance Holbert

This chapter offers a systematic assessment of DICTION’s ability to address a wide range of media content. Each of the media-related works in this volume reflects a unique mix of communication inputs, and DICTION proves itself able to generate valid and reliable insights on a diverse range of material. In addition, the chapter focuses on a series of challenges (e.g., Message Tailoring, Hypertext, Interactivity) and opportunities (e.g., big data) for DICTION in relation to the study of media content. The program and the researchers who utilize it need to continue to evolve with the changing media landscape in order to generate practical knowledge that is relevant to improving communication.

