Compressed data organization for high throughput parallel entropy coding

2015 ◽  
Author(s):  
Amir Said ◽  
Abo-Talib Mahfoodh ◽  
Sehoon Yea

2014 ◽  
Vol 81 (1) ◽  
pp. 59-69 ◽  
Author(s):  
Jung-Ah Choi ◽  
Yo-Sung Ho

Algorithms ◽  
2020 ◽  
Vol 13 (7) ◽  
pp. 159 ◽  
Author(s):  
Shinichi Yamagiwa ◽  
Eisaku Hayakawa ◽  
Koichi Marumo

Driven by strong demand for very high-speed processor I/O, the physical performance of hardware I/O has grown drastically over the past decade. However, recent Big Data applications still demand larger I/O bandwidth and lower latency. Because raw I/O performance is no longer improving so drastically, it is time to consider other ways to increase it. To address this challenge, we focus on lossless data compression technology to reduce the amount of data in the communication path itself. Recent Big Data applications process data streams that flow continuously and, because of their speed, cannot tolerate stalled processing; an efficient hardware-based data compression technology is therefore required. This paper proposes a novel lossless data compression method, called ASE coding. It encodes streaming data by applying an entropy coding approach: ASE coding instantly assigns the fewest possible bits to each piece of compressed data according to the number of occupied entries in a look-up table. This paper describes the mechanism of ASE coding in detail. Furthermore, the paper presents performance evaluations demonstrating that ASE coding adaptively shrinks streaming data and also runs on a small amount of hardware resources, without stalling or buffering any part of the data stream.
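The abstract's core idea is that the code length adapts to how many look-up-table entries are currently occupied. The following is a minimal Python sketch of that adaptive index-width idea only; the function name, table policy, and escape format are illustrative assumptions, not the published ASE algorithm.

```python
import math

def ase_like_encode(stream, table_size=16):
    """Toy sketch of the adaptive-width idea behind ASE coding:
    a symbol found in a small look-up table is replaced by its table
    index, written with just enough bits to address the currently
    occupied entries. Illustrative only, not the published algorithm."""
    table = []     # look-up table of recently seen byte symbols
    bits_out = []  # encoded output: one bit string per input symbol
    for sym in stream:
        if sym in table:
            # hit: index width depends on occupied entries, not table_size
            width = max(1, math.ceil(math.log2(len(table))))
            bits_out.append("1" + format(table.index(sym), f"0{width}b"))
        else:
            # miss: escape flag followed by the raw 8-bit literal
            bits_out.append("0" + format(sym, "08b"))
            if len(table) == table_size:
                table.pop(0)  # evict the oldest entry (FIFO, an assumption)
            table.append(sym)
    return bits_out
```

With a repetitive input the encoded stream shrinks well below the raw 8 bits per symbol, and each symbol is emitted the moment it arrives, which mirrors the stall-free, buffer-free property the abstract emphasizes.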


2006 ◽  
Vol 321-323 ◽  
pp. 1262-1265
Author(s):  
Chung Hyo Kim ◽  
Tae Sik Kong ◽  
Young Jun Lee ◽  
Hee Dong Kim ◽  
Young Ho Ju

H.264/AVC has been adopted as a next-generation moving picture compression standard. Context-based Adaptive Binary Arithmetic Coding (CABAC) is the major entropy coding algorithm employed in H.264/AVC. Although much of the performance gain of H.264/AVC results from CABAC, its decoding complexity makes a high-throughput decoder difficult to implement. Although CABAC avoids multiplication, the algorithm is inherently sequential and requires substantial computation to update its key variables: the range, the offset, and the context variables. Fast decoding performance is therefore difficult to achieve. In this paper, a prediction scheme is proposed that decodes up to two bits at a time, reducing overall decoding time. A CABAC decoder based on the proposed prediction scheme reduces total cycles by 24% compared to conventional decoders.
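The sequential bottleneck the abstract describes comes from each bin's decode depending on the range/offset left by the previous bin. Below is a toy Python model of that dependency chain, with simplified arithmetic (a scaled LPS probability instead of the H.264 state tables, and no renormalization); function names and the two-bin chaining are illustrative assumptions, not the paper's circuit.

```python
def decode_bin(range_, offset, p_lps_scaled):
    """One step of a simplified binary arithmetic decoder.
    p_lps_scaled is the LPS probability scaled to 0..256 (a toy stand-in
    for the CABAC state tables). Returns (bin, new_range, new_offset)."""
    r_lps = (range_ * p_lps_scaled) >> 8  # LPS sub-range
    r_mps = range_ - r_lps                # MPS sub-range
    if offset < r_mps:
        return 0, r_mps, offset           # MPS path
    return 1, r_lps, offset - r_mps       # LPS path

def decode_two_bins(range_, offset, p1, p2):
    """Chains two decodes to expose the data dependency: bin 2 cannot
    start until bin 1's range/offset are known. A prediction scheme like
    the paper's would compute both candidate updates for bin 2 in
    parallel and select one, collapsing the two steps into one cycle."""
    b1, range_, offset = decode_bin(range_, offset, p1)
    b2, range_, offset = decode_bin(range_, offset, p2)
    return (b1, b2), range_, offset
```

In hardware, the speculative version evaluates the MPS and LPS outcomes of the second bin concurrently with the first bin's decode, so only a multiplexer sits on the critical path between the two bins.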


2012 ◽  
Vol 22 (12) ◽  
pp. 1778-1791 ◽  
Author(s):  
Vivienne Sze ◽  
Madhukar Budagavi

2007 ◽  
Vol 177 (4S) ◽  
pp. 52-53
Author(s):  
Stefano Ongarello ◽  
Eberhard Steiner ◽  
Regina Achleitner ◽  
Isabel Feuerstein ◽  
Birgit Stenzel ◽  
...  
