Mobile GPUs
Recently Published Documents


TOTAL DOCUMENTS: 42 (FIVE YEARS: 4)

H-INDEX: 5 (FIVE YEARS: 0)

2021 ◽ Vol 20 (5s) ◽ pp. 1-25
Author(s): Chanyoung Oh, Junhyuk So, Sumin Kim, Youngmin Yi

Over the past several years, the need for on-device deep learning has been rapidly increasing, and efficient CNN inference on mobile platforms has been actively researched. Sparsity exploitation has been one of the most active research themes, but most studies focus on weight sparsity obtained by weight pruning. Activation sparsity, in contrast, requires compression at runtime for every input tensor. Hence, research on activation sparsity has mainly targeted NPUs, which can process it efficiently with dedicated hardware logic. In this paper, we observe that natural activation sparsity is difficult to exploit for accelerating CNN inference on mobile GPUs, and that the widely used CSR-based sparse convolution is not sufficiently effective due to its compression overhead. We propose several novel sparsification methods that can boost activation sparsity without harming accuracy. In particular, we selectively sparsify some layers to an extremely high sparsity and choose sparse or dense convolution on a per-layer basis. Further, we present an efficient sparse convolution method that requires no compression and demonstrate that it can be faster than the CSR implementation. With ResNet-50, we achieved a 1.88× speedup over TFLite on a Mali-G76 GPU.
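The abstract's core idea of boosting activation sparsity can be illustrated with a minimal sketch. This is not the authors' method; it is a hypothetical magnitude-threshold rule, shown only to make the notions of "natural" versus "boosted" activation sparsity concrete (the names `activation_sparsity` and `sparsify` are illustrative, not from the paper):

```python
import numpy as np

def activation_sparsity(x):
    """Fraction of exactly-zero entries in an activation tensor."""
    return float(np.mean(x == 0))

def sparsify(x, threshold):
    """Zero out small-magnitude activations to boost sparsity.

    Hypothetical rule for illustration; the paper's actual
    per-layer sparsification method is more involved.
    """
    out = x.copy()
    out[np.abs(out) < threshold] = 0.0
    return out

# Toy post-ReLU activations: roughly half the entries are already zero.
rng = np.random.default_rng(0)
act = np.maximum(rng.standard_normal((4, 8, 8)), 0.0)

base = activation_sparsity(act)
boosted = activation_sparsity(sparsify(act, threshold=0.5))
print(f"natural sparsity: {base:.2f}, boosted: {boosted:.2f}")
```

Whether the boosted sparsity pays off then depends, as the paper argues, on whether the sparse-convolution path avoids per-tensor compression overhead on the GPU.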


2021 ◽
Author(s): Andrey Ignatov, Kim Byeoung-Su, Radu Timofte, Angeline Pouget, Fenglong Song, ...

IEEE Access ◽ 2021 ◽ pp. 1-1
Author(s): Sumin Kim, Gunju Park, Youngmin Yi

Author(s): Shiqi Jiang, Lihao Ran, Ting Cao, Yusen Xu, Yunxin Liu

Author(s): Bolan Jiang, Jeng-Hau Lin, Adarsh Golikeri, Li He, Hongqiang Wang, ...

2019 ◽ Vol 30 (2) ◽ pp. 473-485
Author(s): Enrique de Lucas, Pedro Marcuello, Joan-Manuel Parcerisa, Antonio Gonzalez
