Lossy volume compression using Tucker truncation and thresholding

Bibliographic Details
Published in: The Visual Computer, Vol. 32, No. 11, pp. 1433–1446
Main Authors: Ballester-Ripoll, Rafael; Pajarola, Renato
Format: Journal Article
Language: English
Published: Berlin/Heidelberg: Springer Berlin Heidelberg (Springer Nature B.V.), 01.11.2016
ISSN: 0178-2789, 1432-2315
Description
Summary: Tensor decompositions, in particular the Tucker model, are a powerful family of techniques for dimensionality reduction and are being increasingly used for compactly encoding large multidimensional arrays, images and other visual data sets. In interactive applications, volume data often needs to be decompressed and manipulated dynamically; when designing data reduction and reconstruction methods, several parameters must be taken into account, such as the achievable compression ratio, approximation error and reconstruction speed. Weighing these variables in an effective way is challenging, and here we present two main contributions to solve this issue for Tucker tensor decompositions. First, we provide algorithms to efficiently compute, store and retrieve good choices of tensor rank selection and decompression parameters in order to optimize memory usage, approximation quality and computational costs. Second, we propose a Tucker compression alternative based on coefficient thresholding and zigzag traversal, followed by logarithmic quantization on both the transformed tensor core and its factor matrices. In terms of approximation accuracy, this approach is theoretically and empirically better than the commonly used tensor rank truncation method.
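The summary contrasts plain tensor-rank truncation with the proposed thresholding-and-quantization route. As a rough illustration of that difference, the following NumPy sketch compresses a toy volume both ways. It is a minimal sketch under stated assumptions, not the authors' implementation: the HOSVD, the hard-threshold rule and the log-quantizer below are generic textbook versions, and the paper's zigzag traversal and factor-matrix quantization are left out.

```python
# Minimal sketch (an assumption, not the paper's code): Tucker compression of a
# toy volume via (a) rank truncation and (b) core thresholding + log quantization.
import numpy as np

def unfold(t, mode):
    """Mode-n unfolding: arrange the mode-`mode` fibers as matrix columns."""
    return np.moveaxis(t, mode, 0).reshape(t.shape[mode], -1)

def mode_dot(t, mat, mode):
    """n-mode product: contract tensor mode `mode` with axis 1 of `mat`."""
    res = np.tensordot(t, mat, axes=([mode], [1]))
    return np.moveaxis(res, -1, mode)

def hosvd(t):
    """Plain HOSVD: one orthogonal factor per unfolding, then project for the core."""
    factors = [np.linalg.svd(unfold(t, m), full_matrices=False)[0]
               for m in range(t.ndim)]
    core = t
    for m, U in enumerate(factors):
        core = mode_dot(core, U.T, m)
    return core, factors

def reconstruct(core, factors):
    """Invert the projection: core x_1 U1 x_2 U2 x_3 U3."""
    out = core
    for m, U in enumerate(factors):
        out = mode_dot(out, U, m)
    return out

def truncate(core, factors, ranks):
    """(a) Rank truncation: keep only the leading core slices and factor columns."""
    sl = tuple(slice(r) for r in ranks)
    return core[sl], [U[:, :r] for U, r in zip(factors, ranks)]

def threshold_log_quantize(core, frac_keep=0.02, bits=8):
    """(b) Hard-threshold the core, then log-quantize the surviving coefficients.
    (The paper's zigzag traversal and factor-matrix quantization are omitted.)"""
    cut = np.quantile(np.abs(core), 1.0 - frac_keep)  # keep the largest coefficients
    kept = np.where(np.abs(core) >= cut, core, 0.0)
    nz = kept != 0
    logm = np.log2(np.abs(kept[nz]))                  # quantize magnitudes in log space
    lo, hi = logm.min(), logm.max()
    levels = 2 ** bits - 1
    codes = np.round((logm - lo) / max(hi - lo, 1e-12) * levels)
    deq = np.zeros_like(kept)
    deq[nz] = np.sign(kept[nz]) * 2.0 ** (codes / levels * (hi - lo) + lo)
    return deq

# Toy volume: a smooth blob plus mild noise, so it compresses meaningfully.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 32)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
vol = np.exp(-4.0 * (X**2 + Y**2 + Z**2)) + 0.01 * rng.standard_normal((32, 32, 32))

core, factors = hosvd(vol)
trunc = reconstruct(*truncate(core, factors, (8, 8, 8)))
thresh = reconstruct(threshold_log_quantize(core), factors)
for name, approx in (("rank truncation", trunc), ("threshold+quantize", thresh)):
    err = np.linalg.norm(vol - approx) / np.linalg.norm(vol)
    print(f"{name}: relative error {err:.4f}")
```

The two branches are set up with comparable coefficient budgets (a rank-(8, 8, 8) core stores 512 values; keeping 2 % of a 32^3 core stores about 655), which mirrors the comparison the abstract describes, though the specific budgets here are arbitrary choices for the demo.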
DOI: 10.1007/s00371-015-1130-y