Publication

Understanding and Modeling Lossy Compression Schemes on HPC Scientific Data

Publication Type
Conference Paper
Book Title
2018 IEEE International Parallel and Distributed Processing Symposium (IPDPS)
Publication Date
2018
Page Numbers
348-357
Publisher Location
New Jersey, United States of America
Conference Name
32nd IEEE International Parallel and Distributed Processing Symposium (IPDPS 2018)
Conference Location
Vancouver, Canada
Conference Sponsor
IEEE
Conference Date
-

Scientific simulations generate large amounts of floating-point data, which are often not very compressible with traditional reduction schemes such as deduplication or lossless compression. The emergence of lossy floating-point compression holds promise to satisfy the data reduction demands of HPC applications; however, lossy compression has not been widely adopted in production science. We believe a fundamental reason is a lack of understanding of the benefits, pitfalls, and performance of lossy compression on scientific data. In this paper, we conduct a comprehensive study of state-of-the-art lossy compressors, including ZFP, SZ, and ISABELA, using real and representative HPC datasets. Our evaluation reveals the complex interplay among compressor design, data features, and compression performance. The impact of reduced accuracy on data analytics is also examined through a case study of fusion blob detection, offering domain scientists insights into what to expect from fidelity loss. Furthermore, the trial-and-error approach to understanding compression performance incurs substantial compute and storage overhead. To this end, we propose a sampling-based estimation method that extrapolates the reduction ratio from data samples, guiding domain scientists toward more informed data reduction decisions.
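
To make the sampling idea concrete, the sketch below estimates a compression ratio by compressing only a small random sample of fixed-size blocks and extrapolating to the whole dataset. It is a minimal illustration of the general approach, not the estimator proposed in the paper; the function name, block size, sample fraction, and the zlib stand-in compressor (a lossy compressor such as ZFP or SZ would be plugged in via its Python bindings in practice) are all assumptions.

    import zlib
    import numpy as np

    def estimate_compression_ratio(data, compress_fn, block_size=4096,
                                   sample_fraction=0.01, seed=0):
        """Estimate the overall compression ratio from a random sample of blocks.

        `compress_fn` maps a 1-D array (one block) to compressed bytes.
        Illustrative sketch only; not the estimator from the paper.
        """
        flat = np.ravel(data)
        n_blocks = max(1, len(flat) // block_size)
        rng = np.random.default_rng(seed)
        n_sample = max(1, int(n_blocks * sample_fraction))
        picks = rng.choice(n_blocks, size=n_sample, replace=False)

        raw_bytes = 0
        compressed_bytes = 0
        for b in picks:
            block = flat[b * block_size:(b + 1) * block_size]
            raw_bytes += block.nbytes
            compressed_bytes += len(compress_fn(block))

        # Ratio over the sampled blocks, used as the extrapolated ratio
        # for the full dataset.
        return raw_bytes / compressed_bytes

    if __name__ == "__main__":
        # Lossless zlib is used here only so the example runs anywhere;
        # substitute a lossy floating-point compressor in real use.
        data = np.random.rand(1_000_000).astype(np.float32)
        ratio = estimate_compression_ratio(data,
                                           lambda blk: zlib.compress(blk.tobytes()))
        print(f"estimated compression ratio: {ratio:.2f}x")

The point of the sketch is the cost trade-off: only a small fraction of the data is compressed, so the estimate is far cheaper than a full trial-and-error compression pass, at the price of some extrapolation error when block compressibility varies across the dataset.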