Genome compression: A novel approach for large collections


Abstract

Motivation: Genomic repositories are growing rapidly, as witnessed by the 1000 Genomes and UK10K projects. Compression of multiple genomes of the same species has therefore become an active research area in recent years. The well-known large redundancy in human sequences is not easy to exploit because of the huge memory requirements of traditional compression algorithms.

Results: We show how to obtain a compression ratio several times higher than the best reported results on two large genome collections (1092 human and 775 plant genomes). Our inputs are variant call format (VCF) files restricted to their essential fields. More precisely, our novel Ziv-Lempel-style compression algorithm squeezes a single human genome to ∼400 KB. The key to high compression is to look for similarities across the whole collection, not just against one reference sequence, as is typical of existing solutions.

Availability: http://sun.aei.polsl.pl/tgc (also as Supplementary Material) under a free license.

Supplementary data: Supplementary data are available at Bioinformatics online.

Contact: sebastian.deorowicz@polsl.pl.

© The Author 2013. Published by Oxford University Press.

Cite

APA

Deorowicz, S., Danek, A., & Grabowski, S. (2013). Genome compression: A novel approach for large collections. Bioinformatics, 29(20), 2572–2578. https://doi.org/10.1093/bioinformatics/btt460
