Using HaMMLET for Bayesian segmentation of WGS read-depth data

Abstract

CNV detection requires a high-quality segmentation of genomic data. In many WGS experiments, sample and control are sequenced together in a multiplexed fashion using DNA barcoding for economic reasons. Using the differential read depth of these two conditions cancels out systematic additive errors. Due to this detrending, the resulting data is appropriate for inference using a hidden Markov model (HMM), arguably one of the principal models for labeled segmentation. However, while the usual frequentist approaches such as Baum-Welch are problematic for several reasons, they are often preferred to Bayesian HMM inference, which normally requires prohibitively long running times and exceeds a typical user's computational resources on genome-scale data. HaMMLET solves this problem using a dynamic wavelet compression scheme, which makes Bayesian segmentation of WGS data feasible on standard consumer hardware.
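The cancellation of additive errors described in the abstract can be illustrated with a short simulation. The Python sketch below is not part of HaMMLET; the bin count, bias term, and copy-number segment are hypothetical and serve only to show that a systematic bias shared by sample and control drops out of the differential read depth, leaving an approximately piecewise-constant signal plus noise that is suitable for HMM segmentation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_bins = 1000

# Systematic additive bias shared by both conditions
# (e.g. a slow positional trend across the genome).
bias = np.linspace(0.0, 5.0, n_bins)

# Hypothetical copy-number signal present only in the sample:
# a single gained segment spanning bins 400-599.
cn_signal = np.zeros(n_bins)
cn_signal[400:600] = 3.0

# Simulated per-bin read depths for control and sample.
control = 20.0 + bias + rng.normal(scale=1.0, size=n_bins)
sample = 20.0 + bias + cn_signal + rng.normal(scale=1.0, size=n_bins)

# Differential read depth: the shared bias term cancels,
# leaving (approximately) cn_signal plus noise.
diff = sample - control
```

In practice it is this differential signal, computed per bin from the multiplexed sample and control, that would be passed on for Bayesian HMM segmentation.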

Cite

Wiedenhoeft, J., & Schliep, A. (2018). Using HaMMLET for Bayesian segmentation of WGS read-depth data. In Methods in Molecular Biology (Vol. 1833, pp. 83–93). Humana Press Inc. https://doi.org/10.1007/978-1-4939-8666-8_6
