Modeling of digital background calibration with signal-dependent dithering for a 14-bit, 100-MS/s pipelined ADC

Abstract

This paper presents a digital calibration scheme for a 14-bit, 100-MS/s pipelined analog-to-digital converter (ADC). Signal-dependent pseudo-random (PN) dithering is used in the proposed model to measure the errors caused by finite gain and capacitor mismatch in the multiplying digital-to-analog converters (MDACs) and to correct them in the digital domain. Compared with fixed-magnitude PN dithering, the signal-dependent scheme allows a larger dither to be injected, so the decorrelation time can be shorter while the signal range remains the same. A behavioral model of the calibration scheme has been established; simulation results show that, at a sampling rate of 100 MS/s, a 14-bit pipelined ADC with 0.1% capacitor mismatch achieves a signal-to-noise-and-distortion ratio of 74 dB, with INL limited to within ±0.9 LSB.
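
The abstract does not include the authors' behavioral model, so the following Python sketch only illustrates the general idea of PN-dither-based background gain calibration of one pipelined stage. Everything in it is an assumption made for illustration, not the paper's implementation: a single 1.5-bit stage, an ideal backend quantizer, an LMS loop that correlates the digital output with the PN sequence, and a dither magnitude keyed to the coarse sub-ADC decision as a stand-in for the signal-dependent dithering described above; the gain error, step size, and sample counts are arbitrary.

```python
# Minimal behavioral sketch (NOT the authors' model) of background gain
# calibration of one pipelined-ADC stage using pseudo-random (PN) dithering.
import numpy as np

rng = np.random.default_rng(0)

FS = 1.0                          # full-scale reference (input in [-FS, FS])
IDEAL_GAIN = 2.0                  # ideal interstage gain of the MDAC
TRUE_GAIN = 2.0 * (1 - 0.01)      # assumed 1% gain error (finite gain / cap mismatch)

def sub_adc(vin):
    """1.5-bit coarse decision with thresholds at +/-FS/4."""
    if vin > FS / 4:
        return 1
    if vin < -FS / 4:
        return -1
    return 0

def dither_amp(d):
    """Dither magnitude keyed to the coarse decision (known in digital):
    a larger dither fits in the middle sub-range, a smaller one near the
    edges. This only approximates the signal-dependent scheme."""
    return 0.25 * FS if d == 0 else 0.10 * FS

def mdac(vin, d, dither):
    """Residue amplification with the (unknown) actual gain error."""
    return TRUE_GAIN * (vin - d * FS / 2 + dither)

n_samples = 500_000
vin = 0.9 * FS * np.sin(2 * np.pi * 0.01234 * np.arange(n_samples))
gain_est = IDEAL_GAIN             # digital estimate of the interstage gain
mu = 1e-4                         # LMS step size (illustrative)

for n in range(n_samples):
    d = sub_adc(vin[n])
    pn = 1.0 if rng.random() < 0.5 else -1.0   # +/-1 PN sequence
    dither = dither_amp(d) * pn                # signal-dependent magnitude
    backend = mdac(vin[n], d, dither)          # ideally digitized residue
    # Digital reconstruction: divide by the estimated gain, subtract the
    # known dither, and re-add the coarse decision.
    undithered = backend / gain_est - dither
    vin_hat = undithered + d * FS / 2
    # LMS update: leftover correlation with the PN sequence means the gain
    # estimate is wrong; drive it toward the actual gain.
    gain_est += mu * undithered * pn

print(f"actual gain {TRUE_GAIN:.4f}, estimated gain {gain_est:.4f}")
```

In this toy loop the residue of the input signal acts as noise on the PN correlation, which is exactly the decorrelation-time issue the abstract addresses: injecting a larger dither (here keyed to the sub-range headroom) strengthens the correlation term relative to the signal leakage, so the gain estimate settles faster for the same signal range.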

Citation (APA)

Sun, K., Wang, X., & He, L. (2010). Modeling of digital background calibration with signal-dependent dithering for A 14-bit, 100-MS/s pipelined ADC. In PrimeAsia 2010 - 2nd Asia Pacific Conference on Postgraduate Research in Microelectronics and Electronics (pp. 123–126). https://doi.org/10.1109/PRIMEASIA.2010.5604947
