Title: Background Calibration With Piecewise Linearized Error Model for CMOS Pipeline A/D Converter
Authors: Farhat, Nabil H.; Yuan, Jie; Van der Spiegel, Jan
Dates: 2023-05-22; 2023-05-22; 2008-02-01; 2008-04-09
URI: https://repository.upenn.edu/handle/20.500.14332/33544
Type: Article
Keywords: analog-to-digital converter (ADC); background calibration; CMOS ADC; nonlinear error calibration; pipeline ADC

Abstract: A new all-digital background calibration method, which uses a piecewise linear model to estimate the stage error pattern, is presented. The method corrects both linear and nonlinear errors. The proposed procedure converges within a few milliseconds and requires low hardware overhead, without the need for a high-capacity ROM or RAM. The calibration procedure is tested on a 0.6-µm CMOS pipeline analog-to-digital converter (ADC), which suffers from a high degree of nonlinear error. The calibration yields improvements of 17 and 26 dB in signal-to-noise-and-distortion ratio (SNDR) and spurious-free dynamic range (SFDR), respectively, for a Nyquist input signal at a sampling rate of 33 MSample/s. The calibrated ADC achieves an SNDR of 70.3 dB and an SFDR of 81.3 dB at 33 MSample/s, corresponding to a resolution of about 12 bits.
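For illustration only, the sketch below shows the general idea behind a piecewise-linear error model used as a digital post-correction: the code range is split into a handful of segments, an error estimate is stored at each segment boundary, and the correction for any raw code is obtained by linear interpolation between those estimates. This is a minimal sketch under assumed conventions, not the authors' implementation; the function name, segment count, and error values are hypothetical placeholders, whereas in the paper the estimates would be produced by the background calibration engine.

    # Illustrative piecewise-linear correction of ADC output codes (hypothetical values).
    import numpy as np

    def piecewise_linear_correction(codes, breakpoints, error_at_breakpoints):
        """Subtract an error estimate obtained by linear interpolation
        between calibrated error values stored at the segment breakpoints."""
        error = np.interp(codes, breakpoints, error_at_breakpoints)
        return codes - error

    # Example: 8 segments over a 12-bit code range, with made-up error estimates.
    breakpoints = np.linspace(0, 4095, 9)               # segment edges
    error_lut = np.array([0.0, 1.2, 2.5, 1.8, 0.4,      # hypothetical calibrated
                          -0.9, -2.1, -1.3, 0.0])       # error values per edge

    raw_codes = np.array([100, 1024, 2048, 3500])
    corrected = piecewise_linear_correction(raw_codes, breakpoints, error_lut)
    print(corrected)

Because only the breakpoint error values need to be stored and the in-between correction is interpolated, such a scheme avoids a full code-by-code lookup table, which is consistent with the abstract's claim of low hardware overhead without a high-capacity ROM or RAM.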