Abstract
An A/D converter transforms the analog output of a detector into digital data. This paper examines the influence of the sampling interval of the detection system on the relative standard deviation (RSD) of measurements. Two chromatographic systems (HPLC and IC) were examined. The measurement RSD is estimated by the FUMI theory, without replicate measurements. If the sampling interval is too long, the RSD is unnecessarily large because of poor peak recognition; excessively short intervals, on the other hand, do not improve the precision appreciably. As a rule of thumb, we show that 30-50 data points over the signal domain of a peak are appropriate for obtaining reasonable precision from the digital data.
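As an illustration of this rule of thumb, the following minimal Python sketch (not from the paper; the 6-sigma Gaussian peak-width definition and the function name are assumptions) converts an estimated peak width into a sampling interval that yields a chosen number of points over the peak's signal domain.

```python
# Hypothetical sketch: derive a sampling interval from the abstract's
# rule of thumb of 30-50 data points across a peak's signal domain.
# The signal domain is assumed here to span +/-3 sigma of a Gaussian peak.

def sampling_interval(peak_sigma_s: float, points: int = 40) -> float:
    """Return the sampling interval (s) that places `points` samples
    over an assumed 6-sigma signal domain of a Gaussian peak."""
    domain_width_s = 6.0 * peak_sigma_s  # assumed width of the signal domain
    return domain_width_s / points

# Example: a peak with sigma = 2 s spans about 12 s;
# 40 points over that domain gives a 0.3 s sampling interval.
print(f"{sampling_interval(2.0, points=40):.2f} s")
```

Intervals much longer than this estimate would under-sample the peak (poor peak recognition, larger RSD), while much shorter intervals would add data points without a corresponding gain in precision.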
Citation
Yomota, C., Tagashira, Y., Katsumine, M., Iwaki, K., Matsuda, R., & Hayashi, Y. (2000). Study of appropriate sampling intervals of digital data in chromatography. Bunseki Kagaku, 49(4), 225–231. https://doi.org/10.2116/bunsekikagaku.49.225