To study the effect of calibration set selection on the quantitative determination of maize test weight by near-infrared spectroscopy, 584 maize samples were collected and scanned for near-infrared spectral data. Test weight was measured following the standard GB 1353-2009, yielding sample test weights of 693–732 g·L−1. Two calibration models were built using partial least squares regression, each based on a different calibration set: the test weights of the two calibration sets followed a normal distribution and a homogeneous (uniform) distribution, respectively. Both quantitative models were selected by root mean square error of cross-validation (RMSECV) and evaluated on a validation set. Results show that the RMSECV of the model based on the normally distributed calibration set is 4.28 g·L−1, while that of the model based on the homogeneously distributed calibration set is 2.99 g·L−1; the predictions of the two models differ significantly for samples with high or low test weight.
Jia, L., Jiao, P., Zhang, J., Zeng, Z., & Jiang, X. (2019). Effect of calibration set selection on quantitatively determining test weight of maize by near-infrared spectroscopy. In IFIP Advances in Information and Communication Technology (Vol. 509, pp. 481–488). Springer New York LLC. https://doi.org/10.1007/978-3-030-06155-5_49