Purpose: Recent evidence suggests that increasing perimetric contrast all the way to 0 dB may not be clinically useful. This study examines whether raising the floor for point-wise sensitivities affects the ability of global indices to detect change.

Methods: Longitudinal data from eyes with progressive glaucoma were used. Point-wise sensitivities were censored at various cutoffs (12–19 dB). At each cutoff, mean deviation (MD) was recalculated from the censored sensitivities, termed censored mean deviation (CMD). Both MD and CMD were fitted with a linear model over time. The rates of change of MD and CMD (signal) and the standard deviations (SDs) of the residuals (noise) were obtained from the fitted models. The linear signal-to-noise ratios (LSNRs) for MD (LSNR_MD) and CMD (LSNR_CMD) were compared. Additionally, at each cutoff, the ratio of LSNR_CMD to LSNR_MD was calculated and tested.

Results: CMD provided significantly (P < 0.05) better LSNR than MD at every point-wise sensitivity cutoff from 15 to 19 dB for progressing eyes. Moreover, the ratios of LSNR_CMD to LSNR_MD were significantly (P < 0.05) greater than 1 at all cutoffs from 15 to 19 dB.

Conclusion: This study demonstrates that censoring is an effective tool for reducing variability at low sensitivities in progressing eyes.

Translational Relevance: These findings suggest that 15–19 dB could be a more suitable endpoint for perimetric testing algorithms.
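The linear signal-to-noise ratio used above is simply the rate of change from a linear fit over time (signal) divided by the SD of the residuals about that fit (noise). As a minimal sketch of the pipeline, censoring followed by index recalculation and LSNR, the Python below uses an unweighted mean deviation in place of the true age-corrected, weighted MD; the function names, the 17 dB cutoff, and the synthetic visit data are illustrative assumptions, not the authors' implementation or the study's data.

    import numpy as np

    def censor(sensitivities_db, cutoff_db):
        # Floor point-wise sensitivities: values below the cutoff are
        # raised to the cutoff; higher values are left unchanged.
        return np.maximum(sensitivities_db, cutoff_db)

    def md_like(sensitivities_db, normal_db):
        # Simplified MD-style index: plain mean deviation from assumed
        # per-point normative sensitivities (the real MD is a weighted,
        # age-corrected average; this stand-in is for illustration only).
        return np.mean(sensitivities_db - normal_db)

    def lsnr(times_years, index_values):
        # Linear signal-to-noise ratio: OLS slope over time (signal,
        # negative for progressing eyes) divided by the SD of the
        # residuals about the fit (noise).
        slope, intercept = np.polyfit(times_years, index_values, 1)
        residuals = index_values - (slope * times_years + intercept)
        return slope / np.std(residuals, ddof=2)  # ddof=2: two fitted parameters

    # Illustrative usage on synthetic data (one eye, 8 six-monthly visits,
    # 5 locations, progressing at -1.5 dB/y with 3 dB test-retest noise).
    rng = np.random.default_rng(0)
    times = np.arange(8) * 0.5
    normal = np.full(5, 30.0)  # assumed normative sensitivities
    series = 25 - 1.5 * times[:, None] + rng.normal(0, 3, (8, 5))

    md = np.array([md_like(v, normal) for v in series])
    cmd = np.array([md_like(censor(v, 17.0), normal) for v in series])
    print(lsnr(times, md), lsnr(times, cmd))

In the study itself, the analogous quantities would be computed per eye, and the ratio LSNR_CMD/LSNR_MD would then be tested across the cohort, as described in the Methods.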
Citation: Pathak, M., Demirel, S., & Gardiner, S. K. (2017). Reducing variability of perimetric global indices from eyes with progressive glaucoma by censoring unreliable sensitivity data. Translational Vision Science & Technology, 6(4), 11. https://doi.org/10.1167/tvst.6.4.11