Signal detection theory offers several indexes of sensitivity (d′, Az, and A′) that are appropriate for two-choice discrimination when the data consist of one hit rate and one false alarm rate per condition. These measures require simplifying assumptions about how target and lure evidence is distributed. We examine three statistical properties of these indexes: accuracy (good agreement between the parameter and the mean of the sampling distribution), precision (small variance of the sampling distribution), and robustness (small influence of violated assumptions on accuracy). We draw several conclusions from the results. First, a variety of parameters (sample size, degree of discriminability, and the magnitudes of the hit and false alarm rates) influence statistical bias in these indexes; comparing conditions that differ in these parameters introduces discrepancies that can be reduced by increasing N. Second, unequal variance of the evidence distributions produces significant bias that cannot be reduced by increasing N, a serious drawback to the use of these sensitivity indexes when the variance structure is unknown. Finally, their relative statistical performance suggests that Az is preferable to A′.
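For concreteness, all three indexes can be computed from a single hit rate H and false alarm rate F using formulas that are standard in the signal detection literature (the abstract itself does not spell them out). The Python sketch below assumes the equal-variance Gaussian model for d′ and Az, and the Pollack and Norman (1964) formula for A′; the function names are illustrative.

    from scipy.stats import norm

    def d_prime(h, f):
        # d' = z(H) - z(F), assuming equal-variance Gaussian evidence
        # distributions; z is the inverse of the standard normal CDF.
        return norm.ppf(h) - norm.ppf(f)

    def a_z(h, f):
        # Az = Phi(d' / sqrt(2)): the area under the ROC implied by the
        # equal-variance Gaussian model for a single (F, H) point.
        return norm.cdf(d_prime(h, f) / 2 ** 0.5)

    def a_prime(h, f):
        # A' (Pollack & Norman, 1964) for the usual case H >= F; an
        # area-like estimate built from the single (F, H) point.
        return 0.5 + (h - f) * (1 + h - f) / (4 * h * (1 - f))

    # Example: one condition with H = .75 and F = .25
    print(d_prime(0.75, 0.25))  # ~1.349
    print(a_z(0.75, 0.25))      # ~0.830
    print(a_prime(0.75, 0.25))  # ~0.833

Note that z(H) and z(F) are undefined at rates of exactly 0 or 1, so extreme rates are typically corrected before these formulas are applied, which is one source of the small-N bias the paper examines.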
CITATION STYLE
Verde, M. F., Macmillan, N. A., & Rotello, C. M. (2006). Measures of sensitivity based on a single hit rate and false alarm rate: The accuracy, precision, and robustness of d′, Az, and A′. Perception & Psychophysics, 68(4), 643–654. https://doi.org/10.3758/BF03208765