Damage-Risk Criteria for Line Spectra

Ward, W. D.

Abstract

Damage-risk criteria in current use assume that pure tones are 10 db more dangerous than broad-band noise. However, Thompson and Gales [J. Acoust. Soc. Am. 33, 1593 (1961)] have recently shown that 5-min exposures at 110-db sound-pressure level (SPL) produced the same temporary threshold shift (TTS) whether the TTS-producing agent was a pure tone or a band of noise of any width up to an octave; this was true for stimuli centered at 500 and at 3200 cps. In an attempt to extend the generality of their findings, the present study compared the effects of short exposures to (1) a 1200- to 2400-cps band of noise and (2) a 1650-cps pure tone, at intensities up to 125-db SPL. The results indicate that at 115-db SPL and above the pure tone will indeed produce more TTS than the broad-band noise. However, one cannot say just how much more dangerous the line spectra are, because the two types of stimuli shift different test frequencies by different amounts (and by different ratios). Certainly a single decibel correction applied at all levels and to any frequency range is a gross oversimplification. (This research was supported by grants from the National Institute of Neurological Diseases and Blindness.)

Citation (APA)

Ward, W. D. (1962). Damage-Risk Criteria for Line Spectra. The Journal of the Acoustical Society of America, 34(5_Supplement), 720–720. https://doi.org/10.1121/1.1937184
