Over the past 10 years, water calorimetry has been widely adopted by national metrology institutes as a primary reference standard for measuring absorbed dose to water in clinical radiotherapy beams. The first such instrument was built at NIST in the early 1990s for use in ⁶⁰Co beams, and its domain of application has since been extended to high-energy X-ray and electron beams. A second-generation water calorimeter has been designed and built for NIST to meet new challenges with the aid of newer and more sophisticated electronics and computing tools, though heat transfer artifacts inherent to calorimeters of this design still persist. Such artifacts are conventionally corrected by means of computer simulations of the calorimeter response to a heat input whose spatial and temporal characteristics approximate the experimental conditions. The correction factors thus obtained exhibit a complicated time dependence that is sensitive to the initial temperature distribution. Experimental runs therefore typically incorporate long equilibration intervals, during which the phantom is stirred, in order to remove thermal gradients. In the present work, we explore the feasibility of using frequency-domain techniques to measure the stationary spectral characteristics of the desired signal and of the heat transport effects. By modulating the radiation beam and employing Fourier analysis, we obtain a system transfer function that accounts for the systematic bias at all irradiation time intervals. The system transfer function is investigated with a one-dimensional analytical model, which produces exact agreement with finite-element simulation; the latter, extended to a two-dimensional model, is found to be in qualitative agreement with experimental data. These results suggest that heat-conduction artifacts may be corrected parametrically for arbitrary irradiation conditions.
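The frequency-domain idea described above can be illustrated with a minimal numerical sketch. All parameters here (sampling rate, modulation frequency, thermal time constant) are hypothetical stand-ins, not values from the paper: a square-wave beam modulation drives a first-order thermal lag that plays the role of conductive heat transport, and the transfer function H(f) = Y(f)/X(f) is estimated by Fourier analysis of the input and response records.

```python
import numpy as np

# Assumed parameters (illustrative only, not from the paper):
fs = 100.0                       # sampling rate, Hz
f_mod = 0.1                      # beam modulation frequency, Hz
tau = 2.0                        # stand-in thermal time constant, s
t = np.arange(0, 60, 1.0 / fs)   # 60 s record = 6 full modulation periods

# Modulated beam: on/off square wave (dose-rate proxy).
x = 0.5 * (1 + np.sign(np.sin(2 * np.pi * f_mod * t)))

# Simulated calorimeter response: discrete first-order low-pass,
# mimicking heat conduction smoothing the radiation-induced signal.
alpha = (1.0 / fs) / (tau + 1.0 / fs)
y = np.empty_like(x)
y[0] = 0.0
for i in range(1, len(x)):
    y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])

# Fourier analysis: transfer function at the excited frequency bins.
X = np.fft.rfft(x)
Y = np.fft.rfft(y)
f = np.fft.rfftfreq(len(x), 1.0 / fs)
mask = np.abs(X) > 1e-6 * np.abs(X).max()   # skip near-empty bins
H = np.where(mask, Y / np.where(mask, X, 1.0), 0.0)

# Gain at the fundamental modulation frequency vs. the analytic
# first-order value 1/sqrt(1 + (2*pi*f*tau)^2).
k = np.argmin(np.abs(f - f_mod))
gain = np.abs(H[k])
expected = 1.0 / np.sqrt(1.0 + (2 * np.pi * f_mod * tau) ** 2)
print(gain, expected)
```

In this toy setting the Fourier-estimated gain at the modulation frequency closely tracks the analytic first-order response, which is the sense in which a measured transfer function can parametrize the heat-conduction bias for arbitrary irradiation timing.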