Abstract
We consider the evolution of magnetic fields under the influence of Hall drift and Ohmic decay. The governing equation is solved numerically in a spherical shell with radius ratio r_i/r_o = 0.75. Starting with simple free-decay modes as initial conditions, we then consider the subsequent evolution. The Hall effect induces so-called helicoidal oscillations, in which energy is redistributed among the different modes. We find that the amplitude of these oscillations can be quite substantial, with some of the higher harmonics becoming comparable with the original field. Nevertheless, this transfer of energy to the higher harmonics is not sufficient to significantly accelerate the decay of the original field, at least not at the R_B = O(100) parameter values accessible to us, where this Hall parameter R_B measures the ratio of the Ohmic time-scale to the Hall time-scale. We do find clear evidence, though, of increasingly fine structures developing for increasingly large R_B, suggesting that this Hall-induced cascade to ever-shorter length-scales may eventually become sufficiently vigorous to enhance the decay of the original field. Finally, the implications for the evolution of neutron star magnetic fields are discussed.
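The governing equation itself is not reproduced in the abstract. As a sketch, the standard nondimensionalized induction equation combining Hall drift with Ohmic decay, with the Hall parameter R_B defined as above, takes the form:

```latex
% Sketch of the standard nondimensionalized Hall induction equation;
% R_B is the Hall parameter (ratio of Ohmic to Hall time-scale).
\frac{\partial \mathbf{B}}{\partial t}
  = -R_B \, \nabla \times \bigl[ (\nabla \times \mathbf{B}) \times \mathbf{B} \bigr]
  + \nabla^{2} \mathbf{B}
```

The first term on the right is the (nonlinear, energy-conserving) Hall drift, which redistributes energy among modes; the second is Ohmic diffusion, which alone yields the free-decay modes used as initial conditions.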
Hollerbach, R., & Rüdiger, G. (2002). The influence of Hall drift on the magnetic fields of neutron stars. Monthly Notices of the Royal Astronomical Society, 337(1), 216–224. https://doi.org/10.1046/j.1365-8711.2002.05905.x