Nonlinear theory of magnetic fluctuations in random flow: the Hall effect


  • A nonlinear theory of magnetic fluctuations excited by a random flow of conducting fluid is developed. The mechanism of amplification of magnetic fluctuations at zero mean field, proposed by Zeldovich, is incorporated into the theory through a nonlinear equation derived from the induction equation; the nonlinearity is associated with the Hall effect. To derive the nonlinear equation we use a method [S. A. Molchanov, A. A. Ruzmaikin, and D. D. Sokoloff, Sov. Phys. Usp. 28, 307 (1985)] whose main idea is to replace magnetic diffusion by a Wiener process: the diffusive motion is described by an average over an ensemble of random Wiener trajectories. The resulting nonlinear equation governs the evolution of the correlation function of the magnetic field and resembles the Schrödinger equation, except that the mass is variable and the time-derivative term lacks the imaginary unit. The local spatial distribution of the magnetic field is intermittent: the field is concentrated in flux tubes separated by regions of weak field. In the limit of large Reynolds number the formulation is amenable to a modified WKB treatment. The general properties of the nonlinear stationary asymptotic solution are confirmed by numerical solution. The results are of interest for the ionosphere of Venus.
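The Wiener-trajectory representation of diffusion mentioned in the abstract can be sketched for the simplest possible case, the scalar heat equation. This is a minimal Feynman-Kac illustration, not the paper's equation for the magnetic correlation function; the diffusivity, initial condition, and sample count below are hypothetical choices for the sketch:

```python
import numpy as np

# Sketch of the Wiener-process representation of diffusion (Feynman-Kac).
# The heat equation u_t = D u_xx has the probabilistic solution
#   u(x, t) = E[ u0(x + sqrt(2 D t) Z) ],   Z ~ N(0, 1),
# i.e. diffusion is replaced by an average over the endpoints of random
# Wiener trajectories started at x.

rng = np.random.default_rng(0)

D = 0.5   # diffusivity (illustrative value)
t = 1.0   # evolution time
s = 1.0   # width of the Gaussian initial condition
x = 0.7   # point where the solution is evaluated


def u0(y):
    """Gaussian initial condition u0(y) = exp(-y^2 / (2 s^2))."""
    return np.exp(-y**2 / (2 * s**2))


# Monte Carlo average over an ensemble of Wiener endpoints
n = 200_000
endpoints = x + np.sqrt(2 * D * t) * rng.standard_normal(n)
u_mc = u0(endpoints).mean()

# Exact heat-equation solution for this Gaussian initial condition:
# the variance grows as s^2 + 2 D t and the amplitude shrinks accordingly.
var = s**2 + 2 * D * t
u_exact = s / np.sqrt(var) * np.exp(-x**2 / (2 * var))

print(u_mc, u_exact)
```

The ensemble average converges to the exact diffusive solution as the number of trajectories grows; the paper applies the same idea to the induction equation, where the averaged quantity is the magnetic-field correlation function rather than a scalar.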

publication date

  • July 1, 1994