Effect of a stabilizing solute gradient on the onset of thermal convection

Abstract

This study considers the effectiveness of a stabilizing solute concentration gradient in delaying the onset of convection in a horizontal fluid layer subjected to transient cooling from the top. Linear amplification theory, which has been successful in predicting the onset time in layers with a uniform initial distribution, is used. Because the initial stable density stratification can cause the magnitude of the introduced disturbances to decay to zero, the disturbances are reintroduced and the process is repeated until growth is obtained. Allowance is made for molecular diffusion of the solute, so that its concentration distribution is time dependent. Quantitative results are obtained for the onset time in a system involving double diffusion of heat and salt in water, and the effect of the layer thickness on the dimensional onset time is examined. The results show that when the onset time is significantly delayed, the dimensional onset time depends on the fluid depth. © 1984 American Institute of Physics.
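
The paper's method integrates the linearized disturbance equations over the evolving base state until the reintroduced disturbances grow; as a much simpler illustration of the competing effects described above, the sketch below (Python with NumPy) estimates how a stabilizing salt gradient delays, and for steep enough gradients suppresses, onset under transient top cooling. All property values, the step-cooling magnitude dT, the salt gradients, and the critical boundary-layer Rayleigh number Ra_crit are assumptions, and the deep-layer boundary-layer proxy is not the paper's amplification calculation.

    import numpy as np

    # Approximate properties of salt water near room temperature (assumed values).
    g       = 9.81      # gravitational acceleration, m/s^2
    nu      = 1.0e-6    # kinematic viscosity, m^2/s
    kappa   = 1.4e-7    # thermal diffusivity, m^2/s
    beta_T  = 2.0e-4    # thermal expansion coefficient, 1/K
    beta_S  = 7.6e-4    # haline contraction coefficient, 1/(g/kg)

    dT      = 0.1       # step cooling applied at the top surface, K (assumed)
    Ra_crit = 1000.0    # assumed critical boundary-layer Rayleigh number

    def effective_rayleigh(t, dS_dz):
        """Proxy Rayleigh number of the cooled top boundary layer at time t (s).

        The thermal boundary layer deepens diffusively, delta ~ sqrt(pi*kappa*t);
        the destabilizing thermal density contrast beta_T*dT is offset by the
        stabilizing salt contrast beta_S*dS_dz*delta across that depth.
        """
        delta = np.sqrt(np.pi * kappa * t)
        net_buoyancy = beta_T * dT - beta_S * dS_dz * delta
        return g * net_buoyancy * delta**3 / (nu * kappa)

    def onset_time(dS_dz, t_max=3600.0, n=200_000):
        """First time the proxy Rayleigh number exceeds Ra_crit, or None."""
        t = np.linspace(1.0, t_max, n)
        above = np.nonzero(effective_rayleigh(t, dS_dz) > Ra_crit)[0]
        return t[above[0]] if above.size else None

    if __name__ == "__main__":
        for grad in (0.0, 1.0, 2.0):   # salt gradient, (g/kg) per metre
            t_on = onset_time(grad)
            result = f"{t_on:7.1f} s" if t_on is not None else "no onset within 1 h"
            print(f"dS/dz = {grad:3.1f} (g/kg)/m  ->  onset: {result}")

Because this proxy treats the layer as effectively deep, it cannot reproduce the depth dependence of the onset time that the abstract reports for strongly delayed cases; capturing that requires the full amplification calculation for a finite layer.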

Citation

Kaviany, M. (1984). Effect of a stabilizing solute gradient on the onset of thermal convection. Physics of Fluids, 27(5), 1108–1113. https://doi.org/10.1063/1.864757
