Upper and lower conditional probabilities are defined by Hausdorff outer and inner measures when the conditioning events have positive and finite Hausdorff outer and inner measures in their dimension; otherwise, when the conditioning events have Hausdorff outer or inner measure equal to zero or infinity in their dimension, they are defined by a 0-1-valued finitely, but not countably, additive probability. Examples are given when the σ-field of the conditioning events is the σ-field of countable and co-countable subsets of [0, 1], the tail σ-field, or the σ-field of symmetric events. The definitions of s-independence and s-irrelevance with respect to these upper and lower conditional probabilities are introduced so that logical independence is a necessary condition for stochastic independence. It is also proved that s-irrelevance is a sufficient condition for the strong independence introduced for credal sets; an example shows that the converse is not true. The definitions of s-independence and strong independence are equivalent when all subsets of the sample space Ω have the same Hausdorff dimension, as happens when Ω is a finite set. The definition of s-conditional irrelevance is given, and a generalized factorization property is proposed as a necessary condition for s-conditional irrelevance. Examples show that s-conditional irrelevance and s-irrelevance are not related; moreover, sufficient conditions are given for the equivalence of s-conditional irrelevance and s-irrelevance. © 2007 Elsevier Inc. All rights reserved.
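As an illustrative sketch of the finite-sample-space case mentioned in the abstract, where all subsets have the same Hausdorff dimension and s-independence reduces to strong independence of credal sets: strong independence means the joint credal set is the convex hull of all products of extreme points of the marginal credal sets. The Python example below is not from the paper; the credal sets, outcome labels, and numbers are invented for illustration.

```python
from itertools import product

def product_joint(p, q):
    """Joint distribution obtained from independent marginals p and q."""
    return {(x, y): px * qy for (x, px), (y, qy) in product(p.items(), q.items())}

# Two hypothetical marginal credal sets, each described by its extreme points.
K_X = [{"a": 0.4, "b": 0.6}, {"a": 0.7, "b": 0.3}]
K_Y = [{"c": 0.5, "d": 0.5}, {"c": 0.2, "d": 0.8}]

# Extreme points of the strongly independent joint credal set: all products
# of extreme points of the two marginal credal sets.
joint_extremes = [product_joint(p, q) for p in K_X for q in K_Y]

# Each extreme joint factorizes: P(x, y) = P(x) * P(y), where the marginals
# are recovered by summing the joint over the other coordinate.
for j in joint_extremes:
    px = {x: sum(v for (xx, _), v in j.items() if xx == x) for x in ("a", "b")}
    qy = {y: sum(v for (_, yy), v in j.items() if yy == y) for y in ("c", "d")}
    assert all(abs(j[(x, y)] - px[x] * qy[y]) < 1e-12 for x in px for y in qy)
```

The convex hull of these four extreme joints is the strongly independent joint credal set; upper and lower probabilities of an event are obtained by maximizing and minimizing over it.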