Conditional Plausibility and Probabilistic Update #
@cite{halpern-2003} axiomatizes conditional plausibility measures, generalizing Bayesian conditioning, Popper spaces, Jeffrey's rule (@cite{jeffrey-1965}), and imaging under a single algebraic framework (Cond1–Cond4).
This file formalizes:
- Conditional probability measures (Popper space axioms P1–P4)
- The ratio construction (Halpern Thm 3.3.1): FinAddMeasure → CondMeasure
- Jeffrey's rule: update under uncertain evidence
- Conditioning modes: classifying the update operations across linglib
Conditioning Unification #
Four conditioning operations in linglib are instances of conditional plausibility:
| Module | Operation | Mode |
|---|---|---|
| EpistemicScale/Conditional | condMu A B | Bayesian (ratio) |
| BayesianSemantics | FinitePMF.prob | Bayesian (marginalization) |
| Dynamic/Core/Update | InfoState.update s φ | Eliminative |
| SDS/MeasureTheory | (placeholder) | Continuous Bayesian |
The eliminative mode is the special case where P(A|B) ∈ {0, 1}: each world either survives or is eliminated.
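The eliminative case can be made concrete with a minimal sketch (the definition below is illustrative; linglib's InfoState.update may be defined differently):

```lean
-- Sketch: an information state is a set of live worlds, and update
-- intersects it with the worlds where the evidence φ holds.
-- Each world either survives (weight 1) or is eliminated (weight 0),
-- which is exactly the P(A|B) ∈ {0, 1} special case.
def updateSketch {W : Type} (s : Set W) (φ : W → Prop) : Set W :=
  {w | w ∈ s ∧ φ w}
```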
Classification of conditioning modes used across linglib.
- eliminative: keep only worlds satisfying evidence. The resulting measure is 0/1-valued. (Dynamic/Core/Update.lean)
- bayesian: P(A|B) = P(A ∩ B)/P(B). Standard ratio conditioning. (BayesianSemantics.lean, this file)
- jeffrey: update with uncertain evidence over a partition. Generalizes Bayesian: P'(A) = Σᵢ P(A|Eᵢ) · q(Eᵢ).
- eliminative : ConditioningMode
- bayesian : ConditioningMode
- jeffrey : ConditioningMode
- ranking : ConditioningMode
Instances For
Equations
- Core.Scale.instBEqConditioningMode.beq x✝ y✝ = (x✝.ctorIdx == y✝.ctorIdx)
Instances For
A conditional probability measure: P(A | B) axiomatized directly.
Extends FinAddMeasure with a conditional component satisfying
Popper's axioms (@cite{halpern-2003}, Ch. 3, Cond1–Cond4). A set B is
normal if P(B|B) ≠ 0; for normal B, P(B|B) = 1. The only
abnormal set (for finite W with positive singletons) is ∅.
Conditional measure: condMu A B = P(A | B)
P1: non-negativity
P2: normalization — P(B|B) = 1 for normal B
P3: conditional additivity
- cond_additive (A₁ A₂ B : Set W) : (∀ x ∈ A₁, x ∉ A₂) → self.condMu (A₁ ∪ A₂) B = self.condMu A₁ B + self.condMu A₂ B
P4: chain rule — P(A ∩ B | C) = P(A | B ∩ C) · P(B | C)
Unconditional connection: P(A | Ω) = μ(A)
Instances For
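Assembled from the field descriptions above, a skeletal version of the structure might look as follows (field names and types are a sketch, not linglib's exact definition):

```lean
-- Skeletal conditional measure satisfying Popper's axioms P1–P4.
structure CondMeasureSketch (W : Type) where
  mu : Set W → ℚ                 -- underlying unconditional measure
  condMu : Set W → Set W → ℚ     -- condMu A B = P(A | B)
  -- P1: non-negativity
  cond_nonneg : ∀ A B, 0 ≤ condMu A B
  -- P2: normalization for normal B, i.e. those with P(B|B) ≠ 0
  cond_norm : ∀ B, condMu B B ≠ 0 → condMu B B = 1
  -- P3: conditional additivity on disjoint sets
  cond_additive : ∀ A₁ A₂ B, (∀ x ∈ A₁, x ∉ A₂) →
    condMu (A₁ ∪ A₂) B = condMu A₁ B + condMu A₂ B
  -- P4: chain rule — P(A ∩ B | C) = P(A | B ∩ C) · P(B | C)
  cond_chain : ∀ A B C, condMu (A ∩ B) C = condMu A (B ∩ C) * condMu B C
  -- Connection to the unconditional measure: P(A | Ω) = μ(A)
  cond_univ : ∀ A, condMu A Set.univ = mu A
```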
Posterior measure given evidence B: P_B(A) := P(A|B).
Instances For
Bayes' theorem: P(A|B) · P(B) = P(B|A) · P(A). Follows from the chain rule applied in two directions: P(A ∩ B | Ω) = P(A | B) · P(B | Ω) = P(B | A) · P(A | Ω).
Construct conditional probability via the ratio P(A|B) = μ(A ∩ B)/μ(B).
@cite{halpern-2003}, Theorem 3.3.1: every finitely additive measure extends to a conditional measure satisfying P1–P4 via this construction. When μ(B) = 0, P(A|B) = 0 (B is "abnormal" in Popper's sense).
In Lean's ℚ arithmetic, division by zero yields 0, so the abnormal case is handled automatically.
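A minimal sketch of the construction (names illustrative, not linglib's API):

```lean
-- Ratio conditioning: P(A|B) = μ(A ∩ B) / μ(B).
-- In Lean, x / 0 = 0 for ℚ, so an abnormal B with μ(B) = 0
-- yields condMu A B = 0 with no extra case split.
def ratioCondMu {W : Type} (μ : Set W → ℚ) (A B : Set W) : ℚ :=
  μ (A ∩ B) / μ B
```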
Equations
Instances For
An evidence partition: a finite collection of mutually exclusive, exhaustive propositions with new probability assignments.
The partition cells
New probability for each cell
Cells and weights are aligned
Weights are non-negative
Weights sum to 1
Instances For
Jeffrey's rule: update a conditional measure with uncertain evidence.
Given a partition {E₁,..., Eₙ} with new probabilities q₁,..., qₙ: P'(A) = Σᵢ P(A | Eᵢ) · qᵢ
This generalizes Bayesian conditioning: standard conditioning on E is the special case where qₑ = 1 and all other qᵢ = 0.
@cite{jeffrey-1965}, The Logic of Decision; @cite{halpern-2003} §3.4.
Equations
Instances For
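A sketch of the update as a weighted sum over partition cells (list-based; linglib's EvidencePartition representation may differ):

```lean
-- Jeffrey's rule: P'(A) = Σᵢ P(A | Eᵢ) · qᵢ, where the qᵢ are the
-- new probabilities assigned to the partition cells Eᵢ.
def jeffreySketch {W : Type} (condMu : Set W → Set W → ℚ)
    (cells : List (Set W)) (q : List ℚ) (A : Set W) : ℚ :=
  ((cells.zip q).map (fun p => condMu A p.1 * p.2)).sum
```

With the point partition {E, Eᶜ} and weights (1, 0), the sum collapses to condMu A E · 1 + condMu A Eᶜ · 0 = condMu A E, i.e. ordinary Bayesian conditioning.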
Bayesian conditioning is Jeffrey conditioning with a point partition.
A conditional measure induces a conditional epistemic comparison: A ≿_B C iff P(A|B) ≥ P(C|B). This conditional comparison satisfies reflexivity and monotonicity (System W axioms) for each fixed B.
A conditional measure extends to a conditional comparison on propositions: A is conditionally at least as likely as C given B iff P(A|B) ≥ P(C|B). This is the conditional version of FinAddMeasure.inducedGe.
Equations
- m.inducedCondGe A C B = (m.condMu A B ≥ m.condMu C B)
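Reflexivity of the induced comparison is immediate from reflexivity of ≥ on ℚ; a standalone sketch (definitions illustrative, not linglib's):

```lean
def inducedCondGeSketch {W : Type} (condMu : Set W → Set W → ℚ)
    (A C B : Set W) : Prop :=
  condMu A B ≥ condMu C B

-- A ≿_B A for every A and B: reflexivity for each fixed evidence B.
theorem inducedCondGeSketch_refl {W : Type} (condMu : Set W → Set W → ℚ)
    (A B : Set W) : inducedCondGeSketch condMu A A B :=
  le_refl _
```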