
Linglib.Phenomena.Causation.Studies.KonukEtAl2026

@cite{konuk-et-al-2026}: Plural Causes #


Formalizes Konuk, Quillien & Mascarenhas (2026), "Plural causes", Open Mind.

Core Contributions #

  1. Compound causes: A∧B is treated as a single compound binary variable for causal selection, not decomposed into individual contributions.
  2. Necessity-Sufficiency Model (NSM): NSM(C) = P(C)·Suf(C) + (1-P(C))·Nec(C) from @cite{icard-et-al-2017}, applied to compound causes.
  3. Anti-linearity: NSM(INT∧HIGH) > NSM(LOW∧INT) even though LOW and HIGH have comparable individual causal strength (Experiment 1).
  4. Homogeneous loss: Loss judgments follow LOSS_strong = ¬A∧¬B∧¬C∧¬D, not classical ¬((A∧B)∨(C∧D)) (Experiment 2), mixed with classical via fitted parameter w ≈ 0.77.
  5. Crossing avoidance: Within-disjunct plural causes (A∧B) preferred over cross-disjunct (A∧C) when the rule is (A∧B)∨(C∧D) (Experiment 2, Overdetermined Positive round).

Bridges #

| Concept | Connects to | Module |
|---|---|---|
| Compound sufficiency/necessity | causallySufficient / causallyNecessary | Core.StructuralEquationModel |
| NSM (Nec/Suf weighting) | nsm | Causation.CausalSelection (@cite{icard-et-al-2017}) |
| LOSS_strong (all absent) | noneSatisfy | Plural.Distributivity (@cite{kriz-spector-2021}) |
| Compound sufficiency | allSatisfy | Plural.Distributivity (@cite{kriz-spector-2021}) |
| Loss gap (classical − strong) | inGap | Plural.Distributivity (@cite{kriz-spector-2021}) |
| CausalLaw.conjunctive | Threshold/disjunctive models | Core.StructuralEquationModel |
| Crossing avoidance | structural sufficiency gap | Core.StructuralEquationModel |

§ 1. Compound Sufficiency and Necessity #

Extend the SEM's individual-variable causallySufficient/causallyNecessary to compound (plural) causes. A compound cause C = {v₁,...,vₙ} is sufficient iff setting all components to true produces the effect, and necessary iff setting all to false prevents it.

A compound cause is sufficient iff setting all its variables to true produces the effect under normal development.


A compound cause is necessary iff setting all its variables to false prevents the effect under normal development.


Singleton compound necessity reduces to the simple but-for test (set the variable to false, check the effect).

Note: compoundNecessary uses the @cite{nadathur-lauer-2020} but-for test, while causallyNecessary uses @cite{nadathur-2024} Definition 10b (with precondition + achievability + supersituation quantification). The two coincide when the Def 10b precondition passes and the cause is exogenous, but diverge in general.
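The two tests can be sketched over a toy Boolean model. This is a minimal illustration with hypothetical names (`overrideWith`, `compoundSufficient`, `compoundNecessary`, `win`): the library's actual definitions work over structural equation models with normal development, not bare Boolean functions.

```lean
-- Toy model: the effect is a Boolean function of three variables;
-- a compound cause is a list of variable indices.
def overrideWith (actual : Fin 3 → Bool) (c : List (Fin 3)) (b : Bool) :
    Fin 3 → Bool :=
  fun i => if i ∈ c then b else actual i

-- Sufficient: setting every component true produces the effect.
def compoundSufficient (effect : (Fin 3 → Bool) → Bool)
    (actual : Fin 3 → Bool) (c : List (Fin 3)) : Bool :=
  effect (overrideWith actual c true)

-- Necessary: setting every component false prevents the effect.
def compoundNecessary (effect : (Fin 3 → Bool) → Bool)
    (actual : Fin 3 → Bool) (c : List (Fin 3)) : Bool :=
  !effect (overrideWith actual c false)

-- Threshold-≥-2 effect: any two of the three variables suffice.
def win (v : Fin 3 → Bool) : Bool :=
  (v 0 && v 1) || (v 0 && v 2) || (v 1 && v 2)

-- {0,1} is sufficient from the all-off world and necessary from the
-- all-on world; the singleton {0} is not necessary (pair {1,2} still wins).
example : compoundSufficient win (fun _ => false) [0, 1] = true := by decide
example : compoundNecessary win (fun _ => true) [0, 1] = true := by decide
example : compoundNecessary win (fun _ => true) [0] = false := by decide
```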

§ 1b. Bridge: Compound Sufficiency = allSatisfy #

A compound cause {v₁,...,vₙ} is sufficient iff all its constituent variables being present suffices for the effect. This is exactly allSatisfy from @cite{kriz-spector-2021}: plural predication where every atom satisfies the predicate "is causally active."

Compound sufficiency over a Fin-indexed variable set is equivalent to allSatisfy applied to the "is present" predicate.

This connects causal cognition to plural semantics: judging a compound cause as sufficient = judging that the plurality "all satisfy" the causal activation predicate.

§ 2. The Necessity-Sufficiency Model (NSM) #

The general NSM from @cite{icard-et-al-2017}: NSM(C) = P(C)·Suf(C) + (1-P(C))·Nec(C). Imported from Semantics.Causation.CausalSelection.nsm.
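The weighting can be sketched with a local stand-in (assuming Mathlib's ℚ; `nsm` here is hypothetical, not the library's Semantics.Causation.CausalSelection.nsm):

```lean
import Mathlib.Tactic

-- Local sketch of the Icard et al. (2017) weighting: a cause's strength
-- blends sufficiency (weighted by the cause's probability) with
-- necessity (weighted by its improbability).
def nsm (p suf nec : ℚ) : ℚ := p * suf + (1 - p) * nec

-- A deterministically sufficient cause (Suf = 1) scores p + (1 - p)·Nec.
example (p nec : ℚ) : nsm p 1 nec = p + (1 - p) * nec := by unfold nsm; ring

-- A certain cause (p = 1) is scored purely by its sufficiency,
-- an impossible one (p = 0) purely by its necessity.
example (s n : ℚ) : nsm 1 s n = s := by unfold nsm; ring
example (s n : ℚ) : nsm 0 s n = n := by unfold nsm; ring
```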

§ 3. Experiment 1: Threshold Game #

Three urns, LOW (p=1/20), INTERMEDIATE (p=1/2), and HIGH (p=19/20), with rule WIN := sum ≥ 2. The player draws from all three and wins.

SEM Encoding #

The threshold ≥ 2 rule is encoded as three conjunctive laws: A∧B→WIN, A∧C→WIN, B∧C→WIN. Any pair suffices.

Threshold-≥-2 causal dynamics: any two urns on → WIN.

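That the three conjunctive laws jointly encode "at least two of three" can be checked exhaustively. A local sketch (hypothetical names `countOn`, `winLaws`; assumes Mathlib for `decide` over Boolean quantifiers):

```lean
import Mathlib.Tactic

-- Number of urns that came up "on".
def countOn (a b c : Bool) : Nat := a.toNat + b.toNat + c.toNat

-- The three conjunctive laws A∧B→WIN, A∧C→WIN, B∧C→WIN as a disjunction.
def winLaws (a b c : Bool) : Bool := (a && b) || (a && c) || (b && c)

-- The disjunction of pairwise conjunctions is exactly the threshold
-- rule "sum ≥ 2", checked over all 8 valuations.
example : ∀ a b c : Bool, winLaws a b c = true ↔ countOn a b c ≥ 2 := by
  decide
```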

NSM Computation for Experiment 1 #

For compound pair causes at s = 0, any pair is deterministically sufficient (Suf = 1), so NSM(C→WIN) = 1 - P(WIN ∧ ¬C). The residual probability P(WIN ∧ ¬C) is the chance that the rule is still met without the full compound: since ¬C means at most one of {X,Y} is on, winning then requires the remaining urn Z to be on together with exactly one of {X,Y}, so P(WIN ∧ ¬C) = P(exactly one of {X,Y} on) × P(Z on).

NSM for a compound pair {X,Y} in the threshold-≥-2 game (Suf = 1).

NSM = 1 - P(WIN ∧ ¬C), where P(WIN ∧ ¬C) is the probability that exactly one of {X,Y} is on AND the third urn Z is also on.


Anti-linearity: INT∧HIGH has strictly higher NSM than LOW∧INT.

The additive hypothesis predicts LOW∧INT ≈ INT∧HIGH (since LOW and HIGH have comparable individual NSM in the threshold game). The holistic NSM gives 39/40 vs 21/40, matching the empirical finding (t(355) = -4.67, p < 0.001).
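The two values 39/40 and 21/40 follow from the formula NSM = 1 − P(exactly one of {X,Y} on) · P(Z on). A sketch with Mathlib's ℚ (local hypothetical names `exactlyOne`, `nsmPair`):

```lean
import Mathlib.Tactic

-- P(exactly one of two independent events with probabilities p, q).
def exactlyOne (p q : ℚ) : ℚ := p * (1 - q) + (1 - p) * q

-- NSM for a sufficient compound pair {X,Y} with remaining urn Z.
def nsmPair (pX pY pZ : ℚ) : ℚ := 1 - exactlyOne pX pY * pZ

-- INT∧HIGH (remaining urn LOW): 39/40.
example : nsmPair (1/2) (19/20) (1/20) = 39/40 := by
  unfold nsmPair exactlyOne; norm_num

-- LOW∧INT (remaining urn HIGH): 21/40.
example : nsmPair (1/20) (1/2) (19/20) = 21/40 := by
  unfold nsmPair exactlyOne; norm_num
```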

§ 4. Experiment 2: Disjunctive Rule and LOSS #

WIN := (A∧B) ∨ (C∧D), with P(A)=7/10, P(B)=1/10, P(C)=1/5, P(D)=9/10.

Classical negation: LOSS = ¬(A∧B) ∧ ¬(C∧D). Homogeneous negation: LOSS_strong = ¬A ∧ ¬B ∧ ¬C ∧ ¬D.

Empirical loss judgments match a mixture: w · LOSS_strong + (1-w) · LOSS_classical, with fitted w ≈ 0.77, consistent with the homogeneity property of plural negation (@cite{kriz-spector-2021}).

Experiment 2 causal dynamics: (A∧B)∨(C∧D) → WIN.


Classical LOSS = ¬((A∧B) ∨ (C∧D)) ≡ ¬(A∧B) ∧ ¬(C∧D).


Homogeneous LOSS = ¬A ∧ ¬B ∧ ¬C ∧ ¬D.


LOSS_strong entails classical LOSS.

Classical LOSS does NOT entail LOSS_strong.

Witness: A=1, B=0, C=0, D=0: neither A∧B nor C∧D holds (classical LOSS), but A is present (LOSS_strong fails).
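Both the entailment and the witness can be verified by exhaustive Boolean checking. A sketch (primed names are hypothetical stand-ins for the library's lossClassical/lossStrong; assumes Mathlib for `decide` over Boolean quantifiers):

```lean
import Mathlib.Tactic

-- Local Boolean sketches of the two loss readings.
def lossClassical' (a b c d : Bool) : Bool := !(a && b) && !(c && d)
def lossStrong'    (a b c d : Bool) : Bool := !a && !b && !c && !d

-- Strong loss entails classical loss, over all 16 valuations.
example : ∀ a b c d : Bool,
    lossStrong' a b c d = true → lossClassical' a b c d = true := by decide

-- The converse fails at the witness A=1, B=C=D=0: neither conjunct of
-- the rule holds, yet A is present.
example : lossClassical' true false false false = true := by decide
example : lossStrong'    true false false false = false := by decide
```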

Mixture model: w · LOSS_strong + (1-w) · LOSS_classical.

Fitted w ≈ 0.77, reflecting the dominance of the homogeneous reading (neither spoke German) over the classical reading (not both spoke).


At w = 1, the mixture reduces to LOSS_strong.

At w = 0, the mixture reduces to classical LOSS.
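The two boundary cases are immediate from the mixture formula. A sketch with Mathlib's ℚ (`lossMixture` is a local hypothetical name):

```lean
import Mathlib.Tactic

-- Mixture of the homogeneous (strong) and classical loss scores.
def lossMixture (w pStrong pClassical : ℚ) : ℚ :=
  w * pStrong + (1 - w) * pClassical

-- Pure homogeneous reading at w = 1, pure classical reading at w = 0.
example (s c : ℚ) : lossMixture 1 s c = s := by unfold lossMixture; ring
example (s c : ℚ) : lossMixture 0 s c = c := by unfold lossMixture; ring
```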

§ 4b. Bridge: Loss Gap = inGap (Homogeneity) #

The gap between lossClassical and lossStrong, i.e. valuations where classical loss holds but homogeneous loss does not, is exactly the set of worlds in the truth-value gap (inGap) for the "is present" predicate over the four causal variables. This connects the paper's w-parameter mixture to the formal semantics of plural homogeneity.

theorem Phenomena.Causation.Studies.KonukEtAl2026.loss_gap_iff_pluralGap (f : Fin 4 → Bool) :
    lossClassical (f 0) (f 1) (f 2) (f 3) = true ∧ lossStrong (f 0) (f 1) (f 2) (f 3) = false ↔
      lossClassical (f 0) (f 1) (f 2) (f 3) = true ∧
        Semantics.Lexical.Plural.Distributivity.someSatisfy (fun (i : Fin 4) (x : Unit) => f i) Finset.univ () = true

The loss gap (classical but not strong) is exactly inGap for the "is present" predicate: some but not all variables are false.

The classical negation ¬(A∧B) ∧ ¬(C∧D) allows worlds where some variables are true and others false. The homogeneous negation ¬A∧¬B∧¬C∧¬D requires all false. The gap, where the two readings disagree, is the truth-value gap from @cite{kriz-spector-2021}: worlds where the plurality is neither all-P nor none-P.
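The gap characterization can be checked over all 16 valuations. A sketch (primed/local names hypothetical; assumes Mathlib for `decide` over Boolean quantifiers):

```lean
import Mathlib.Tactic

def lossClassical' (a b c d : Bool) : Bool := !(a && b) && !(c && d)
def lossStrong'    (a b c d : Bool) : Bool := !a && !b && !c && !d

-- "Some variable is present": the witness side of the homogeneity gap.
def somePresent (a b c d : Bool) : Bool := a || b || c || d

-- On classical-loss worlds, failing the strong reading is the same as
-- some variable being present: exactly the gap worlds.
example : ∀ a b c d : Bool,
    (lossClassical' a b c d = true ∧ lossStrong' a b c d = false)
      ↔ (lossClassical' a b c d = true ∧ somePresent a b c d = true) := by
  decide
```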

§ 5. Experiment 2: Crossing Avoidance (Overdetermined Positive round) #

In (A∧B)∨(C∧D), a compound cause is "within-disjunct" if both variables come from the same conjunct, and "cross-disjunct" otherwise.

Empirical finding: within-disjunct causes are preferred over cross-disjunct ones, even controlling for counterfactual dependence.

Disjunct membership classification for a pair of variables.


Classify a pair of Experiment 2 variables by disjunct membership. Indices: A=0, B=1 (first conjunct), C=2, D=3 (second conjunct).

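The classification amounts to comparing which half of the index range each variable falls in. A minimal sketch with hypothetical names (`DisjunctRelation`, `sameDisjunct`, `classifyPair`), not the library's actual declarations:

```lean
-- Disjunct membership for Experiment 2 variable pairs;
-- indices A=0, B=1 (first conjunct), C=2, D=3 (second conjunct).
inductive DisjunctRelation where
  | withinDisjunct
  | crossDisjunct
  deriving DecidableEq, Repr

-- Two variables share a disjunct iff both are below index 2 or both at/above.
def sameDisjunct (i j : Fin 4) : Bool :=
  decide (i.val < 2) == decide (j.val < 2)

def classifyPair (i j : Fin 4) : DisjunctRelation :=
  if sameDisjunct i j then .withinDisjunct else .crossDisjunct

example : classifyPair 0 1 = .withinDisjunct := by decide  -- A with B
example : classifyPair 2 3 = .withinDisjunct := by decide  -- C with D
example : classifyPair 0 2 = .crossDisjunct  := by decide  -- A with C
```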

§ 5b. Experiment 2: Triple-1 and Triple-0 Conditions #

Triple-1: A, B, D drawn (colored), C not drawn; John wins. #

The win is via A∧B (purple pair). Urn D (yellow) is idle: it draws a colored ball but has no effect on the outcome because urn C (its partner) does not. The CESM and NSM both predict A∧B rated near ceiling.

Triple-0: A, B, D draw white balls, C draws colored; John loses. #

The mirror image: the loss is driven by the absence of colored balls from A and B (and D), but not C. Under the non-classical (homogeneous) representation of loss, LOSS := ¬A ∧ ¬B ∧ ¬D, and the white ball from urn D is indispensable while A and B are redundant.

Model fits with the w parameter: r = .94 (CESM) and .99 (NSM) for Triple-0, the paper's most dramatic improvement over the base models (r = .34 and .52 without w).

In Triple-0 (loss), under the homogeneous representation LOSS = ¬A∧¬B∧¬D, the white ball from D is indispensable: its absence would break the homogeneous conjunction. Formally, removing D (setting it to true) blocks the loss under the strong reading.
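The counterfactuals behind the Triple-0 condition can be checked directly. A sketch with hypothetical local names (`win2`, `lossTriple0`), not the library's declarations:

```lean
-- Experiment 2 rule, evaluated at the Triple-0 world:
-- A, B, D draw white (false), C draws colored (true).
def win2 (a b c d : Bool) : Bool := (a && b) || (c && d)

-- Classically, only D's white ball is a but-for cause of the loss:
-- flipping D yields a win via C∧D, while flipping A leaves the loss.
example : win2 false false true true  = true  := by decide  -- flip D
example : win2 true  false true false = false := by decide  -- flip A

-- Under the homogeneous reading LOSS = ¬A ∧ ¬B ∧ ¬D, setting D to true
-- (removing its white ball) breaks the strong conjunction.
def lossTriple0 (a b d : Bool) : Bool := !a && !b && !d
example : lossTriple0 false false true = false := by decide
```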

§ 6. Bridge: LOSS_strong = noneSatisfy (Homogeneity) #

LOSS_strong is exactly the noneSatisfy predicate from @cite{kriz-spector-2021} applied to the four causal variables: every individual variable is false.

In Semantics.Lexical.Plural.Distributivity, noneSatisfy P x w = true iff ∀ a ∈ x, P a w = false. LOSS_strong instantiates this with the identity predicate "is present" over the four causal variables, connecting causal cognition to the homogeneity account of plural negation.

theorem Phenomena.Causation.Studies.KonukEtAl2026.lossStrong_iff_allFalse (f : Fin 4 → Bool) :
    lossStrong (f 0) (f 1) (f 2) (f 3) = true ↔ ∀ (i : Fin 4), f i = false

LOSS_strong holds iff every individual variable is false.

LOSS_strong is exactly noneSatisfy from @cite{kriz-spector-2021}: "none of {v₁,...,v₄} are present" under the homogeneity account.