@cite{qing-goodman-lassiter-2016} #
A rational speech-act model of projective content. CogSci 2016, pp. 1110–1115.
The listener jointly infers the world state and the common ground (context set) the speaker assumed. Under the right QUD, this derives presupposition projection without any special semantic mechanism.
The Model #
- L0 (eq. 5): L0(Q(w) | u, C, Q) ∝ Σ_{w'∈C∩⟦u⟧} δ_{Q(w)=Q(w')} · P(w')
- S1 (eq. 6): S(u|w,C,Q) ∝ P(u) · L0(Q(w) | u, C, Q)^α
- L1 (eq. 7): L(w,C | u, Q) ∝ P(w) · P(C) · S(u | w, C, Q)
Domain: 13 utterances × 4 worlds × 9 context sets. α = 6.
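A minimal, self-contained sketch of how eqs. (5)-(7) compose, assuming a uniform world prior inside L0. The names `l0`, `s1Score`, `l1Score`, the Float encoding, and the list-based domain are all illustrative stand-ins, not the definitions this file actually uses:

```lean
-- Illustrative miniature of eqs. (5)-(7) over finite lists; with a uniform
-- world prior, eq. (5)'s weighted sum reduces to a count ratio.
variable {W U C : Type}

/-- eq. (5): probability of w's QUD cell, renormalized within C ∩ ⟦u⟧. -/
def l0 (meaning : U → W → Bool) (inCtx : C → W → Bool) (qudEq : W → W → Bool)
    (worlds : List W) (c : C) (u : U) (w : W) : Float :=
  let live := worlds.filter fun w' => inCtx c w' && meaning u w'
  if live.isEmpty then 0.0
  else Float.ofNat (live.filter (qudEq w)).length / Float.ofNat live.length

/-- eq. (6): unnormalized speaker score (α = 6 in the paper). -/
def s1Score (α : Float) (uttPrior : U → Float) (meaning : U → W → Bool)
    (inCtx : C → W → Bool) (qudEq : W → W → Bool) (worlds : List W)
    (c : C) (u : U) (w : W) : Float :=
  uttPrior u * Float.pow (l0 meaning inCtx qudEq worlds c u w) α

/-- eq. (7): unnormalized joint listener score over (w, C). -/
def l1Score (α : Float) (worldPrior : W → Float) (ctxPrior : C → Float)
    (uttPrior : U → Float) (meaning : U → W → Bool) (inCtx : C → W → Bool)
    (qudEq : W → W → Bool) (worlds : List W) (u : U) (w : W) (c : C) : Float :=
  worldPrior w * ctxPrior c * s1Score α uttPrior meaning inCtx qudEq worlds c u w
```

Normalizing `l1Score` over all (w, C) pairs recovers the L1 posterior; the file's actual model additionally routes the QUD through `qudAggregate` rather than an equivalence predicate.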
Key Insight: QUD Projection and Context Set Inference #
Under QUD_now ("Does John smoke now?"), S1 scores depend on now(w) only —
worlds with the same now value are indistinguishable (§8.1). This means
presupposition projection CANNOT be observed at the world marginal level.
Instead, projection surfaces in the context set posterior (L1_latent):
L1 infers the speaker assumed a context set entailing past=T (§8.5).
Under QUD_max (identity QUD), the past-now degeneracy breaks and projection is visible directly at the world level (§8.4).
Context Set Simplification #
The paper uses 15 non-empty context sets (all 2^4 - 1 subsets of 4 worlds) with
5% noise added to eq. (8) to ensure nonzero priors. We model only the 9 context
sets derivable from observations about past/now, since the remaining 6 (e.g.,
change = {(T,F),(F,T)}) receive only the noise prior (≥18× lower) and do not
affect qualitative predictions.
Connection to @cite{scontras-tonhauser-2025} and @cite{warstadt-2022} #
All three papers implement the same mathematical structure with different
domains (change-of-state verbs, factives, genus-species). The latent variable
is called "context set" here and "private assumptions" in S&T, but the
computation is identical. See ScontrasTonhauser2025.lean for the factive
domain.
World state: (past, now) where past = John smoked, now = John smokes. Flat inductive for tactic enumerability.
- wTT : WorldState
- wTF : WorldState
- wFT : WorldState
- wFF : WorldState
Equations
- WorldState.wTT.past = true
- WorldState.wTF.past = true
- WorldState.wFT.past = false
- WorldState.wFF.past = false
Equations
- WorldState.wTT.now = true
- WorldState.wFT.now = true
- WorldState.wTF.now = false
- WorldState.wFF.now = false
Utterances about John's smoking habits. 6 positive utterances from Table 1, their 6 negations, and silence.
- smokes : Utterance
- doesntSmoke : Utterance
- smoked : Utterance
- didntSmoke : Utterance
- alwaysSmoked : Utterance
- notAlwaysSmoked : Utterance
- stoppedSmoking : Utterance
- notStoppedSmoking : Utterance
- startedSmoking : Utterance
- notStartedSmoking : Utterance
- neverSmoked : Utterance
- notNeverSmoked : Utterance
- silence : Utterance
Literal truth conditions from Table 1. Negation of u is U - ⟦u⟧.
The change-of-state utterances (stopped, started, always smoked) encode
presupposition + assertion as a conjunction over two temporal projections
of the world state. See bridge theorems below connecting these to the
CoS theory in Theories.Semantics.Lexical.Verb.ChangeOfState.Theory.
Equations
- literalMeaning .smokes w = w.now
- literalMeaning .doesntSmoke w = !w.now
- literalMeaning .smoked w = w.past
- literalMeaning .didntSmoke w = !w.past
- literalMeaning .alwaysSmoked w = (w.past && w.now)
- literalMeaning .notAlwaysSmoked w = !(w.past && w.now)
- literalMeaning .stoppedSmoking w = (w.past && !w.now)
- literalMeaning .notStoppedSmoking w = !(w.past && !w.now)
- literalMeaning .startedSmoking w = (!w.past && w.now)
- literalMeaning .notStartedSmoking w = !(!w.past && w.now)
- literalMeaning .neverSmoked w = (!w.past && !w.now)
- literalMeaning .notNeverSmoked w = !(!w.past && !w.now)
- literalMeaning .silence w = true
"Stopped smoking" = cessation presupposition (past=T) ∧ cessation assertion (now=F).
The CoS theory operates on a single predicate P, but the QGL world model
separates prior state (past) from current state (now). We use ·.past
for the presupposition and ·.now for the assertion.
"Started smoking" = inception presupposition (past=F) ∧ inception assertion (now=T).
"Always smoked" = continuation presupposition (past=T) ∧ continuation assertion (now=T).
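As a sanity check, the cessation factorization can be stated directly against `literalMeaning`; assuming the definitions unfold as listed above, a case split over the flat inductive closes it definitionally (a sketch, not one of the file's named bridge theorems):

```lean
-- Sketch: "stopped smoking" = presupposition (past = true) && assertion (now = false).
example (w : WorldState) :
    literalMeaning .stoppedSmoking w = (w.past && !w.now) := by
  cases w <;> rfl
```

The inception and continuation cases follow the same pattern with `.startedSmoking` and `.alwaysSmoked`.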
Context sets: subsets of worlds representing common ground.
These are the 9 context sets derivable from observations about past and now
(eq. 8). The paper's full model uses 15 non-empty subsets with 5% noise;
the omitted 6 (e.g., change = {(T,F),(F,T)}) have negligible prior.
- pastTrue : ContextSet
- pastFalse : ContextSet
- nowTrue : ContextSet
- nowFalse : ContextSet
- pastTrueNowTrue : ContextSet
- pastTrueNowFalse : ContextSet
- pastFalseNowTrue : ContextSet
- pastFalseNowFalse : ContextSet
- universe : ContextSet
World-context compatibility: w ∈ C.
Equations
- compatibleBool .pastTrue w = w.past
- compatibleBool .pastFalse w = !w.past
- compatibleBool .nowTrue w = w.now
- compatibleBool .nowFalse w = !w.now
- compatibleBool .pastTrueNowTrue w = (w.past && w.now)
- compatibleBool .pastTrueNowFalse w = (w.past && !w.now)
- compatibleBool .pastFalseNowTrue w = (!w.past && w.now)
- compatibleBool .pastFalseNowFalse w = (!w.past && !w.now)
- compatibleBool .universe w = true
QUD aggregation: sums L0 probabilities over the QUD equivalence class.
- now: sums over worlds with same now value
- max: identity (no aggregation)
- past: sums over worlds with same past value
Equations
- qudAggregate QUD.max f w = f w
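The QUD_now case, described in the docstring above, can be sketched as follows; `allWorlds` and `nowAggregate` are illustrative names, not this file's definitions:

```lean
-- Sketch of QUD_now aggregation: sum f over the worlds sharing w's `now` value.
def allWorlds : List WorldState := [.wTT, .wTF, .wFT, .wFF]

def nowAggregate (f : WorldState → Float) (w : WorldState) : Float :=
  (allWorlds.filter (fun w' => w'.now == w.now)).foldl (fun acc w' => acc + f w') 0.0
```

The QUD_past case is identical with `.past` in place of `.now`.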
Utterance prior (eq. 1): Pr(u) ∝ 2^{-#content-words(u)}. Negation and auxiliaries excluded from count.
- 2 content words: stopped/started smoking, always/never smoked → 1/4
- 1 content word: smokes/smoked → 1/2
- 0 content words: silence → 1
Equations
- utterancePrior .smokes = 1/2
- utterancePrior .doesntSmoke = 1/2
- utterancePrior .smoked = 1/2
- utterancePrior .didntSmoke = 1/2
- utterancePrior .alwaysSmoked = 1/4
- utterancePrior .notAlwaysSmoked = 1/4
- utterancePrior .stoppedSmoking = 1/4
- utterancePrior .notStoppedSmoking = 1/4
- utterancePrior .startedSmoking = 1/4
- utterancePrior .notStartedSmoking = 1/4
- utterancePrior .neverSmoked = 1/4
- utterancePrior .notNeverSmoked = 1/4
- utterancePrior .silence = 1
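These counts can be checked with a two-line sketch; `lengthPrior` is an illustrative name (the file's `utterancePrior` is defined by cases on the utterance instead):

```lean
-- Sketch of eq. (1): Pr(u) ∝ 2^(-#content-words), here over Float.
def lengthPrior (contentWords : Nat) : Float :=
  1.0 / Float.ofNat (2 ^ contentWords)

#eval lengthPrior 0  -- 1.0  (silence)
#eval lengthPrior 1  -- 0.5  ("smokes", "smoked")
#eval lengthPrior 2  -- 0.25 ("stopped smoking")
```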
Context set prior (eq. 8): Pr(C) ∝ Σ_{CG⊆Obs} P(CG) · δ_{C=∩CG}. Each observation enters CG independently with probability 0.4. P(CG) = 0.4^|CG| × 0.6^(4-|CG|).
- 0 observations (universe): 0.6^4 ∝ 9
- 1 observation (single): 0.4 × 0.6^3 ∝ 6
- 2 observations (pair): 0.4^2 × 0.6^2 ∝ 4
Equations
- contextPrior .universe = 9
- contextPrior .pastTrue = 6
- contextPrior .pastFalse = 6
- contextPrior .nowTrue = 6
- contextPrior .nowFalse = 6
- contextPrior .pastTrueNowTrue = 4
- contextPrior .pastTrueNowFalse = 4
- contextPrior .pastFalseNowTrue = 4
- contextPrior .pastFalseNowFalse = 4
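The 9 : 6 : 4 weights follow from the unnormalized masses by dividing out the common factor 0.0144:

```latex
% Unnormalized masses from eq. (8), observation probability 0.4:
%   0.6^4             = 0.1296   (universe, 0 observations)
%   0.4 \cdot 0.6^3   = 0.0864   (single observation)
%   0.4^2 \cdot 0.6^2 = 0.0576   (pair of observations)
\frac{0.1296}{0.0144} : \frac{0.0864}{0.0144} : \frac{0.0576}{0.0144}
  \;=\; 9 : 6 : 4
```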
RSA model parameterized by QUD. The paper's final model uses QUD_now with CG prior.
- L0: meaning = compatibility ∧ literal truth (eq. 5)
- S1: utterancePrior × rpow(qudAggregate(L0), α) (eq. 6)
- L1: worldPrior × contextPrior × S1 (eq. 7)
Final model: CG prior + QUD_now (paper's main prediction).
Comparison model: CG prior + QUD_max.
Comparison model: CG prior + QUD_past.
§8.1 QUD_now symmetry #
Under QUD_now, qudAggregate maps wTT and wFT to the same value (both
have now=T), so S1(cs, wTT, u) = S1(cs, wFT, u) for all cs. With uniform
worldPrior and w-independent latentPrior, L1(wTT) = L1(wFT). This means
presupposition projection cannot be measured by world marginal under
QUD_now — the past dimension is invisible to L1.
QUD_now symmetry: wTT and wFT are indistinguishable.
This is the structural reason why projection under QUD_now must be measured via context set inference (L1_latent), not world marginal.
§8.2 QUD answer #
After hearing "didn't stop smoking" with QUD = "Does John smoke now?", L1 correctly infers the QUD answer: John smokes (now=T).
QUD answer inference: L1 infers now=T from "didn't stop smoking".
§8.3 World elimination #
Within worlds sharing the same past value, now=T dominates now=F. "Didn't stop smoking" is literally false at wTF (past=T, now=F is exactly "stopped smoking"), concentrating L1 mass on wTT.
Now=T dominates now=F among past=T worlds.
§8.4 Projection under QUD_max (Figure 1c) #
Under the identity QUD, the past-now degeneracy of §8.1 breaks: S1 scores depend on all world dimensions, so L1 can distinguish wTT from wFT. The key asymmetry: under +past context, "didn't stop" narrows to exactly wTT (maximally informative for the speaker), while under -past context it spreads over both wFT and wFF (less informative). This makes wTT more likely than wFT.
The paper observes that L1 assigns (T,T) and (F,F) equal probability under QUD_max (Figure 1c), so projection is incomplete: the past=T marginal still equals the past=F marginal. Full projection requires QUD_now + context set inference (§8.5).
Projection under QUD_max: wTT > wFT.
Incomplete projection under QUD_max: wTT = wFF (Figure 1c).
Under the identity QUD, (T,T) and (F,F) receive equal L1 probability. This means the past=T marginal equals the past=F marginal — projection is NOT captured at the world level under QUD_max.
§8.5 Context set projection (the paper's main result, Figure 3) #
The paper's headline result: after hearing "didn't stop smoking" under QUD_now, L1 infers the speaker assumed a +past context set (common ground where John smoked). This is presupposition projection: the presupposed content (past=T) enters the listener's beliefs about the common ground, even though the utterance literally says nothing about the past.
The mechanism: under +past context, "didn't stop" narrows L0 to exactly wTT, giving qudAggregate = 1 and rpow(1, 6) = 1 (maximally informative). Under -past context, L0 spreads over wFT and wFF, giving qudAggregate = 1/2 and rpow(1/2, 6) = 1/64 (weakly informative). Since S1 rewards informativity, the speaker is much more likely to use "didn't stop" when the +past context holds, and L1 infers this.
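Concretely, with α = 6 the informativity gap becomes a 64 : 1 likelihood ratio; the shared utterance-prior factor cancels, and the equal context priors (6 for pastTrue vs. 6 for pastFalse) leave the odds intact:

```latex
\frac{S_1(u \mid \mathrm{+past})}{S_1(u \mid \mathrm{-past})}
  = \frac{1^{6}}{(1/2)^{6}} = 64,
\qquad
\frac{L_1(\mathrm{+past} \mid u)}{L_1(\mathrm{-past} \mid u)}
  = \frac{6 \cdot 1}{6 \cdot 1/64} = 64
```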
Figure 3 shows that (T,T) with context set +past is the unique maximum of the joint distribution under CG prior + QUD_now.
Context set projection: L1 infers +past context over -past context.
+past beats uninformative universe despite lower prior (9 vs 6).
Under +past, "didn't stop" narrows to exactly wTT (rpow(1,6) = 1). Under universe, it spreads over 3 worlds (rpow(1/4,6) ≈ 0). The informativity gain overwhelms the prior advantage.
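A back-of-envelope check, taking the scores and priors stated above at face value and comparing the joint L1 score at the modal world:

```latex
\frac{L_1(\mathrm{+past} \mid u)}{L_1(\mathrm{universe} \mid u)}
  = \frac{6 \cdot 1^{6}}{9 \cdot (1/4)^{6}}
  = \frac{6}{9/4096}
  \approx 2731
```

so the prior disadvantage (6 vs. 9) is swamped by three orders of magnitude.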
-now context is dispreferred (p. 1114): -now already entails the QUD answer (John doesn't smoke now), so the speaker would be maximally informative even saying nothing — "didn't stop smoking" adds no value. L1 therefore infers the speaker did NOT assume a -now context.
§8.6 "Stopped smoking" projects past=T via QUD answer #
"Stopped smoking" is true only at wTF (past=T, now=F). Under QUD_now, L1 infers the QUD answer is now=F.
"Stopped smoking" → L1 infers now=F (the assertion).
"didn't stop smoking" is compatible with 3 of 4 worlds.
Under context set +past, "didn't stop" is maximally informative for QUD_now: it narrows to exactly (T,T).
"stopped smoking" narrows to exactly (T,F): past=T and now=F.
"always smoked" = {(T,T)}, maximally informative for (T,T).
"never smoked" = {(F,F)}, maximally informative for (F,F).
Context sets that entail past=T (used for measuring projection).
Equations
- entailsPast .pastTrue = true
- entailsPast .pastTrueNowTrue = true
- entailsPast .pastTrueNowFalse = true
- entailsPast _ = false
3 of 9 context sets entail past=T.
Mathematical Equivalence with @cite{scontras-tonhauser-2025} and @cite{warstadt-2022} #
All three papers implement the same RSA computation:
L1(w, C | u, Q) ∝ S1(u | w, C, Q) · P(w) · P(C)
| Paper | Latent | Interpretation | Domain |
|---|---|---|---|
| @cite{qing-goodman-lassiter-2016} | Context set C | Common ground | CoS verbs |
| @cite{scontras-tonhauser-2025} | Assumptions A | Speaker beliefs | Factives |
| @cite{warstadt-2022} | Context set C | Common ground | Genus-species |
The RSAConfig encoding is structurally identical: Latent = context/belief
state, meaning filters by compatibility, s1Score uses QUD-projected rpow.
See ScontrasTonhauser2025.lean for the factive domain implementation.
@cite{scontras-tonhauser-2025} fn. 10: "@cite{qing-goodman-lassiter-2016} call these subsets the 'common ground,' but we think 'private assumptions' better captures this component of the model."
The terminological difference is interpretive, not computational.