CCG Explanation of Scope Freezing #
@cite{antonyuk-2015} @cite{scontras-polinsky-tsai-mai-2017}
In CCG, scope ambiguity arises from derivational ambiguity:
- Type-raising allows arguments to take scope
- Composition allows non-canonical derivations
- When only ONE derivation exists → scope is frozen
CCG Predictions #
| Context | CCG Explanation |
|---|---|
| Possessor | Complex DP has single derivation; possessor is pre-combined |
| Double object | Argument structure differs from PP dative; single derivation |
| Passive | By-phrase combines late; limited derivational options |
| Heavy NP | Not structural; CCG predicts ambiguity |
CCG analysis of a freezing context
- singleDerivation : CCGFreezingReason
- noTypeRaising : CCGFreezingReason
- argumentStructure : CCGFreezingReason
- notFrozen : CCGFreezingReason
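The constructor list above corresponds to a Lean inductive of roughly this shape (a sketch reconstructed from the rendered constructors; the comments and the `deriving` clause are assumptions, not part of the rendered output):

```lean
/-- Why CCG does (or does not) predict scope freezing in a given context. -/
inductive CCGFreezingReason where
  | singleDerivation    -- only one derivation exists, so only one scope
  | noTypeRaising       -- type-raising is unavailable to the argument
  | argumentStructure   -- the argument structure itself fixes scope
  | notFrozen           -- multiple derivations remain; ambiguity predicted
  deriving DecidableEq, Repr
```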
CCG's analysis of each freezing context
Equations
- ccgAnalyzeContext FreezingContext.none = CCGFreezingReason.notFrozen
- ccgAnalyzeContext FreezingContext.possessor = CCGFreezingReason.singleDerivation
- ccgAnalyzeContext FreezingContext.doubleObject = CCGFreezingReason.argumentStructure
- ccgAnalyzeContext FreezingContext.passive = CCGFreezingReason.singleDerivation
- ccgAnalyzeContext FreezingContext.heavyNP = CCGFreezingReason.notFrozen
- ccgAnalyzeContext FreezingContext.weakCrossover = CCGFreezingReason.notFrozen
- ccgAnalyzeContext FreezingContext.adjunct = CCGFreezingReason.singleDerivation
- ccgAnalyzeContext FreezingContext.attitude = CCGFreezingReason.singleDerivation
Does CCG predict freezing?
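Given the case analysis above, the freezing predicate plausibly reduces to checking whether the analysis returns notFrozen. A sketch (the actual definition was not rendered; `FreezingContext` and `ccgAnalyzeContext` are the module's names):

```lean
def ccgPredictsFreezing (ctx : FreezingContext) : Bool :=
  match ccgAnalyzeContext ctx with
  | .notFrozen => false   -- multiple derivations → ambiguity predicted
  | _          => true    -- any freezing reason → frozen
```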
Processing Explanation #
Processing accounts argue:
- Inverse scope requires reanalysis or additional memory operations
- Cost scales with the complexity of the intervening material
- Freezing arises when the processing profile of the frozen condition Pareto-dominates the baseline: at least as costly on every dimension, strictly costlier on at least one
Processing Predictions #
| Context | Processing Explanation |
|---|---|
| Possessor | Complex subject increases locality and referential load |
| Double object | Two objects increase referential load and boundaries |
| Passive | By-phrase increases locality; reanalysis costly |
| Heavy NP | Complexity directly increases locality and referential load |
Baseline: "A student attended every seminar" — short, simple.
Equations
- baseline_scope = { locality := 3, boundaries := 0, referentialLoad := 0, ease := 0 }
Possessor: "A student's teacher attended every seminar" — complex subject.
Equations
- possessor_scope = { locality := 5, boundaries := 0, referentialLoad := 2, ease := 0 }
Double object: "A teacher gave every student a book" — two objects.
Equations
- doubleObject_scope = { locality := 4, boundaries := 0, referentialLoad := 2, ease := 0 }
Heavy NP: "A student from the local university attended every seminar"
Equations
- heavyNP_scope = { locality := 8, boundaries := 0, referentialLoad := 1, ease := 0 }
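The profiles above can be compared for Pareto dominance directly. A self-contained sketch (`ScopeProfile` and `paretoHarder` are hypothetical names; the field values are copied from the equations above, and every field is treated as a cost, higher = harder):

```lean
structure ScopeProfile where
  locality : Nat
  boundaries : Nat
  referentialLoad : Nat
  ease : Nat

/-- `a` Pareto-dominates `b`: no cheaper on any dimension, strictly costlier on one. -/
def paretoHarder (a b : ScopeProfile) : Bool :=
  decide (b.locality ≤ a.locality ∧ b.boundaries ≤ a.boundaries ∧
          b.referentialLoad ≤ a.referentialLoad ∧ b.ease ≤ a.ease) &&
  decide (b.locality < a.locality ∨ b.boundaries < a.boundaries ∨
          b.referentialLoad < a.referentialLoad ∨ b.ease < a.ease)

def baseline : ScopeProfile := ⟨3, 0, 0, 0⟩
def heavyNP  : ScopeProfile := ⟨8, 0, 1, 0⟩

#eval paretoHarder heavyNP baseline  -- true: harder on locality and referential load
```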
Scope condition type for typeclass instance.
- baseline : ScopeCondition
- possessor : ScopeCondition
- doubleObject : ScopeCondition
- heavyNP : ScopeCondition
Does the processing model predict freezing (harder than baseline)?
Predictions from all three theories
- context : Data.FreezingContext
- minimalism : Bool
- ccg : Bool
- processing : ProcessingModel.CompareResult
Processing comparison against baseline (Pareto dominance)
- observed : Data.Availability
Does the processing prediction match observation? If freezing is observed, the processing profile should be harder than baseline; if ambiguity is observed, it should not be.
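The match criterion can be sketched as a small decision function (hypothetical names; this assumes Availability distinguishes frozen from ambiguous and that the processing comparison has been reduced to a Bool "harder than baseline"):

```lean
inductive Availability where
  | frozen
  | ambiguous

def matchesObservation (observed : Availability) (harderThanBaseline : Bool) : Bool :=
  match observed with
  | .frozen    => harderThanBaseline    -- frozen observed → should be harder
  | .ambiguous => !harderThanBaseline   -- ambiguous observed → should not be
```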
Compare theories on a freezing context
Possessor freezing: all theories agree
Double object: all theories agree
Heavy NP: theories DIVERGE
Baseline: all theories agree (ambiguous)
Check if all theories agree
Find where theories diverge
Heavy NP is the key divergence case
Possessor has agreement
Where Theories Diverge #
Heavy NP #
- Minimalism: No grammatical barrier → predicts AMBIGUOUS
- CCG: Same derivations available → predicts AMBIGUOUS
- Processing: Pareto harder than baseline → predicts FROZEN
- Observed: Frozen (gradient)
Verdict: Processing explains heavy NP; grammar theories fail.
Gradient Passive Judgments #
- Minimalism: Adjunct island → predicts categorical freezing
- CCG: Single derivation → predicts categorical freezing
- Processing: Moderate cost → predicts gradient
- Observed: Gradient (weaker than possessor)
Verdict: Processing captures gradience; grammar theories predict categorical.
Context Effects (Hypothetical) #
If context can rescue "frozen" readings:
- Minimalism: No rescue possible (grammatical)
- CCG: No rescue possible (derivational)
- Processing/RSA: Rescue possible with strong context
Test: Find contexts where "frozen" readings become available.
Divergence types
- grammarVsProcessing : DivergenceType
- processingVsGrammar : DivergenceType
- allAgree : DivergenceType
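The constructor list above, plus a classifier over two Bool predictions (grammar and processing, each meaning "predicts frozen"), can be sketched as follows (the signature of `classifyDivergence` is an assumption; the rendered definition was elided):

```lean
inductive DivergenceType where
  | grammarVsProcessing   -- grammar predicts ambiguous, processing predicts frozen
  | processingVsGrammar   -- grammar predicts frozen, processing predicts ambiguous
  | allAgree

def classifyDivergence (grammar processing : Bool) : DivergenceType :=
  if grammar == processing then .allAgree
  else if processing then .grammarVsProcessing
  else .processingVsGrammar
```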
Classify the divergence
Heavy NP is grammar-vs-processing divergence
Count correct predictions for a theory
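Prediction counting over (predicted, observed) pairs can be sketched as (hypothetical helper, not the module's rendered definition; the sample data mirrors the summary table, where `true` = frozen):

```lean
/-- Count cases where a theory's frozen/ambiguous prediction matches observation. -/
def countCorrect (cases : List (Bool × Bool)) : Nat :=
  cases.countP fun (pred, obs) => pred == obs

-- Grammar-theory row: baseline, possessor, double object, heavy NP
#eval countCorrect [(false, false), (true, true), (true, true), (false, true)]  -- 3
```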
Minimalism accuracy
CCG accuracy
Processing accuracy
Verify processing ordering predictions against empirical data.
All scope ordering predictions verified by Pareto dominance.
Theoretical Implications #
What the Comparison Shows #
Grammar theories (Minimalism, CCG) agree on most cases
- Both predict freezing in possessor, double-object, and passive contexts
- Both fail on heavy NP (no grammatical barrier)
Processing fills the gap
- Explains heavy NP via Pareto-harder profile
- Explains gradient judgments
- Compatible with grammar accounts for clear cases
Possible synthesis
- Grammar determines AVAILABLE readings
- Processing/pragmatics determines PREFERRED readings
- "Freezing" may be a mix: some grammatical, some processing
Open Questions #
Is possessor freezing truly categorical?
- Need experimental data comparing possessor vs heavy NP
Can context rescue frozen readings?
- Would distinguish grammar from processing accounts
Cross-linguistic variation?
- Some languages show different freezing patterns
- Scrambling languages may differ
Proposed Test Cases #
Baseline: "A student attended every seminar" (ambiguous)
Possessor: "A student's teacher attended every seminar" (frozen?)
Heavy: "A student from the university attended every seminar" (frozen?)
If possessor is MORE frozen than heavy NP, grammar contributes. If they're equally frozen, processing suffices.
RSA Rescue Prediction #
See RSA/Implementations/ScopeFreezing.lean: it takes possessor_frozen.observed (from the phenomena data, which grammar predicts) as the interpretation prior; rsa_can_rescue_frozen then proves that world priors can rescue the frozen reading.
Frozen: P(inverse) = 2%; rescued: P(inverse) > 50%.
Summary: Scope Freezing Comparison #
Agreement #
| Context | Minimalism | CCG | Processing | Observed |
|---|---|---|---|---|
| Baseline | ✗ | ✗ | ✗ | Ambiguous |
| Possessor | ✓ | ✓ | ✓ | Frozen |
| Double obj | ✓ | ✓ | ✓ | Frozen |
| Heavy NP | ✗ | ✗ | ✓ | Frozen |
(✓ = predicts frozen, ✗ = predicts ambiguous)
Key Divergence #
Heavy NP: Only processing predicts freezing.
- Minimalism: No phase/island barrier
- CCG: Same derivations available
- Processing: Pareto harder than baseline
Theoretical Conclusion #
Scope freezing is likely a mixed phenomenon:
- Some cases are grammatical (possessor, double object)
- Some cases are processing-based (heavy NP)
- Gradient judgments suggest processing plays a role even in "grammatical" cases
Empirical Need #
Controlled experiments comparing:
- Possessor vs heavy NP freezing strength
- Context effects on frozen readings
- Cross-linguistic patterns