Linglib.Theories.Semantics.Questions.Polarity

Questions/Polarity.lean #

@cite{bring-gunlogson-2000} @cite{ladd-1981} @cite{van-rooy-2003}

Formalizes Van Rooy & Šafářová's (2003) decision-theoretic account of polar question choice.

The Problem #

Standard G&S/Hamblin semantics assigns the same denotation, {q, ¬q}, to all three polar question forms: the positive polar question (PPQ, ?q), the negative polar question (NPQ, ?¬q), and the alternative question (Alt, ?q∨¬q). But they are NOT interchangeable in use.

The Solution #

Question polarity choice depends on utility of answers:

| Question Type | Utility Condition |
|---------------|-------------------|
| PPQ (?q)      | UV(q) > UV(¬q)    |
| NPQ (?¬q)     | UV(¬q) > UV(q)    |
| Alt (?q∨¬q)   | UV(q) ≈ UV(¬q)    |

Two sources of utility:

  1. Goal-based: UV(q) > UV(¬q) iff P(g|q) > P(g|¬q)
  2. Informativity: UV(q) = inf(q) = -log P(q) (surprisal)

Applications #

The three types of polar questions (semantically equivalent, pragmatically distinct).


      A polar question with its associated proposition and type.

      • prop : W → Bool

        The positive proposition

      • The question type (positive, negative, or alternative)


        All polar questions have the same G&S denotation: {q, ¬q}.


          This is the key semantic equivalence that the pragmatic account explains.

          Utility value of learning proposition p is true.

          • Goal-based utility: UV(p) = P(g|p) - P(g)
          • Informativity: UV(p) = inf(p) = -log P(p)

          We use a general definition: improvement in expected utility after conditioning.
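As a hedged sketch of "improvement in expected utility after conditioning" (every name below is illustrative, not the module's actual API; `prior` is assumed to be a normalized weight function over `worlds`):

```lean
import Mathlib

variable {W : Type}

/-- Expected utility of `U` under `prior`, summed over `worlds` (sketch). -/
def expectedUtility (prior : W → ℚ) (worlds : List W) (U : W → ℚ) : ℚ :=
  worlds.foldl (fun acc w => acc + prior w * U w) 0

/-- Renormalize `prior` on the worlds where `p` holds; 0 if `p` has no mass. -/
def condition (prior : W → ℚ) (worlds : List W) (p : W → Bool) : W → ℚ :=
  let m := worlds.foldl (fun acc w => acc + if p w then prior w else 0) 0
  fun w => if m = 0 then 0 else if p w then prior w / m else 0

/-- UV(p): gain in expected utility from learning that `p` is true. -/
def utilityValueSketch (prior : W → ℚ) (worlds : List W)
    (U : W → ℚ) (p : W → Bool) : ℚ :=
  expectedUtility (condition prior worlds p) worlds U
    - expectedUtility prior worlds U
```

With a goal-indicator utility this reduces to P(g|p) - P(g); with a surprisal utility it reduces to inf(p).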


            Compare utility of positive vs negative answer.


              The Van Rooy & Šafářová criterion: choose question type based on answer utilities.

              • PPQ if UV(q) > UV(¬q)
              • NPQ if UV(¬q) > UV(q)
              • Alt if UV(q) ≈ UV(¬q)

                Threshold-based comparison (for approximate equality).


                  A decision problem where the agent has a single goal proposition.

                  U(w) = 1 if w ∈ g, 0 otherwise. Then EU(P, U) = P(g), and UV(q) = P(g|q) - P(g).
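Spelling out the step: with U the indicator function of g,

    EU(P, U) = Σ_w P(w) · U(w) = Σ_{w ∈ g} P(w) = P(g)

and after conditioning on q this expectation becomes P(g|q), so UV(q) = P(g|q) - P(g) as stated.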

                    def Semantics.Questions.Polarity.conditionalGoalProb {W : Type u_1} (goal : W → Bool) (prior : W → ℚ) (worlds : List W) (condition : W → Bool) : ℚ

                    For goal-based utility: UV(q) > UV(¬q) iff P(g|q) > P(g|¬q).

                      def Semantics.Questions.Polarity.goalProbAdvantage {W : Type u_1} (goal : W → Bool) (prior : W → ℚ) (worlds : List W) (p : W → Bool) : ℚ

                      Goal probability advantage: P(g|q) - P(g|¬q).

                        theorem Semantics.Questions.Polarity.request_forces_ppq {W : Type u_1} (p : W → Bool) (prior : W → ℚ) (worlds : List W) (_hNonempty : worlds.length > 0) :
                        goalProbAdvantage p prior worlds p ≥ 0

                        When goal = questioned proposition, PPQ is always optimal.

                        For requests like "Will you marry me?", g = q, so:

                        • P(g|q) = P(q|q) = 1
                        • P(g|¬q) = P(q|¬q) = 0

                        Thus UV(q) > UV(¬q) necessarily.

                        Proof: conditionalGoalProb p prior worlds (pnot p) = 0 because filtering by ¬p means all remaining worlds have p = false, so the goal p is never satisfied. And conditionalGoalProb p prior worlds p ≥ 0 because it equals either 0 (when totalProb = 0) or totalProb/totalProb = 1 (when totalProb ≠ 0).

                        def Semantics.Questions.Polarity.surprisal {W : Type u_1} (prior : W → ℚ) (worlds : List W) (p : W → Bool) : ℚ

                        Surprisal (negative log probability) of a proposition.

                        inf(q) = -log P(q)

                        Higher surprisal = lower probability = more informative if true.

                        We approximate with 1/prob - 1, which is monotonically decreasing in prob for all prob > 0 (like -log), equals 0 at prob = 1, and is positive for prob < 1. The prob = 0 guard handles ℚ's 1/0 = 0 convention.
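A minimal sketch of this proxy (the name `surprisalProxy` is illustrative, and the value returned at prob = 0 is an assumption; the module's theorems only use the proxy when prob > 0):

```lean
import Mathlib

/-- Illustrative sketch of the 1/prob - 1 surprisal proxy: monotonically
decreasing for prob > 0, zero at prob = 1, positive for 0 < prob < 1.
The guard avoids relying on ℚ's 1/0 = 0 convention at prob = 0. -/
def surprisalProxy (prob : ℚ) : ℚ :=
  if prob = 0 then 0 else 1 / prob - 1
```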

                          def Semantics.Questions.Polarity.informativenessAdvantage {W : Type u_1} (prior : W → ℚ) (worlds : List W) (p : W → Bool) : ℚ

                          For informativity: UV(q) > UV(¬q) iff P(q) < P(¬q).

                          Less likely propositions are more informative when confirmed.
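With the 1/P - 1 surprisal proxy above, the equivalence is a one-line computation:

    UV(q) - UV(¬q) = (1/P(q) - 1) - (1/P(¬q) - 1) = 1/P(q) - 1/P(¬q)

which is positive exactly when 0 < P(q) < P(¬q).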

                            def Semantics.Questions.Polarity.positiveIsLessLikely {W : Type u_1} (prior : W → ℚ) (worlds : List W) (p : W → Bool) : Bool

                            Givón's generalization: by default, positive propositions are less likely.

                            For most natural language statements q: P(q) < P(¬q) This explains why PPQs are the default form of polar questions.


                              Classification of polar question uses based on utility source.

                              • request : QuestionUse

                                Goal = questioned prop (requests, pleas)

                              • invitation : QuestionUse

                                Goal is facilitated by positive answer (invitations)

                              • grounding : QuestionUse

                                Checking surprising new information

                              • inference : QuestionUse

                                Drawing inferences from context

                              • rhetorical : QuestionUse

                                Speaker indicates believed answer (rhetorical)

                              • neutral : QuestionUse

                                Pure information seeking with no bias


                                  For requests, alternative questions are pragmatically degraded.

                                  "Will you marry me or not?" signals indifference to outcome, which is inconsistent with a genuine request.

                                  When is a negative polar question appropriate?

                                  NPQ (?¬q) requires UV(¬q) > UV(q), which can happen when:

                                  1. Goal is reached by ¬q being true (medical diagnosis, ecological quiz)
                                  2. Prior strongly favors q, so ¬q is more informative (tag questions)
                                    def Semantics.Questions.Polarity.medicalDiagnosisDP {W : Type u_1} (_symptom illness : W → Bool) (prior : W → ℚ) :

                                    Example: Medical diagnosis questions.

                                    "Is your child not eating?" is appropriate when:

                                    • Goal: diagnose illness
                                    • ¬(eating properly) is a symptom that helps diagnosis
                                    • Thus P(diagnosis|¬eating) > P(diagnosis|eating)
                                      def Semantics.Questions.Polarity.tagQuestionInformativity {W : Type u_1} (prior : W → ℚ) (worlds : List W) (_declarative tag : W → Bool) (_hDeclarativeIsNotTag : ∀ (w : W), _declarative w = !tag w) (hDeclarativeLikely : positiveIsLessLikely prior worlds tag = true) (hPosProb : List.foldl (fun (acc : ℚ) (w : W) => acc + if tag w = true then prior w else 0) 0 worlds > 0) :
                                      informativenessAdvantage prior worlds tag > 0

                                      For tag questions like "John isn't bad, is he?": The speaker takes the declarative as likely true, making the tag's positive prop (that John IS bad) low probability, hence informative.

                                      Requires positive probability for the tag proposition (so the 1/prob - 1 surprisal approximation is well-defined).

                                        def Semantics.Questions.Polarity.altAppropriate {W : Type u_1} {A : Type u_2} [Fintype W] [DecidableEq W] [DecidableEq A] (dp : Core.DecisionTheory.DecisionProblem W A) (actions : List A) (p : W → Bool) (threshold : ℚ) :

                                        Alternative questions are appropriate when utilities are balanced.

                                        UV(q) ≈ UV(¬q) signals:

                                        1. No preference for one answer over the other
                                        2. Genuine information seeking without bias
                                        3. Higher urgency (explicit enumeration of alternatives)
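A hedged sketch of the balance check behind this (the name `utilitiesBalanced` is illustrative; the explicit tolerance mirrors the `threshold` parameter above):

```lean
import Mathlib

/-- Illustrative balance check: UV(q) ≈ UV(¬q) up to a tolerance ε. -/
def utilitiesBalanced (uvPos uvNeg ε : ℚ) : Prop :=
  |uvPos - uvNeg| ≤ ε
```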

                                          Alternative questions can be impolite as invitations.

                                          "Do you want something to drink or not?" implies:

                                          • Speaker doesn't care about hearer's preference
                                          • Violates politeness by not encoding hearer's benefit

                                          Degrees of insistence in alternative questions:

                                          1. "Did you buy it or not?"
                                          2. "Did you buy it or didn't you?"
                                          3. "Did you buy it or didn't you buy it?"
                                          4. "Did you or did you not buy it?"

                                          These have increasing insistence while maintaining UV(q) ≈ UV(¬q).


                                              On Ladd's INPQ/ONPQ Distinction #

                                               @cite{ladd-1981} distinguished two readings of negative polar questions: inside-negation NPQs (INPQs) and outside-negation NPQs (ONPQs).

                                               Van Rooy & Šafářová argue this distinction is superfluous: both readings share the same semantic content and differ only in the pragmatic source of utility.

                                               The German examples can be explained by whether the negation can bear verum focus (INPQ = checking surprising negative info) or not (ONPQ = standard informativity-based NPQ).

                                              We don't distinguish INPQ from ONPQ semantically. The distinction is purely pragmatic (source of utility).

                                               • prop : W → Bool

                                                The proposition being negated

                                              • utilitySource : Bool

                                                Whether utility is goal-based or informativity-based


                                                Both INPQ and ONPQ have the same semantic content.


                                                  A rhetorical question is one where the speaker presupposes an answer but uses question form for pragmatic effect.

                                                  Key insight: Rhetorical questions MUST be polar, not alternative. "Are you crazy?" works rhetorically; "Are you crazy or not?" doesn't.

                                                   • prop : W → Bool

                                                    The questioned proposition

                                                  • presupposedPositive : Bool

                                                    The presupposed answer (true = positive, false = negative)

                                                   • beliefStrength : ℚ

                                                    Speaker's evidence/belief strength


                                                    Rhetorical effect requires polar form.

                                                    The speaker:

                                                    1. Has high prior for one answer (say ¬q)
                                                    2. Uses question form to highlight that recent evidence suggests q
                                                    3. Alternative form would remove this highlighting effect
                                                    def Semantics.Questions.Polarity.rhetoricalUsePPQ {W : Type u_1} (prior : W → ℚ) (worlds : List W) (p : W → Bool) (hPriorFavorsNegative : positiveIsLessLikely prior worlds p = true) (hPosProb : List.foldl (fun (acc : ℚ) (w : W) => acc + if p w = true then prior w else 0) 0 worlds > 0) :
                                                    informativenessAdvantage prior worlds p > 0

                                                    Why rhetorical questions use PPQ form even when expecting negative answer:

                                                    Speaker signals: "I have new evidence for q, even though I believed ¬q" This makes q surprising (high surprisal), thus high informativity. PPQ highlights this surprising-if-true proposition.

                                                    Requires P(p) > 0 so surprisal is well-defined.


                                                      A grounding question checks whether surprising new information should be accepted.

                                                      "Is David back?" after being told David returned (unexpectedly). "Is it raining?" after seeing someone with a wet jacket.

                                                      The speaker double-checks because:

                                                      • P(q) was very low in prior state
                                                      • New evidence suggests q might be true
                                                      • Accepting q would significantly revise beliefs
                                                      • prop : W → Bool

                                                        The proposition to be grounded

                                                      • priorProb : ℚ

                                                        Prior probability before new evidence

                                                      • posteriorProb : ℚ

                                                        Posterior probability after new evidence

                                                      • hIncreased : self.posteriorProb > self.priorProb

                                                        Evidence must have increased probability


                                                        Grounding questions prefer polar form to highlight the surprising proposition.

                                                        The utility of grounding: revision magnitude.

                                                        If accepting q causes large belief revision, double-checking has high utility.
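One way to sketch "revision magnitude" (an illustrative reading of the docstring above, not necessarily the module's actual definition):

```lean
import Mathlib

/-- Illustrative revision magnitude for a grounding question: how far
accepting q would move the agent's belief. Positive whenever the
structure's `hIncreased` field (posterior > prior) holds. -/
def revisionMagnitude (priorProb posteriorProb : ℚ) : ℚ :=
  posteriorProb - priorProb
```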
