UnImplicit: Understanding Implicit and Underspecified Language


Real language is underspecified, vague, and ambiguous. Indeed, past work (Zipf, 1949; Piantadosi, 2012) has suggested that ambiguity may be an inextricable feature of natural language, resulting from competing communicative pressures. Resolving the meaning of language is a never-ending process of making inferences based on implicit knowledge. For example, we know that "the girl saw the man with the telescope" is ambiguous and could refer to two situations, while "the girl saw the man with the hamburger" can plausibly describe only one.
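As a minimal sketch of this kind of structural ambiguity, the toy NLTK grammar below (the grammar itself is illustrative, not part of the workshop materials) licenses two parse trees for the telescope sentence, one per reading:

```python
import nltk

# Toy grammar in which a prepositional phrase can attach either to the
# verb phrase (instrument reading) or to the noun phrase (modifier reading).
grammar = nltk.CFG.fromstring("""
S   -> NP VP
NP  -> Det N | NP PP
VP  -> V NP | VP PP
PP  -> P NP
Det -> 'the'
N   -> 'girl' | 'man' | 'telescope' | 'hamburger'
V   -> 'saw'
P   -> 'with'
""")

parser = nltk.ChartParser(grammar)
sentence = "the girl saw the man with the telescope".split()

# Prints two trees: one where "with the telescope" modifies "saw"
# (instrument) and one where it modifies "the man" (attribute).
# Note that the hamburger variant also gets two parses under this grammar:
# it is world knowledge, not syntax, that rules out the instrument reading.
for tree in parser.parse(sentence):
    tree.pretty_print()
```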

While underspecified, ambiguous, and implicit language rarely poses a problem for human speakers, it can challenge even the best models. For example, despite recent major successes in NLP coming from large language models (LLMs), it is not clear that models capture ambiguous language in a human-like fashion (Liu, 2023; Stengel-Eskin, 2023). The same has been argued for multimodal NLP: Pezzelle (2023), for example, showed that CLIPScore is sensitive to underspecified captions. Tackling these kinds of linguistic phenomena therefore remains an open challenge for current models.
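As a rough illustration of what such sensitivity means in practice (a sketch, not a reproduction of Pezzelle, 2023), the snippet below scores a specific and an underspecified caption against the same image with a CLIPScore-style similarity, 2.5 * max(cos, 0) as in Hessel et al. (2021); the image file and caption wording are placeholders:

```python
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

# Placeholder image path and captions -- purely illustrative.
image = Image.open("girl_and_man_with_telescope.jpg")
captions = [
    "a girl looking at a man who is holding a telescope",  # specific
    "a girl looking at a man",                              # underspecified
]

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

inputs = processor(text=captions, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# CLIPScore-style score per caption: 2.5 * max(cosine similarity, 0).
img_emb = outputs.image_embeds / outputs.image_embeds.norm(dim=-1, keepdim=True)
txt_emb = outputs.text_embeds / outputs.text_embeds.norm(dim=-1, keepdim=True)
cos = (txt_emb @ img_emb.T).squeeze(-1)
clip_scores = 2.5 * cos.clamp(min=0)

for caption, score in zip(captions, clip_scores.tolist()):
    print(f"{score:.3f}  {caption}")
```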

In order to resolve the meaning of underspecified and ambiguous language, we often employ additional modalities and information acquired through embodied experience (Bisk, 2020). For example, "the girl saw the man with the telescope" becomes unambiguous if paired with an image of a man holding a telescope. In contrast, NLP typically considers language in isolation, removed from the context in which it naturally occurs. This workshop will highlight multimodal inputs, especially visual ones, as sources of information for resolving underspecified and ambiguous language.
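A minimal sketch of that grounding idea, under the same assumptions as above (CLIP as the image-text model, a placeholder image file, hand-written paraphrases of the two readings): spell both readings out explicitly and let the image decide between them.

```python
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

# The two readings of the ambiguous sentence, spelled out explicitly.
readings = [
    "a girl looking through a telescope at a man",    # instrument reading
    "a girl looking at a man who holds a telescope",  # modifier reading
]

# Placeholder image showing a man holding a telescope.
image = Image.open("man_holding_telescope.jpg")

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

inputs = processor(text=readings, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    probs = model(**inputs).logits_per_image.softmax(dim=-1).squeeze(0)

# With an image of a man holding the telescope, most of the probability
# mass should fall on the modifier reading.
for reading, p in zip(readings, probs.tolist()):
    print(f"{p:.2f}  {reading}")
```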
