
Dataset Information


What Homophones Say about Words.


ABSTRACT: The number of potential meanings for a new word is astronomical. To make the word-learning problem tractable, one must restrict the hypothesis space. To do so, current word learning accounts often build constraints about cognition or about the mature lexicon directly into the learning device. We are concerned with the convexity constraint, which holds that concepts (privileged sets of entities that we think of as "coherent") do not have gaps (if A and B belong to a concept, so does any entity "between" A and B). To derive a linguistic constraint from it, learning algorithms have percolated this constraint from concepts to word forms: some algorithms rely on the assumption that word forms are associated with convex sets of objects. Yet this does not have to be the case: homophones are word forms associated with two separate words and meanings. Two sets of experiments show that when evidence suggests that a novel label is associated with a disjoint (non-convex) set of objects, either a) because there is a gap in conceptual space between the learning exemplars for a given word or b) because other lexical items intervene in that gap, adults prefer to postulate homophony, where a single word form is associated with two separate words and meanings, rather than inferring that the word has a disjunctive, discontinuous meaning. These results about homophony must be integrated into current word learning algorithms. We conclude by arguing for a weaker specialization of word learning algorithms, which too often risk missing important constraints by focusing on a restricted empirical basis (e.g., non-homophonous content words).
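Illustrative sketch (not from the paper): one minimal way to read the convexity constraint and the gap-based evidence described in the abstract, assuming a toy one-dimensional conceptual space and hypothetical helper names (`has_gap`, `gap_threshold`). A gap between a label's exemplars, especially one occupied by another word's exemplars, is the kind of evidence that could push a learner toward a homophony analysis rather than a single discontinuous meaning.

```python
# Toy sketch only: it does not implement the paper's experiments or any
# published word-learning algorithm. Entities are points on a 1-D axis.

def has_gap(exemplars, other_items, gap_threshold=1.0):
    """Return True if the exemplars for a label are split by a region that is
    either wider than gap_threshold or occupied by other lexical items.
    Such a non-convex (disjoint) set is the cue for positing homophony."""
    xs = sorted(exemplars)
    for a, b in zip(xs, xs[1:]):
        occupied = any(a < o < b for o in other_items)  # another word sits in between
        if occupied or (b - a) > gap_threshold:
            return True
    return False

# Exemplars of a novel label cluster at 0-1 and 4-5, with a competitor word
# attested in between (~2.5): the set is non-convex, so a learner might
# prefer two homophonous words over one discontinuous meaning.
novel_label = [0.2, 0.8, 4.1, 4.9]
competitor = [2.4, 2.6]
print(has_gap(novel_label, competitor))  # True
```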

SUBMITTER: Dautriche I 

PROVIDER: S-EPMC5008697 | biostudies-literature | 2016

REPOSITORIES: biostudies-literature


Publications

What Homophones Say about Words.

Dautriche, Isabelle; Chemla, Emmanuel

PLoS ONE, 2016-09-01, issue 9


Similar Datasets

| S-EPMC7534309 | biostudies-literature
| PRJNA603992 | ENA
| S-EPMC5367780 | biostudies-literature
| S-EPMC6647162 | biostudies-literature
| S-EPMC6634368 | biostudies-literature
| S-EPMC5448397 | biostudies-literature
| S-EPMC6695592 | biostudies-literature
| S-EPMC7056507 | biostudies-literature
| S-EPMC5974396 | biostudies-literature
| S-EPMC3859505 | biostudies-literature