Peer-Reviewed Journal Details
Mandatory Fields
Dymarska, A; Connell, L; Banks, B
2022
Unknown
Collabra: Psychology
Linguistic Bootstrapping Allows More Real-world Object Concepts to Be Held in Mind
Published
Optional Fields
8
1
40171
The linguistic-simulation approach to cognition predicts that language can enable more efficient conceptual processing than purely sensorimotor-affective simulations of concepts. We tested the implications of this approach in memory for sequences of real-world objects, where the use of linguistic labels (i.e., words and phrases) could enable more efficient representation of object concepts than representation via full sensorimotor simulation; a proposal called linguistic bootstrapping. In three pre-registered experiments using a nonverbal paradigm, we asked participants to remember sequences of contextually-situated, real-world objects (e.g., the ingredients for a recipe), and later asked them to select the correct objects from arrays of distractors. Critically, we used articulatory suppression to selectively suppress implicit activation of linguistic labels, which we predicted would impair performance by reducing the number of objects that could be held in mind simultaneously. We found that suppressing access to language when learning the sequences impaired the accuracy of object recognition, though not latency, and that this impairment was not simply due to dual-task load. Results show that a sequence of up to 10 contextually-situated object concepts can be held in mind when language is inhibited, but this increases to 12 objects when language is available. The findings support the linguistic bootstrapping hypothesis that representing familiar object concepts normally relies on language, and that implicitly-retrieved object labels, used as linguistic placeholders, can increase the number of objects that can be simultaneously represented beyond what sensorimotor information alone can accomplish.
https://doi.org/10.1525/collabra.40171
Grant Details