Here we present a protocol for familiarization-test paradigms that provide a direct test of infant categorization and help define the role of language in early category learning.
Assessing infant category learning is a challenging but vital aspect of studying infant cognition. By employing a familiarization-test paradigm, we straightforwardly measure infants’ success in learning a novel category while relying only on their looking behavior. Moreover, the paradigm can directly measure the impact of different auditory signals on infant categorization across a range of ages. For instance, we assessed how 2-year-olds learn categories in a variety of labeling environments: in our task, 2-year-olds successfully learned categories when all exemplars were labeled or when the first two exemplars were labeled, but they failed to categorize when no exemplars were labeled or when only the final two exemplars were labeled. To determine infants’ success in such tasks, researchers can examine both the overall preference displayed by infants in each condition and infants’ pattern of looking over the course of the test phase, using an eye-tracker to provide fine-grained time-course data. Thus, we present a powerful paradigm for identifying the role of language, or any auditory signal, in infants’ object category learning.
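For illustration, the sketch below shows one way such an overall preference score might be computed in R. It assumes looking times have already been summed per test trial into a data frame called looks with columns id, condition, looking_novel, and looking_familiar; these names are hypothetical and do not necessarily match the variables in our scripts.

```r
# Hypothetical sketch: compute each infant's novelty preference score,
# i.e., the proportion of test-phase looking directed to the novel image.
# Column names (id, condition, looking_novel, looking_familiar) are illustrative.
library(dplyr)

preference <- looks %>%
  group_by(id, condition) %>%
  summarise(
    novelty_pref = sum(looking_novel) /
      (sum(looking_novel) + sum(looking_familiar)),
    .groups = "drop"
  )

# Compare mean preference in each condition against chance (0.50)
preference %>%
  group_by(condition) %>%
  summarise(
    mean_pref   = mean(novelty_pref),
    p_vs_chance = t.test(novelty_pref, mu = 0.5)$p.value,
    .groups = "drop"
  )
```

A preference reliably above 0.50 is the conventional signature of successful categorization in this paradigm, since infants who have formed the category treat the familiar-category test image as old and look longer at the novel one.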
Categorization is a fundamental building block of human cognition: infants’ categorization abilities emerge early in infancy and become increasingly sophisticated with age.1,2,3 Research has also revealed a powerful role for language in infant categorization: from 3 months of age, infants learn categories more successfully when category exemplars are paired with language.4,5,6 Moreover, by the end of the first year, infants are attuned to the role of count noun labels in c....
All methods described here have been approved by the Northwestern University Institutional Review Board.
1. Stimuli Creation
NOTE: The visual stimuli (see Figure 1) used in the representative design reported below were originally developed in Havy and Waxman (2016)18 and are available for download at https://osf.io/n6uy8/.
Using the protocol above, we ran two experiments22. Analyses were conducted with the eyetrackingR package23, and the data and code are available at https://github.com/sandylat/ssl-in-infancy. In the first experiment, we contrasted a fully supervised condition (n = 24, mean age = 26.8 months), featuring only labeled exemplars, with an unsupervised condition (n = 24, mean age = 26.9 months), featuring only unlabeled ex.......
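As a rough illustration of this workflow, the R sketch below prepares gaze data with eyetrackingR and derives both an overall preference measure and a binned time course. The raw data frame raw_gaze and its column names are assumptions for the example and do not necessarily match the variables in the repository linked above.

```r
# Minimal eyetrackingR sketch, assuming a tidy gaze data frame `raw_gaze`
# with one row per sample and logical AOI columns for the two test images.
library(eyetrackingR)

data <- make_eyetrackingr_data(
  raw_gaze,
  participant_column = "participant",   # hypothetical column names
  trial_column       = "trial",
  time_column        = "time_ms",
  trackloss_column   = "trackloss",
  aoi_columns        = c("Novel", "Familiar"),
  treat_non_aoi_looks_as_missing = TRUE
)

# Overall preference: proportion of test-phase looking to the novel image,
# summarized per participant and compared across conditions
window <- make_time_window_data(
  data,
  aois              = "Novel",
  predictor_columns = "condition",
  summarize_by      = "participant"
)
t.test(Prop ~ condition, data = window)

# Time course: proportion of looking to the novel image in sequential bins
sequence <- make_time_sequence_data(
  data,
  time_bin_size     = 250,
  aois              = "Novel",
  predictor_columns = "condition",
  summarize_by      = "participant"
)
plot(sequence, predictor_column = "condition")
```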
Here, we present a procedure for evaluating the role of labeling in categorization. By presenting 2-year-olds with a realistic mix of labeled and unlabeled exemplars, we demonstrate that very young children are capable of learning in semi-supervised environments, extending work with adults and older children24,25. Thus, this method offers a resolution to the paradox posed above: if even a few labeled exemplars can spark category learning, then labels can be both .......
The research reported here was supported by the National Institute of Child Health and Human Development of the National Institutes of Health under award number R01HD083310 and a National Science Foundation Graduate Research Fellowship under grant no. DGE‐1324585. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health or the National Science Foundation.
Name | Company | Catalog Number | Comments |
Final Cut Pro X | Apple | N/A | Video editing, composition software |
MorphX | Norrkross | N/A | Image-morphing software |
PhotoShop | Adobe | N/A | Image-editing software |
R | R Core Team | N/A | Statistical analysis software |
T60XL Eyetracker | Tobii Pro | Discontinued | Large, arm-mounted eyetracker suitable for work with infants and children |
Tobii Pro Studio | Tobii Pro | N/A | Software directing eyetracker display, data collection |