Events

How do genes and experience interact to produce human cognition? I will discuss insights into this puzzle from studies of blindness. The first half of the talk will focus on how first-person sensory experience contributes to concepts. What do congenitally blind people know about seeing and light? One source of evidence comes from studies of “visual” verbs. Congenitally blind and sighted people made semantic similarity judgments on pairs of visual verbs (e.g. to glimpse) and non-visual verbs (e.g. to touch). We found that blind adults distinguish seeing from perception through other sensory modalities (e.g. to touch) and from amodal knowledge acquisition (e.g. to notice). Like sighted individuals, they make fine-grained spatiotemporal distinctions among verbs of seeing (e.g. to peek vs. to stare). Blind adults also distinguish among verbs of light emission along dimensions of intensity (glow vs. blaze) and temporal continuity (blaze vs. flash). This knowledge about seeing is not limited to the meanings of words: blind people make inferences about how others feel based on visual experience, and these inferences depend on the same neural mechanisms as in sighted individuals. Together, these data suggest that first-person sensory experience is not required to develop rich conceptual representations.

The second half of the talk will focus on effects of experience on the neurobiology of language. Language processing typically relies on fronto-temporal cortices. I argue that “visual” areas of the occipital cortex are added to the language system in congenitally blind individuals. This language-related plasticity occurs during development: it is observed in congenitally blind, but not late-blind, adults and emerges in blind children by 4 years of age. These findings suggest that brain regions that did not evolve for language can nevertheless acquire language-processing capacities. Together, these studies suggest that during development, brain regions acquire cognitive functions through a constrained process of self-organization.