Bencie Woll, born in New York City, New York, USA, in 1950. Ph.D. from the University of Bristol. Professor of Sign Language and Deaf Studies, and Director of the Deafness, Cognition and Language Research Centre at University College London.
Fellow (1 September 2005 – 31 January 2006)
ECHO PHONOLOGY
The specific study planned for NIAS as part of the “Windows on Language Genesis” group developed earlier research on ‘echo phonology’, the repertoire of mouth actions which form an obligatory accompaniment to some manual signs in a range of sign languages, and which are characterised by ‘echoing’ in oral articulatory terms certain properties of the manual actions (e.g. opening or closing of the hand, or movement through space). The aim was to undertake a series of studies arising from very different perspectives – linguistics and cognitive neuroscience – in order to explore whether echo phonology could provide insight into a possible route for the evolution of human language. In addition, a case study of a deaf linguistic isolate, who was not exposed to a first language until late in life, explored the concept of a ‘critical period’ in childhood during which the brain can construct language from input in the environment.
Echo phonology in narratives was compared across three European sign languages and was found to appear in all three. Echo phonology thus appears to be a robust phenomenon, found in different languages at a similar frequency and with a similar core phonological structure.

The outcomes of the fMRI studies indicate that for deaf subjects, the left-hemisphere activations found when processing echo phonology strongly resemble those for processing silent speech (lipreading). In other words, for this population, echo phonology activates the same areas of the brain used for processing phonology in spoken language. In contrast, for hearing subjects, echo phonology processing does not activate those areas. Instead, there is extensive bilateral activation, which can include frontal regions in pre-motor sites; these sites are also activated when processing spoken language. This indicates that echo phonology occupies an intermediate position between signing and speech in terms of brain processing, which in turn supports the notion that echo phonology may reflect a mechanism associated with language evolution.

The communication of the linguistic isolate revealed, as well as difficulties with syntax and semantics (including problems with semantic and syntactic role, time reference, and negation), substantial difficulties with sign language phonology: specifically, difficulty in comprehending and producing phonological contrasts. The various studies undertaken thus confirm that the establishment of phonology is crucially connected with the comprehension and production of a linguistic system, and suggest that exposure to and mastery of phonology must take place within a critical developmental window.