A new study in the journal Cognition, by Judith Kroll and her team at Penn State, suggests that deaf individuals fluent in American Sign Language (ASL) with English as their second language read words and internally visualize the corresponding signs at the same time. Like hearing bilinguals, ASL users engage their knowledge of both languages simultaneously; for deaf bilinguals, that means they aren't merely thinking of two words — they're thinking of a word and a mental representation of positioned human hands.
Deaf participants were asked to decide, as quickly as they could, whether two English words were related or unrelated in meaning (e.g., “bird” and “duck” would be related; “movie” and “paper” would be unrelated). When the related words also had ASL signs that were similar to each other in hand shape and motion, participants responded significantly faster on the task.
In other words, physical similarity between two signs speeds up processing of the corresponding English words. There appears to be a kind of “vocabulary of the hands” in ASL users, in which two similar hand shapes function much like two similar-looking words, such as “drive” and “dive” or “sun” and “sunk.”
These data further support the idea that ASL is cognitively very similar to spoken language, likely recruits some of the same neural pathways, and is learned in the same way. The latter idea is strongly supported by the touching history of Nicaraguan Sign Language, which emerged spontaneously among a community of deaf schoolchildren in Nicaragua beginning in the 1970s.