The Language of Content

Articles from the Speak Agent team that provide classroom resources, implementation strategies, research-based perspectives, and updates regarding our Content+LanguageSM platform.

Visual language can be processed like written and spoken language


August 27, 2015 | Ben Grimley


A team of researchers from Dalhousie University, Georgetown University, the University of Geneva, and the University of Rochester today published the results of the first study to directly compare the brain systems engaged by sign language with those used for nonlinguistic gestures. Congenitally deaf signers processed both signs and gestures in the left-hemisphere language centers, whereas non-signers showed activation only in areas attuned to human movement. This strongly indicates that people can gain experience with a visual language and process it in the same way as spoken language. “Sign language uses space (as opposed to sound or writing), and it looks like a gesture but behaves like a language,” Dr. Aaron Newman told the Chronicle Herald. “Over the years, signers and the deaf community have had to fight to have their language recognized as a real language.”

Visual language learning and symbols can support academic language development, especially vocabulary acquisition. This not only helps English language learners (ELLs) and persons with communication disabilities, but also helps any academic language learner. After all, academic language is nobody's primary language! This study is promising because it suggests that visual/symbol systems can be processed in the same regions of the brain as text and speech. Children are particularly adept at learning new language systems, which gives them the opportunity to adapt rules-based symbol sets to other forms of expression.

The study findings were published in the Proceedings of the National Academy of Sciences.

Written by Ben Grimley

Ben is CEO and Co-Founder of Speak Agent, Inc.