Journal of Speech, Language, and Hearing Research | Research Article | 1 Jan 2017

Visual Context Enhanced: The Joint Contribution of Iconic Gestures and Visible Speech to Degraded Speech Comprehension

    Purpose

    This study investigated whether, and to what extent, iconic co-speech gestures add to the information provided by visible speech in enhancing degraded speech comprehension at different levels of noise-vocoding. Previous studies have examined the contributions of these 2 visual articulators to speech comprehension only separately.

    Method

    Twenty participants watched videos of an actress uttering an action verb and completed a free-recall task. The videos were presented in 3 speech conditions (2-band noise-vocoding, 6-band noise-vocoding, clear), 3 multimodal conditions (speech + lips blurred, speech + visible speech, speech + visible speech + gesture), and 2 visual-only conditions (visible speech, visible speech + gesture).
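
    Noise-vocoding degrades speech by discarding spectral detail while preserving the slow amplitude envelope in a small number of frequency bands; fewer bands means stronger degradation. The sketch below illustrates the general technique (introduced by Shannon et al., 1995), not the study's exact stimulus pipeline: the band edges, filter orders, and envelope cutoff are illustrative assumptions, and it presumes NumPy/SciPy and a sampling rate comfortably above 16 kHz.

        import numpy as np
        from scipy.signal import butter, sosfiltfilt

        def noise_vocode(speech, fs, n_bands):
            """Keep only the per-band amplitude envelope of `speech`,
            carried on band-limited noise (illustrative parameters)."""
            edges = np.geomspace(100, 8000, n_bands + 1)  # log-spaced band edges (assumption)
            carrier = np.random.default_rng(0).standard_normal(len(speech))  # white-noise carrier
            env_lp = butter(2, 30, btype="low", fs=fs, output="sos")  # 30-Hz envelope smoother
            out = np.zeros(len(speech))
            for lo, hi in zip(edges[:-1], edges[1:]):
                band_sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
                envelope = sosfiltfilt(env_lp, np.abs(sosfiltfilt(band_sos, speech)))
                out += envelope * sosfiltfilt(band_sos, carrier)  # noise modulated by speech envelope
            return out / np.max(np.abs(out))

        # noise_vocode(x, 44100, 2) leaves speech barely intelligible;
        # noise_vocode(x, 44100, 6) preserves enough envelope structure
        # for partial comprehension, matching the 2-band vs. 6-band contrast.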

    Results

    Accuracy levels were higher when both visual articulators were present than when only 1 or none was present. The enhancement effects of (a) visible speech, (b) gestural information on top of visible speech, and (c) visible speech and iconic gestures combined were larger in 6-band noise-vocoding than in 2-band noise-vocoding or in the visual-only conditions. Gestural enhancement in 2-band noise-vocoding did not differ from gestural enhancement in the visual-only conditions.
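
    The abstract does not spell out how the 3 enhancement effects are computed; a natural reading, shown below with toy proportion-correct values (the condition keys and numbers are assumptions for illustration, not the study's data), is that each effect is a difference in recall accuracy between two of the multimodal conditions.

        # Hypothetical accuracy scores for the three multimodal conditions
        # (illustrative values only, chosen to make the arithmetic concrete).
        acc = {
            "speech_lips_blurred": 0.20,
            "speech_visible_speech": 0.45,
            "speech_visible_speech_gesture": 0.70,
        }

        visible_speech_enhancement = acc["speech_visible_speech"] - acc["speech_lips_blurred"]      # (a)
        gestural_enhancement = acc["speech_visible_speech_gesture"] - acc["speech_visible_speech"]  # (b)
        joint_enhancement = acc["speech_visible_speech_gesture"] - acc["speech_lips_blurred"]       # (c)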

    Conclusions

    When perceiving degraded speech in a visual context, listeners benefit more from having both visual articulators present than from having only 1. This benefit was larger at 6-band noise-vocoding, where listeners can draw on both phonological cues from visible speech and semantic cues from iconic gestures to disambiguate speech, than at 2-band noise-vocoding.
