TITLE

Auditory Speech Recognition and Visual Text Recognition in Younger and Older Adults: Similarities and Differences Between Modalities and the Effects of Presentation Rate

AUTHOR(S)
Humes, Larry E.; Burk, Matthew H.; Coughlin, Maureen P.; Busey, Thomas A.; Strauser, Lauren E.
PUB. DATE
April 2007
SOURCE
Journal of Speech, Language & Hearing Research;Apr2007, Vol. 50 Issue 2, p283
SOURCE TYPE
Academic Journal
DOC. TYPE
Article
ABSTRACT
Purpose: To examine age-related differences in auditory speech recognition and visual text recognition performance for parallel sets of stimulus materials in the auditory and visual modalities. In addition, the effects of variation in the rate of presentation of stimuli in each modality were investigated in each age group.

Method: A mixed-model design was used in which 3 independent groups (13 young adults with normal hearing, 10 elderly adults with normal hearing, and 16 elderly hearing-impaired adults) listened to auditory speech tests (a sentence-in-noise task, time-compressed monosyllables, and a speeded-spelling task) and viewed visual text-based analogs of the auditory tests. All auditory speech materials were presented so that the amplitude of the speech signal was at least 15 dB above threshold through 4000 Hz.

Results: Analyses of the group data revealed that, when baseline levels of performance were used as covariates, the only significant group difference was that both elderly groups performed worse than the young group on the auditory speeded-speech tasks. Analysis of individual data, using correlations, factor analysis, and linear regression, was generally consistent with the group data and revealed significant, moderate correlations of performance for similar tasks across modalities, but stronger correlations across tasks within a modality. This suggests that performance on these tasks was mediated both by a common underlying factor, such as cognitive processing, and by modality-specific processing.

Conclusion: Performance on the measures of auditory processing of speech examined here was closely associated with performance on parallel measures of the visual processing of text obtained from the same participants. Young and older adults demonstrated comparable abilities in the use of contextual information in each modality, but older adults, regardless of hearing status, had more difficulty with fast presentation of auditory speech stimuli than young adults. There were no differences among the 3 groups with regard to the effects of presentation rate for the visual recognition of text, at least for the rates of presentation used here.
ACCESSION #
24729796
Related Articles

  • Detection and Recognition of Stop Consonants by Normal-Hearing and Hearing-Impaired Listeners. Turner, Christopher W.; Fabry, David A.; Barrett, Stephanie; Horwitz, Amy R. // Journal of Speech & Hearing Research;Aug92, Vol. 35 Issue 4, p942 

    Presents a study which examined the possibility that hearing-impaired listeners require a larger signal-to-noise ratio for the detection of speech sounds. Method; Results and discussion.

  • PERIODICAL ARTICLES AND PAMPHLETS: AUDITORY IMPAIRMENTS.  // Exceptional Children;Nov1957, Vol. 24 Issue 3, p140 

    Several articles and pamphlets are presented including "The Temporary Enrollment of the Hard of Hearing Children in Educational Programs for the Deaf," by Francis X. Blair, "Sex Education of Deaf Children," by Joyce Whittier Chaplin, "Let's Practice Lipreading: Practice Material for Work With...

  • Relationship between Laboratory Measures of Directional Advantage and Everyday Success with Directional Microphone Hearing Aids. Cord, Mary T.; Surr, Rauna K.; Walden, Brian E.; Dyrlund, Ole // Journal of the American Academy of Audiology;May2004, Vol. 15 Issue 5, p353 

    The improvement in speech recognition in noise obtained with directional microphones compared to omnidirectional microphones is referred to as the directional advantage. Laboratory studies have revealed substantial differences in the magnitude of the directional advantage across...

  • Contributions of Oral and Extraoral Facial Movement to Visual and Audiovisual Speech Perception. Thomas, Sharon M.; Jordan, Timothy R. // Journal of Experimental Psychology. Human Perception & Performan;Oct2004, Vol. 30 Issue 5, p873 

    Seeing a talker's face influences auditory speech recognition, but the visible input essential for this influence has yet to be established. Using a new seamless editing technique, the authors examined effects of restricting visible movement to oral or extraoral areas of a talking face. In...

  • Alignment to visual speech information. Miller, Rachel; Sanchez, Kauyumari; Rosenblum, Lawrence // Attention, Perception & Psychophysics;Aug2010, Vol. 72 Issue 6, p1614 

    Speech alignment is the tendency for interlocutors to unconsciously imitate one another's speaking style. Alignment also occurs when a talker is asked to shadow recorded words (e.g., Shockley, Sabadini, & Fowler, 2004). In two experiments, we examined whether alignment could be induced with...

  • Multisensory speech perception of young children with profound hearing loss. Kishon-Rabin, Liat; Haras, Nava // Journal of Speech, Language & Hearing Research;Oct1997, Vol. 40 Issue 5, p1135 

    Presents a study which evaluated the contribution of a two-channel vibrotactile aid to the auditory and visual perception of speech in young deaf children. Use of words and speech pattern contrasts; Intensive, hierarchical and systemic training program; Word recognition; Detection of consonant,...

  • Short-Term Visual Deprivation Improves the Perception of Harmonicity. Landry, Simon P.; Shiller, Douglas M.; Champoux, François // Journal of Experimental Psychology. Human Perception & Performan;Dec2013, Vol. 39 Issue 6, p1503 

    Neuroimaging studies have shown that the perception of auditory stimuli involves occipital cortical regions traditionally associated with visual processing, even in the absence of any overt visual component to the task. Analogous behavioral evidence of an interaction between visual and auditory...

  • Language identification from visual-only speech signals. Ronquest, Rebecca; Levi, Susannah; Pisoni, David // Attention, Perception & Psychophysics;Aug2010, Vol. 72 Issue 6, p1601 

    Our goal in the present study was to examine how observers identify English and Spanish from visual-only displays of speech. First, we replicated the recent findings of Soto-Faraco et al. (2007) with Spanish and English bilingual and monolingual observers using different languages and a...

  • A comparison of the McGurk effect for spoken and sung syllables. Quinto, Lena; Forde Thompson, William; Russo, Frank; Trehub, Sandra // Attention, Perception & Psychophysics;Aug2010, Vol. 72 Issue 6, p1450 

    The importance of visual cues in speech perception is illustrated by the McGurk effect, whereby a speaker's facial movements affect speech perception. The goal of the present study was to evaluate whether the McGurk effect is also observed for sung syllables. Participants heard and saw sung...
