TITLE

Auditory and auditory-visual perception of clear and conversational speech

AUTHOR(S)
Helfer, Karen S.
PUB. DATE
April 1997
SOURCE
Journal of Speech, Language & Hearing Research;Apr1997, Vol. 40 Issue 2, p432
SOURCE TYPE
Academic Journal
DOC. TYPE
Article
ABSTRACT
Examines how speaking mode and presentation mode influence auditory-visual perception of words within nonsense sentences. Clear speech benefited words occurring in the middle of sentences more than words at either the beginning or end of sentences in both presentation modes; results suggest that speaking clearly and providing visual speech information supply complementary information.
ACCESSION #
9710295758


Related Articles

  • Effects of Horizontal Viewing Angle on Visual and Audiovisual Speech Recognition. Jordan, Timothy R.; Thomas, Sharon M. // Journal of Experimental Psychology. Human Perception & Performan;Dec2001, Vol. 27 Issue 6, p1386 

    Investigates the effects of changes in horizontal viewing angle on visual and audiovisual speech recognition. Unimodal visual speech: Visual speech; Congruent and incongruent auditory speech.

  • Effects of Two Repair Strategies on Speechreading of Words and Sentences. Marzolf, Cari A.; Stewart, Michael; Nerbonne, Michael A.; Lehman, Mark E. // Journal of the American Academy of Audiology;Jun1998, Vol. 9 Issue 3 

    The effectiveness of two repair strategies, repetition and paraphrasing, in enhancing speechreading performance was evaluated on 20 young adults with normal hearing using both word and sentence test stimuli. Both strategy types produced significant improvement, and the repeat strategy yielded...

  • Speech intelligibility reduces over distance from an attended location: Evidence for an auditory spatial gradient of attention. Allen, Kachina; Alais, David; Carlile, Simon // Attention, Perception & Psychophysics;Jan2009, Vol. 71 Issue 1, p164 

    Speech reception thresholds (SRTs) were measured at a central focus of attention and at 20°, 40°, and 60° locations distant in azimuth. Measurements were taken with one target collocated with two maskers, or with maskers flanking the target by ±20°. For 80% of trials,...

  • Realtime Lip Contour Tracking For Audio-Visual Speech Recognition Applications. Yazdi, Mehran; Seyfi, Mehdi; Rafati, Amirhossein; Asadi, Meghdad // International Journal of Biological & Medical Sciences;2009, Vol. 4 Issue 4, p190 

    Detection and tracking of the lip contour is an important issue in speechreading. While there are solutions for lip tracking once a good contour initialization in the first frame is available, the problem of finding such a good initialization is not yet solved automatically, but done manually....

  • Hearing lips in a second language: visual articulatory information enables the perception of second language sounds. Navarra, Jordi; Soto-Faraco, Salvador // Psychological Research;Jan2007, Vol. 71 Issue 1, p4 

    We investigated the effects of visual speech information (articulatory gestures) on the perception of second language (L2) sounds. Previous studies have demonstrated that listeners often fail to hear the difference between certain non-native phonemic contrasts, such as in the case of Spanish...

  • Effect of Visual Speech in Sign Speech Synthesis. Krňoul, Zdeněk // World Academy of Science, Engineering & Technology;Jul2009, Issue 31, p5 

    This article investigates a contribution of synthesized visual speech. Synthesis of visual speech expressed by a computer consists in an animation in particular movements of lips. Visual speech is also necessary part of the non-manual component of a sign language. Appropriate methodology is...

  • The McGurk Effect: The Difficulty of Separating Sound from Sight.  // Hearing Review;Oct2013, Vol. 20 Issue 11, p48 

    The article discusses a study which pinpointed the source of the McGurk effect, a phenomenon that demonstrates an interaction between hearing and vision in speech perception. The study, published in the journal "PLOS ONE," located the source of McGurk effect by recording and analyzing brain...

  • AUDITORY AND VISUAL CUEING OF THE [± ROUNDED] FEATURE OF VOWELS. Lisker, Leigh; Rossi, Mario // Language & Speech;Oct-Dec92, Vol. 35 Issue 4, p391 

    Discusses the production of some isolated vowels in ten random orders by a seasoned phonetician. Recording of the acoustic signals and frontal views; Lipreading effect; Audiovisual speech perception.

  • Does Face Inversion Change Spatial Frequency Tuning? Willenbockel, Verena; Fiset, Daniel; Chauvin, Alan; Blais, Caroline; Arguin, Martin; Tanaka, James W.; Bub, Daniel N.; Gosselin, Frédéric // Journal of Experimental Psychology. Human Perception & Performan;Feb2010, Vol. 36 Issue 1, p122 

    The authors examined spatial frequency (SF) tuning of upright and inverted face identification using an SF variant of the Bubbles technique (F. Gosselin & P. G. Schyns, 2001). In Experiment 1, they validated the SF Bubbles technique in a plaid detection task. In Experiments 2a-c, the SFs used...
