INSTRUCTOR'S MANUAL for Sensation and Perception, 10th Edition, by Bruce Goldstein and James Brockmole



CHAPTER 1: INTRODUCTION TO PERCEPTION

Chapter Outline I.

Introduction A. Some Questions We Will Consider 1. Hypothetical “Science Project”: Design a Sensory Device

II.

Why Read This Book?

III.

The Perceptual Process A. But What About “Sensation”? B. Distal and Proximal Stimuli (Steps 1 and 2) 1. Distal stimulus 2. Principle of transformation 3. Proximal stimulus 4. Principle of representation C. Receptor Processes (Step 3) 1. Sensory receptors a. Example: visual pigment 2. Transduction D. Neural Processing (Step 4) 1. Transmission 2. Change/Processing Signal 3. Primary receiving area 4. Cerebral Cortex a. Occipital lobe b. Temporal lobe c. Parietal lobe d. Frontal lobe E. Behavioral Responses (Steps 5-7) 1. Perception 2. Recognition a. Problems of recognition: e.g., visual form agnosia 3. Action F. Knowledge 1. Demonstration: perceiving a picture a. The “Rat-Man” 2. Categorize 3. Bottom-up (data-based) processing 4. Top-down (knowledge-based) processing

IV.

Studying the Perceptual Process A. The Two “Stimulus” Relationships (A and B) 1. Stimulus-perception relationship

© 2017 Cengage Learning. All Rights Reserved. May not be scanned, copied or duplicated, or posted to a publicly accessible website, in whole or in part.


2. Stimulus-physiology relationship B. The Physiology-Perception relationship C. Cognitive Influences on Perception V.

“Test Yourself 1.1”

VI.

Measuring Perception A. Gustav Fechner Introduces Methods to Measure Thresholds 1. Classical Psychophysical Methods: a. Method of Limits i. Absolute Threshold ii. Difference Threshold 2. Five Questions About the Perceptual World a. Question 1: What is the Perceptual Magnitude of a Stimulus? Technique: Magnitude Estimation b. Method: Magnitude Estimation c. Question 2: What is the Identity of a Stimulus? Technique: Recognition Testing d. Question 3: How Quickly Can I React to It? Technique: Reaction Time e. Question 4: How Can I Describe What Is Out There? Technique: Phenomenological Report f. Question 5: How Can I Interact With It? Technique: Physical Tasks and Judgments

VII.

Something to Consider: Why is the Difference between Physical and Perceptual Important?

VIII.

“Test Yourself 1.2”

IX.

Think About It

X.

Key Terms

Learning Objectives At the end of the chapter, the student should be able to: 1. State and explain each step of the perceptual process. 2. Differentiate between “top-down” and “bottom-up” processing. 3. Describe how cognitive processes can influence perception. 4. List five different ways to study perception. 5. Explain the concept of recognition and how it is distinct from perception. 6. Define “absolute threshold” and “difference threshold.” 7. Describe the methods used in method of limits and magnitude estimation studies.



Chapter Overview/Summary Chapter 1 introduces the student to the basic concepts in perception. The opening vignette engages the student to think about perception as a science project: how would you design a device to obtain information from the environment? This exercise reveals some of the major issues and complexities in perception. Next, four reasons for studying perception are outlined: (1) studying perception can result in a career; (2) applications of perception research overlap with other fields, such as medicine (e.g., treating dysfunctions of sensory systems), robotics, computer science, and engineering; (3) studying perception results in a greater appreciation of your sensory systems and enhances your curiosity about perceptual experiences; and (4) studying perception is inherently interesting. Goldstein then presents the steps of the “perceptual process”. These steps can be grouped into four categories: (1) Stimuli; (2) Receptor Processes; (3) Neural Processing; (4) Behavioral Responses. The role of Knowledge is also discussed in relation to the steps of the perceptual process. The process starts with the environmental stimulus, followed by the stimulus on the receptors (the “image” in vision). Stimulus processing continues with transduction (converting the physical energy into neural energy); transmission (receptors activating other neurons, which activate more neurons); and neural processing (the interactions between neurons and neural systems). Transduction is analogous to information transmission between an individual and an ATM. The steps categorized as “Behavioral Responses” are: perception (the conscious sensory experience); recognition (classifying objects into categories); and action (motor activities that occur in reaction to the sensory information). The “Knowledge” section highlights the influence of “top-down” cognitive processes on other steps in the perceptual process, as shown by the “rat-man” demonstration.
The remainder of the chapter addresses how perception is studied. The approaches to studying perception are the psychophysical level of analysis (the stimulus-perception relationship) and the physiological approach (the stimulus-physiology relationship and the physiology-perception relationship). Both approaches are necessary to fully understand perception. Cognitive influences on perception are also vitally important to study. More specific ways of studying the psychophysical level of analysis are then detailed. These include detection/measuring thresholds, magnitude estimation, description (phenomenological method), and various behavioral methods (visual search, same-different judgments, distance judgments). Classical psychophysical methods for measuring detection are the method of limits, the method of adjustment, and the method of constant stimuli (the latter two discussed in Appendix A). Using these methods, a researcher can determine the participant's absolute threshold and difference threshold (DL). Weber's law is the first psychophysical law discussed in Appendix B: the ratio of the DL to the standard stimulus is a constant fraction. Magnitude estimation is also discussed in more detail in Appendix C, including the major method used, representative results from stimuli in different modalities, and Stevens's Power Law. Results from judging the brightness of a light indicate “response compression” (doubling the physical intensity of the light less than doubles the perceived brightness of the light). Results from judging the intensity of an electric shock indicate “response expansion” (doubling the physical intensity of the shock more than doubles the perceptual response to the shock). Stevens's Power Law specifies the relationship between the physical intensity and the perceptual experience. A key component of this law is that the physical stimulus intensity is raised to an exponent. This exponent is derived from the slope of the line created by plotting the logarithm of the magnitude estimations against the logarithm of the physical intensities. The chapter also discusses why it is a good thing for humans that brightness shows response compression while electric shock shows response expansion. Appendix D introduces the idea of response criterion in a detection study, and how signal detection theory accounts for this.

Demonstrations, Activities, and Lecture Topics

(1) Inherent Interest in Perception: Encourage students to bring in examples of visual phenomena that they may have seen. Many students have had websites with illusions forwarded to them. Students may have 3-D magazines, books, or video games. I have an old box of Apple Jacks cereal that has numerous visual illusions on the back, and paper diner placemats with illusions. Emphasize the point that the ubiquity of these examples shows how inherently interesting perception is. (2) Human Factors and Perception: Goldstein cites applications of perception as one of the reasons for studying perception. A major contributor in this field is Donald Norman, the author of “The Psychology of Everyday Things” (1988), “Emotional Design” (2004), and “Living with Complexity” (2010). His JND (Just Noticeable Difference, a psychophysical term related to difference thresholds) website has links to many of his essays and sample chapters (including “Attractive Things Work Better” from “Emotional Design” and “Memory is More Important than Actuality”). Two examples I like to use from “The Psychology of Everyday Things” are: (1) the beer-handle controls (Figure 4.6) to have visual and tactual discrimination of controls; and (2) the relatively well-known stove-top design and controls (Figures 3.3, 3.4, and 3.5). The latter example shows the idea of natural mapping, which highlights the problem associated with the disputed “butterfly ballot” of the 2000 Presidential election.
(Wikipedia provides a photo of the ballot and more information regarding its use in Florida). Goldstein also specifically mentions highway sign visibility. Don Meeker and James Montalbano have recently designed a new typeface for interstate highway signs; a slideshow of the development of this new typeface can be found at The New York Times website in a 2007 slideshow entitled “What's Your Sign?”. (3) “Do The Math” behind Stevens's Power Law (Appendix C): Give your students a concrete example of how Stevens's Law works by plugging in actual values. To keep it simple, assume K = 1. Then demonstrate response expansion by using n = 3, and varying S from 2 to 8. Students will see how rapidly P increases. Then demonstrate response compression by using n = 0.67 (or 2/3). This introduces the student to the wonderful world of fractional exponents, where you first square S, then take the cube root of that quantity. Again, varying S from 2 to 8, the student will see that P does increase, but at a slower rate.
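For instructors who want the numbers ready-made, the demonstration above can be scripted in a few lines. A minimal sketch using the values suggested above (K = 1, n = 3 for expansion, n = 0.67 for compression; the function name is my own):

```python
from math import log

# Stevens's Power Law: P = K * S^n  (values from the demonstration above)
def perceived_magnitude(S, n, K=1.0):
    """Perceived magnitude P for a physical stimulus intensity S."""
    return K * S ** n

for S in (2, 4, 8):
    expansion = perceived_magnitude(S, 3)       # n > 1: response expansion
    compression = perceived_magnitude(S, 0.67)  # n < 1: response compression
    print(f"S={S}: expansion P={expansion:.0f}, compression P={compression:.2f}")

# The exponent n is recoverable as the slope of log P against log S
# (the log-log plot described in the chapter overview), approximately 3 here:
n_est = (log(perceived_magnitude(8, 3)) - log(perceived_magnitude(2, 3))) / (log(8) - log(2))
```

Running the loop makes the contrast vivid: doubling S from 2 to 4 multiplies P by 8 under expansion but by less than 2 under compression.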



(4) Signal Detection Theory and “Phantom Vibration Syndrome” (Appendix D): Signal detection theory is introduced in the “Something to Consider” segment of Chapter 1. Another way to initiate conversation about SDT is the phenomenon of “phantom vibration syndrome” (for example, a USA Today article from 2007 entitled “Good Vibrations? Bad? None at all?”). Some people report that they feel their cell phone vibrating, only to find out that it isn't. The simplest explanation would be in terms of the role of expectation in SDT. The Spokesman-Review, June 19, 2007, story “Phantom vibration syndrome” provides a few plausible explanations, one of which relates to differences in response criterion. In addition, a 2010 study published in BMJ - “Phantom vibration syndrome among medical staff: a cross sectional survey” - reports a 68% incidence rate of this phenomenon. (5) Scavenger Hunt Icebreaker: This first-day activity can introduce the students to some major topics in perception, and introduce them to each other! In this type of scavenger hunt, which has been used as an icebreaker in various situations, the student is given a list of “characteristics” and must find someone else in the classroom who fits each characteristic. For example, the item might be “Has a dog or a cat,” and then the student finds a classmate who has a dog or cat. The key element here is to generate items that can be linked to the course. For example, the above item could be used to address the differences in perception between humans and other animals (“Are dogs colorblind?”; “How is a dog's sense of smell different from a human's?”). It can also help to add pop culture references: I included “Knows what was distinctive about Amanda Swafford on Cycle 3 of America's Next Top Model” (she was legally blind), or “Has seen the U23D movie.” I usually use about 12-14 items for the scavenger hunt in a class of 24-30 students.
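The response-criterion explanation in item (4) can be made concrete with a short in-class computation. A sketch assuming the standard equal-variance signal detection model (the hit and false-alarm rates below are invented for illustration); it uses Python's `statistics.NormalDist` for the z-transform:

```python
from statistics import NormalDist

z = NormalDist().inv_cdf  # inverse of the standard normal CDF

def sdt_indices(hit_rate, fa_rate):
    """Sensitivity (d') and response criterion (c) from hit/false-alarm rates."""
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -(z(hit_rate) + z(fa_rate)) / 2
    return d_prime, criterion

# Two hypothetical phone owners: a liberal responder who expects vibrations
# (says "yes" often) and a conservative responder who rarely does.
liberal_d, liberal_c = sdt_indices(hit_rate=0.90, fa_rate=0.40)
conservative_d, conservative_c = sdt_indices(hit_rate=0.60, fa_rate=0.10)
print(liberal_d, liberal_c)            # negative c: a liberal criterion
print(conservative_d, conservative_c)  # positive c: a conservative criterion
```

With these made-up numbers the two responders come out equally sensitive (same d'); they differ only in criterion, which is exactly the distinction the response-criterion account of phantom vibrations turns on.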
(6) Classic Psychophysical Methods: Students can get “hands-on” experience with classic psychophysical methods by being the experimenter and the participant with the right equipment. In order to demonstrate Weber’s weight lifting discrimination studies (Appendix B), Lafayette Instruments (ordering information can be found online) has “Discrimination Weights” (Model 16015). To measure two-point cutaneous sensitivity, Lafayette also has a Two Point Aesthesiometer (Model 16022). The advantage of using these devices is that the student/researcher can easily manipulate the stimulus intensities to present to a classmate, according to the psychophysical method being demonstrated. (7) Blindfolds: One key concept in a perception course is to not take your senses for granted. To demonstrate this point, you can bring in some blindfolds and ask for volunteers to wear them. It isn’t unusual to have no one volunteer, at which point you can discuss everyone’s reluctance to wear them. If you do have volunteers, let them wear the blindfolds and keep the class quiet for about two minutes. Then have the volunteers take the blindfolds off, and report their reactions to the experience.
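Before the equipment in item (6) comes out, the logic of the method of limits can also be previewed in code. A sketch with an idealized, noise-free observer (the 5.3-unit threshold and the 1-10 intensity range are invented for illustration):

```python
# Simulated method of limits with an idealized observer.
TRUE_THRESHOLD = 5.3  # hypothetical intensity at which the observer detects

def responds(intensity):
    """The observer says "yes" whenever intensity reaches the true threshold."""
    return intensity >= TRUE_THRESHOLD

def crossover(series):
    """Midpoint of the step where the response changes (no->yes or yes->no)."""
    for prev, cur in zip(series, series[1:]):
        if responds(prev) != responds(cur):
            return (prev + cur) / 2
    return None

ascending = crossover(range(1, 11))       # present 1, 2, ..., 10
descending = crossover(range(10, 0, -1))  # present 10, 9, ..., 1
absolute_threshold = (ascending + descending) / 2
print(absolute_threshold)  # 5.5: the average of the two crossover points
```

Real observers give noisy, shifting crossover points across series, which is why multiple ascending and descending series are averaged; the noise-free version isolates the bookkeeping for students.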



(8) Method of Adjustment/Social Psychophysics (Appendix A): Wally Beagley developed the EyeLines software for doing method of adjustment experiments. This free software can introduce students to this classic psychophysical method. One way to use this software to highlight a key concept in Chapter 1 is to do a magnitude estimation task that demonstrates Stevens's idea of social psychophysics. Stevens, in his book “Psychophysics: An Introduction to its Perceptual, Neural, and Social Prospects” (1975), argued that magnitude estimation can be used to scale attitudes, such as watch preferences or attitudes toward crimes or monarchs. Based on Exercise #28 in “Workshops in Perception” by Power, Hausfeld, and Gorta (1981), you can use EyeLines to have students estimate how happy they would be to receive various money amounts. It is fairly simple to program EyeLines to present a range of money amounts to the student, and the student then uses the mouse to draw a line that reflects how happy he/she would be to win that amount of money: the longer the line drawn, the happier he/she would be with that money amount. The results can be discussed in terms of response compression or expansion. This also introduces the student to the concept of cross-modal matching in magnitude estimation.

Suggested Websites

Classics in the History of Psychology - Fechner Christopher Green has created a great website for the history of psychology. The contents of Fechner's “Elements of Psychophysics”: Sections VII and XIV are particularly relevant for Chapter 1. The ISP website: History of Psychophysics This is the website for the International Society for Psychophysics. There is information about the society, but the “History” link has a wealth of information about the history of psychophysics, including a “psychophysics family tree.” Donald Norman's JND website This is Donald Norman's “just noticeable difference” website.
Explore the site for examples of human factors and perception as well as career options in human-centered design. Human Factors and Ergonomics Society This is the website of the Human Factors and Ergonomics Society. Interested students can find out more about opportunities and research in human factors. Cool Optical Illusions This is one of the “fun” websites to introduce the students to visual illusions. Higher-level websites with a greater amount of content will be given in later chapters, but this may be a good starting point.



Wikipedia: Spinning Dancer Illusion Another illusion that students may be familiar with is the “spinning dancer” silhouette. This illusion is quite robust, and immediately piques student interest in perception. Even though this illusion is covered more thoroughly when discussing depth ambiguities, it is a nice introductory illusion to generate interest. Movie Scene: Spider-Man (2002) (Scene 7 – “Fight with Flash”): Prior to this scene, Peter Parker (portrayed by Tobey Maguire) has been bitten by a spider that will turn him into Spider-Man, and is just realizing that he is undergoing changes. He has inadvertently splashed food all over Flash (Joe Manganiello), the class bully. The scene begins with Flash chasing after Peter, and then they confront each other. Peter is able to use his enhanced visual abilities (and other new powers) to defeat Flash, and hopefully impress Mary Jane (Kirsten Dunst). You can relate the scene to the text in relation to the reasons for studying perception: Not only is it important to maintain the sensory capabilities that you have, but heightened senses would greatly benefit any individual. (This, of course, is also the central idea behind many superheroes, the police officer on the TV show The Sentinel, and the Bionic Woman.)



CHAPTER 2: THE BEGINNING OF THE PERCEPTION PROCESS

Chapter Outline I.

Introduction A. Some Questions We Will Consider

II.

Starting at the Beginning

III.

Light, the Eye, and the Visual Receptors A. Light: The Stimulus for Vision 1. Electromagnetic spectrum 2. Wavelength 3. Visible Light B. The Eye 1. Pupil 2. Cornea 3. Lens 4. Retina 5. Rods and cones 6. Visual pigments 7. Optic nerve 8. Fovea 9. Peripheral Retina 10. Macular degeneration 11. Retinitis pigmentosa 12. Blind spot a. Demonstration: Becoming Aware of the Blind Spot b. Demonstration: Filling in the Blind Spot

IV.

Focusing Light Onto the Receptors A. Accommodation 1. Demonstration: Becoming Aware of What Is In Focus B. Presbyopia C. Myopia/Nearsightedness 1. Refractive myopia 2. Axial myopia D. Hyperopia/Farsightedness

V.
Receptors and Perception A. Transforming Light Energy into Electrical Energy 1. Transduction a. Opsin b. Retinal B. Adapting to the Dark 1. Measuring the dark adaptation curve



a. Method: Measuring the dark adaptation curve i. Light-adapted sensitivity ii. Dark-adapted sensitivity 2. Measuring cone adaptation 3. Measuring rod adaptation a. Rod monochromats b. Rod-cone break 4. Visual pigment regeneration a. Visual pigment bleaching b. Regeneration, Rushton (1961) c. Detached retina C. Spectral Sensitivity 1. Spectral sensitivity curves 2. Method: Measuring a Spectral Sensitivity Curve a. Monochromatic light b. Cone Spectral Sensitivity Curve c. Rod Spectral Sensitivity Curve d. Purkinje shift 3. Rod- and Cone-Pigment Absorption Spectra VI. Test Yourself 2.1 VII.

Electrical Signals in Neurons A. Structure of Neurons 1. Cell body 2. Dendrites 3. Axon/Nerve Fiber 4. Sensory receptors e.g. Vision a. Lateral Geniculate Nucleus b. Visual receiving area B. Recording Electrical Signals in Neurons 1. Method: The Setup for Recording From a Single Neuron C. Basic Properties of Action Potentials 1. Propagated response 2. Size of action potential vs. Rate of firing 3. Refractory periods 4. Spontaneous activity D. Chemical Basis of Action Potentials 1. Ions 2. Membrane permeability 3. Rising phase of the action potential 4. Falling phase of the action potential 5. Sodium-potassium pump E. Transmitting Information Across a Gap 1. Synapse



2. Neurotransmitters 3. Receptor Sites a. Excitatory response/depolarization b. Inhibitory response/hyperpolarization VIII.

Neural Convergence and Perception A. Retinal anatomy 1. Receptors 2. Bipolar cells 3. Ganglion cells 4. Horizontal and amacrine cells 5. Neural convergence B. Convergence Causes the Rods to Be More Sensitive Than the Cones C. Lack of Convergence Causes the Cones to Have Better Acuity Than Rods 1. Demonstration: Foveal Versus Peripheral Acuity

IX.

Something to Consider: Early Events Are Powerful A. Hubble Space Telescope faulty lens and “eyeglasses” B. Electromagnetic energy and visual pigments affect vision

X.

Developmental Dimension: Infant Visual Acuity A. Method: Preferential Looking B. Visual Evoked Potential (VEP)

XI.

Test Yourself 2.2

XII.

Think About It

XIII.

Key Terms

Learning Objectives At the end of the chapter, the student should be able to: 1. Describe how the cornea and lens focus the image on the retina. 2. Describe the role of visual pigments in transduction. 3. Describe the method for measuring dark adaptation, and the overall results. 4. Discuss the differences between the distribution of the rods and the cones. 5. Explain why the “blind spot” exists, and why we are not usually aware of it. 6. Identify the key components of neurons. 7. Define propagated response, and discuss how this is related to measuring activity in a single neuron. 8. Describe depolarization, hyperpolarization, and inhibition. 9. Describe what convergence is, and how it is related to acuity in the rods and cones.


10. Discuss how visual acuity develops over the first year of life.

Chapter Overview/Summary Chapter 2 begins with a discussion of the early stages of vision, specifically, how visible light is transformed by the major structures of the eye and receptors. Visible light (400 nm to 700 nm) is structured by the environment and reflected into the eye. The cornea and the lens focus the image on the retina. The lens accomplishes this by the process of accommodation, increasing the focusing power of the lens when viewing near objects. As we age, accommodation becomes more difficult, a condition called presbyopia. Two additional vision problems, myopia and hyperopia, are described. After the visual image reaches the retina, transduction is accomplished by rods and cones. Exactly how transduction occurs in the rods and cones is then summarized. The visual pigment molecules are composed of opsin and retinal. The retinal reacts to light by changing shape (a process called isomerization), which results in transduction. Physiological studies showed that isomerizing one visual pigment molecule triggers the enzyme cascade, a sequence of chemical reactions within the receptor. The role of visual pigments in the receptors is then discussed. The rods and cones differ in several ways: their shape, their locations on the retina, and the overall number of each in the human eye. As rods and cones have distinct functions, they also can be associated with different disorders in vision: macular degeneration is a condition that destroys foveal cones, and retinitis pigmentosa primarily destroys the peripheral rods. Where there are no receptors (where the optic nerve leaves the eye), a blind spot exists. Rods and cones also differ in their adaptation to the dark, achieving maximum sensitivity at different times. Rushton showed that these times correspond to the time needed for pigment regeneration. Spectral sensitivity is also dependent on characteristics of the visual pigment, specifically, the absorption spectrum.
A demonstration of the difference in spectral sensitivity for rods and cones is the Purkinje shift – enhanced perception of short wavelengths during dark adaptation. After transduction, visual information is represented via an electrical signal. The electrical signal begins at the receptor and travels through the optic nerve to the brain, where it synapses with neurons. The major parts of a neuron are listed, and the receptors are discussed as a specific type of neuron. The role of electrodes in recording neural activity is described. The details of how electrical signals are transmitted are discussed next. Four basic properties of action potentials are discussed: (1) the action potential is a propagated response; (2) changes in stimulus intensity change the rate of firing, not the size of the action potential; (3) refractory periods occur, providing an upper limit to the firing rate; and (4) neurons have spontaneous activity, even when no stimuli are presented. A detailed discussion of the action potential follows, including the role of sodium, chloride, and potassium ions in the conduction of the electrical signal. The values of the resting potential and how these change during the action potential are provided, as well as the role of permeability and selective permeability in the process. Communication between neurons occurs at the synapse: neurotransmitters are released across the synapse and taken up by the receptor sites. The change in the voltage of the postsynaptic neuron depends on the amount of excitation and inhibition it receives from the presynaptic neurons. Goldstein then describes differences in the transfer of information from rods and cones. He begins with the concept of convergence, then demonstrates the resultant trade-off between sensitivity and acuity. Convergence (as occurs with the rods) leads to greater sensitivity, but decreased acuity. “Direct lines” (as occur with the foveal cones) lead to decreased sensitivity, but greater acuity. The final portion of the chapter details the challenges, methods, and results of research on infant perceptual development. Testing infant perceptual capabilities can be difficult as infants are non-verbal. Therefore, methods like preferential looking (PL) and visual evoked potentials have been devised to assess infant visual capacities. Using these techniques, it has been determined that acuity progresses from about 20/400 to 20/600 at 1 month to full acuity after 1 year. The major reason for this is that the retinal cones and cortical cells have to develop. Contrast sensitivity functions, in which contrast is used to determine the sensitivity to seeing gratings, show that sensitivity to higher frequencies develops over the first three months, so fine details are only perceived after about 3 to 6 months.
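The sensitivity half of the convergence trade-off can be caricatured numerically for students. A toy sketch with invented units (the 120:1 rod convergence ratio is roughly in line with the chapter's discussion, but the threshold and response values are arbitrary):

```python
# Toy model of convergence: many rods feeding one ganglion cell sum their
# weak responses, so a dim flash that is subthreshold on a cone's
# one-to-one "private line" can still drive a rod-fed ganglion cell.
GANGLION_THRESHOLD = 1.0   # summed input needed for the ganglion cell to fire
DIM_FLASH_RESPONSE = 0.05  # each receptor's weak response to a dim flash

def ganglion_input(n_converging_receptors, per_receptor_response):
    """Total signal reaching a ganglion cell from its converging receptors."""
    return n_converging_receptors * per_receptor_response

rod_signal = ganglion_input(120, DIM_FLASH_RESPONSE)  # high convergence
cone_signal = ganglion_input(1, DIM_FLASH_RESPONSE)   # foveal one-to-one wiring

print(rod_signal >= GANGLION_THRESHOLD)   # True: summation fires the cell
print(cone_signal >= GANGLION_THRESHOLD)  # False: a single cone's input is too weak
```

The flip side, acuity, is what the one-to-one wiring buys: each foveal ganglion cell reports on a tiny patch of retina, whereas the rod-fed cell cannot signal which of its many rods was stimulated.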

Demonstrations, Activities, and Lecture Topics (1) Classroom Pinhole Camera: Prull and Banks (2005) have developed an outstanding demonstration to teach about the functioning of the eye. They created a “classroom-sized pinhole camera” to demonstrate pupil functioning, image inversion, and focusing (including myopia and hyperopia). Details can be found at: Prull, M. W., & Banks, W. P. (2005). Seeing the light: A classroom-sized pinhole camera demonstration for teaching vision. Teaching of Psychology, 32, 103-106. (2) Online Tests for Visual Disorders: Free online eye “exams” for macular degeneration, myopia, and hyperopia can be found online at the Optometric Eye Site. The St. Luke's Cataract and Laser Institute website has a page entitled “Eye Q” which also contains the Amsler grid (can be printed) and information. For the test, the student would cover one eye and look at the fixation point. While doing this, they should see if the lines appear wavy or if portions of the grid are missing; either result would be a potential indicator of an abnormality. (3) Visual Disorders: The topic of visual disorders always leads to numerous questions from students. Be prepared for questions about LASIK and other surgical treatments, macular degeneration, and astigmatism. The websites listed below contain information, photos, and animations on numerous disorders. (4) Eye Models: For instructors with larger budgets, Denoyer-Geppert Science Company offers a range of eye models that provide students with “hands-on experience.” There is a less-expensive student version, and a more detailed and expensive “Giant Five-Part Eyeball.” “Rubin's Eye and Vision Lab” also may be a good investment, in which you can change lenses to demonstrate focusing problems. These can be ordered from the website. (5) Video Suggestion: Denoyer-Geppert also has a video series – “The Special Senses” – that covers some information presented in Chapter 2.
The video reviews each sensory receptor (vision, hearing, smell, taste). For vision, there is discussion of the eyeball and the process of refraction.


(6) Interactive Class Demonstrations: There have been at least three articles that have addressed how to get the class directly and actively involved in simulating neural functioning. Some have been used for introductory psychology and physiological psychology, but they certainly would be appropriate for this chapter. Here are the references for these three: Bockoven, J. (2004). The pedagogical toolbox: Computer-generated visual displays, classroom demonstration, and lecture. Psychological Reports, 94, 967-975. Hamilton, S. B., & Knox, T. A. (1985). The colossal neuron: Acting out physiological psychology. Teaching of Psychology, 12(3), 153-156. Reardon, R. R., Durso, F. T., & Wilson, D. A. (1994). Neural coding and synaptic transmission: Participation exercises for introductory psychology. Teaching of Psychology, 21(2), 96-99. (7) Evolution and Synapses: An interdisciplinary approach to neural functioning was addressed in a 2008 New York Times article on the relationship between evolution and synapse formation. The article is entitled “Brainpower may lie in the complexity of synapses”. (8) Preferential Looking and Confounds: A research methods issue of experimenter effects can be highlighted by research involving the preferential looking technique. Goren, Sarty, and Wu (1975) reported that newborns preferred looking at “scrambled faces” over “unscrambled faces.” Maurer and Young (1983) were skeptical about the results, and replicated the study with two important differences: (1) they put the infant in a baby seat, instead of the experimenter's lap; and (2) they made sure that the experimenter could not see the stimulus. These changes eliminated the preference for the scrambled faces. The implication is that the experimenters in the first study may have unintentionally influenced the infant's looking behavior. The bottom line is that care must be taken in using these techniques when testing infants.
(9) Dog Eye Charts: A humorous spin on the issues involving testing visual acuity of non-verbal individuals (infants and dogs) is the construction of “dog eye charts.” An original example of this was used in a “Peanuts” cartoon in which Charlie Brown takes Snoopy to the “eye doctor,” where Snoopy reads a “Snellen-like” chart in which the stimuli are different-sized paw prints. Local Paper Studio (online vendor) provides humorous dog and cat “eye charts”. (10) VEP and Dyslexia: Although the chapter focuses on the use of VEPs to study infant development of acuity, VEPs can also be used to determine if there are differences in the visual processing of older children. One area where this has been tested is to see if there is a difference in early visual processing between dyslexic and non-dyslexic children. Solan et al. (1990) found that the VEP amplitude in response to reversing checkerboards was generally higher for


control children than dyslexic children, but the binocular advantage was similar for the two groups.

Solan, H. A., Sutija, V. G., Ficarra, A. P., & Wurst, S. A. (1990). Binocular advantage and visual processing in dyslexic and control children as measured by visual evoked potentials. Optometry and Vision Science, 67, 105-110.

Suggested Websites

ePsych: An Electronic Psychology Text
The site contains many student-friendly tutorials that are very readable, informative, and sprinkled with humor. Click on "The Biological Mind" and begin with "The Neuron." A discussion of the structure of the eye, the blind spot, rods and cones, and dark adaptation appears in "The Eye."

SUNY College of Optometry
This is the website for the SUNY College of Optometry. Students who express an interest in a career in optometry can check out the admissions policies. Another interesting aspect of this optometry school is that it maintains a "Learning Disabilities Unit" at the University Eye Center. Students with overlapping interests in vision and developmental/clinical issues should find their work fascinating.

University of Michigan Kellogg Eye Center
This website represents another optometry school. It also has a great summary of visual conditions and treatments.

National Eye Institute
The National Eye Institute website has a wealth of information, pictures, and videos. You can search the site for information about retinitis, macular degeneration, detached retina, and other conditions discussed in Chapter 2.

Foundation Fighting Blindness
The goal of the foundation is to support research on retinal degenerative diseases. The website contains numerous links, images, and information sources, and news reports are updated frequently. In addition to factual information about conditions, causes, and treatments, the site also addresses the emotional aspects of coping with visual disorders.
Jan Worst Research Group: Ophthalmic Studies
This Netherlands-based ophthalmology center has some interesting (if somewhat graphic) images of corneal surgery and the surgical equipment used.

Webvision: The Organization of the Retina and Visual System
This website is probably aimed more at graduate students than undergraduates, but instructors can peruse this outstanding site for many concepts covered in Chapter 2. Find the Table of Contents (positioned on the right side of the page) and select Part II


(Anatomy and Physiology of the Retina) and Part III (Retinal Circuits) for information relevant to this chapter.

The Brain from Top to Bottom
McGill University sponsors this informative website. Click on "From the simple to the complex" to access a nice tutorial on neural functioning, or "Vision" for a tutorial on the structures involved in vision. A great feature of this website is that you can select "beginner," "intermediate," or "advanced" levels of information once you have chosen a topic area.

Neurons - Animated Cellular and Molecular Concepts
Animations and PowerPoint presentations abound on this website from the University of Toronto, covering neuron anatomy, synaptic transmission, and the action potential.

UCLA Baby Lab
The lab studies perceptual and cognitive development in infants. Eye tracking is one of the techniques they use with infants.

Infant Vision Laboratory at Smith-Kettlewell
An active infant perception lab at Smith-Kettlewell is headed by Andrew Norcia. They conduct VEP studies of infants. Also click on "What can my baby see?" for a great summary of helpful information.

Movie Scene -- Wild Wild West (1999) (23:16-24:44): In this scene, West and Gordon (Will Smith and Kevin Kline) attempt to determine the last image a dead man saw. Gordon cites the "retinal terminus theory" and projects this image from the dead man's head (much to the displeasure of West!). This theory was popular in the mid-1800s (the period in which the movie is set). One of the theory's major proponents was Kühne, a physiologist at Heidelberg University, who claimed he was able to retrieve the "optogram" from the eye of a guillotined man. Based on the information in Chapter 2, students can critically discuss the merits (or lack thereof) of this theory. Added bonuses in the film clip: Gordon refers to the aqueous humor and notes that the projected image is upside-down due to refraction.



CHAPTER 3: NEURAL PROCESSING

Chapter Outline

I. Introduction A. Some Questions We Will Consider

II. Inhibitory Processes in the Retina A. Lateral Inhibition in the Limulus 1. Hartline et al. (1956) a. Ommatidia b. Lateral plexus B. Using Lateral Inhibition to Explain Perception 1. The Lateral Inhibition Explanation of the Chevreul (Staircase) Illusion a. Chevreul (Staircase) Illusion b. Mach Bands 2. The Lateral Inhibition Explanation of the Hermann Grid a. Bipolar cells, initial response C. Problems with the Lateral Inhibition Explanation of the Chevreul Illusion and the Hermann Grid 1. Chevreul Illusion a. Luminance Ramp 2. Hermann Grid

III. Test Yourself 3.1

IV. Processing from Retina to Visual Cortex and Beyond A. Responding of Single Fibers in the Optic Nerve 1. Receptive fields a. Center-surround organization i. Excitatory-center, inhibitory-surround ii. Inhibitory-center, excitatory-surround b. Center-surround antagonism B. Hubel and Wiesel's Rationale for Studying Receptive Fields 1. Method: Presenting stimuli to determine receptive fields 2. Visual pathway a. Optic Nerve b. Lateral Geniculate Nucleus (LGN) c. Superior Colliculus d. Occipital Lobe e. Visual receiving area/striate cortex C. Receptive Fields of Neurons in the Visual Cortex 1. Simple cortical cells a. Side-by-side receptive fields b. Orientation tuning curves 2. Complex cortical cells 3. End-stopped cortical cells 4. Table: Properties of neurons in the Optic Nerve, LGN, and Cortex

V. Do Feature Detectors Play a Role in Perception? A. Selective Adaptation 1. Method: Psychophysical Measurement of the Effect of Selective Adaptation to Orientation a. Contrast threshold b. Adapting stimulus B. Selective Rearing 1. Neural plasticity

VI. Higher-Level Neurons A. Inferotemporal (IT) cortex B. Fusiform face area (FFA)

VII. Sensory Coding A. Specificity Coding B. Population Coding C. Sparse Coding

VIII. Something to Consider: "Flexible" Receptive Fields A. Contextual Modulation

IX. Test Yourself 3.2

X. Think About It

XI. Key Terms

Learning Objectives

At the end of the chapter, the student should be able to:
1. Define lateral inhibition, and describe research demonstrating the phenomenon.
2. Discuss lateral inhibition accounts of, and issues with, three perceptual phenomena (the Chevreul Illusion, Mach Bands, and the Hermann Grid).
3. Define and identify "receptive field," and discuss techniques used to map receptive fields.
4. Describe the basic function of the LGN, and differentiate among simple cortical cells, complex cortical cells, and end-stopped cortical cells.
5. Describe the method used for selective adaptation to orientation, and discuss how the phenomenon is related to feature detectors.
6. Contrast specificity coding, population coding, and sparse coding.
7. Discuss "flexible" receptive fields and explain contextual modulation.


Chapter Overview/Summary

Chapter 3 begins with additional discussion of the neural processing of visual information from the retina to the striate cortex. At the retinal level, lateral inhibition is introduced to describe the interaction between neurons. The Chevreul Illusion, Mach Bands, and the Hermann Grid are three phenomena that can be explained by lateral inhibition; however, there are some problems with these explanations. From the retina, information travels through the optic nerve, which is composed of the axons of retinal ganglion cells. Each "fiber" of the nerve is associated with a receptive field (which typically pools the information from hundreds of receptors). The receptive field is the area on the retina that influences the firing rate of the neuron. Antagonistic center-surround (excitatory vs. inhibitory) configurations are prevalent receptive field patterns. This early receptive field work led to Hubel and Wiesel's systematic investigation of receptive fields in the visual pathway. Further neural processing occurs in the lateral geniculate nucleus and the striate cortex (containing simple, complex, and end-stopped cells). Because the different cortical neurons respond to specific features of a stimulus, these neurons are often referred to as "feature detectors." Feature detectors play a role in visual perception, as demonstrated by selective adaptation to orientation (using grating stimuli and contrast sensitivity). Research on the selective rearing of cats in an environment that contains only vertical lines also supports the role of feature detectors in vision. The last major topic in the chapter is the sensory code: How does the firing of neurons represent various characteristics of the environment? Three theories are: (1) specificity coding; (2) population coding; and (3) sparse coding.
The “Something to Consider” portion of the chapter addresses the idea that receptive fields are flexible, and firing rates can be affected by what happens outside the receptive field – contextual modulation.
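For instructors who want to show the lateral inhibition account concretely, the mechanism can be simulated in a few lines. The following sketch (not from the text; the inhibition weight is illustrative, not physiological) applies simple neighbor inhibition to a Chevreul-style luminance staircase: within each step the response is flat, but at each step boundary the bright side overshoots and the dim side undershoots, which is the classic explanation of the scalloped appearance.

```python
# A 1-D lateral-inhibition model applied to a Chevreul-style staircase.
# Each "receptor" responds with its own input minus a fraction of its
# two neighbors' inputs (an illustrative weight, not a physiological one).
import numpy as np

def lateral_inhibition(stimulus, inhibition=0.1):
    """Response = own input minus a fraction of each neighbor's input."""
    response = stimulus.astype(float).copy()
    response[1:-1] -= inhibition * (stimulus[:-2] + stimulus[2:])
    return response

# Five-step staircase: 20 receptors per step, intensities 10..50.
staircase = np.repeat([10, 20, 30, 40, 50], 20).astype(float)
response = lateral_inhibition(staircase)

# Within a step, both flanks are equal, so the response is flat.
# At a step boundary the response overshoots on the bright side and
# dips on the dim side, mimicking the perceived bands at each edge.
```

Plotting `staircase` and `response` together (e.g., with matplotlib) makes a quick classroom figure showing physically uniform steps next to the edge-enhanced neural response.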

Demonstrations, Activities, and Lecture Topics

(1) Create your own Hermann Grid: Students will occasionally ask about variants of the Hermann Grid, e.g., "What happens if you use colors?" or "What happens if you change the spacing between the squares?" To provide a brief answer to these questions, have the students create the Hermann Grid that answers the question. This is very easy using PowerPoint™, which most students have used: simply use the auto-shapes, make 15 copies of the original square, and then place the squares in the appropriate pattern with the desired fill colors. (This takes about 10 minutes for even an inexperienced user.) Students can then answer their own questions.

(2) A different theory of brightness illusions: A review article by Purves et al. summarizes the Mach bands brightness illusions covered in Chapter 3, in addition to the Chubb-Sperling-Solomon illusion and the Craik-O'Brien-Cornsweet edge effects. The main premise of the article, however, is to propose a counterargument to the "conventional" lateral inhibition explanations (as used in Goldstein): an "empirical framework" theory. This theory is based, not on visual neural circuits, but on the


probability of the source of the ambiguous stimulus.

Purves, D., Williams, M., Nundy, S., & Lotto, R. B. (2004). Perceiving the intensity of light. Psychological Review, 111, 142-158.

Corney and Lotto (2007) use computational methods to study illusory perception, with the aim of showing why these illusions come about. They show that natural stimuli are ambiguous, and that these ambiguities are resolved by encoding the statistical relationships between images and scenes in past visual experience.

Corney, D., & Lotto, R. B. (2007). What are lightness illusions and why do we see them? PLoS Computational Biology. doi: 10.1371/journal.pcbi.0030180

(3) How did they do that? Chubb-Sperling-Solomon Illusion: An interesting aspect of a perceptual phenomenon is how the effect was discovered. I contacted Charlie Chubb to find out the origins of the Chubb-Sperling-Solomon illusion. Chubb had been doing work on motion with George Sperling. One of the displays they were developing started with a visual noise stimulus and "windowed" the noise with a drifting sinusoidal pattern. Through brainstorming, the three authors wondered whether texture would work the same way as luminance changes in the display. To test this idea, they used simultaneous contrast displays with the visual noise. This first attempt even surprised them (it "pretty much knocked our socks off," according to Chubb), and resulted in the illusion that now bears their names. The moral: Knowledge, brainstorming, and playing with the stimuli can lead to amazing discoveries!

Chubb, C. F., Sperling, G., & Solomon, J. A. (1989). Texture interactions determine perceived contrast. Proceedings of the National Academy of Sciences, USA, 86, 9631-9635.

(4) How did they do that? White's Illusion: Michael White also graciously supplied a narrative of the creative process behind the discovery of the illusion that now bears his name.
He had some setbacks in his early undergraduate and graduate career, and had doubts about his career path, when: "… early in 1976, and Vicki and I had a two-year-old son and baby daughter, with the scholarship income available only to the end of the year. I was despondent about my prospects as a scientist. Then, in February 1976 something wonderful happened. I had been interested in the relationships between art (particularly modern art) and illusions. I was reading Optical Art by Rene Parola (1969), when I saw an astonishing picture by Susan Hirth, who was one of Parola's 11th Grade students. It was a 'busy' design with black, white and grey elements, where two wedge-shaped sets of physically identical mid-grey annular segments appeared to be very different in lightness (shade of grey). Although the new illusion was in a chapter explaining various lightness contrast and lightness assimilation (the opposite of contrast) effects, Parola offered no real explanation for it. Presumably, he assumed that it was some sort of contrast or assimilation effect, and he simply noted that "The grey wedges seem different." To me, it seemed that the illusion was not readily explicable in terms of classical lightness contrast or assimilation, and I believed that I had stumbled across a powerful new effect. The following days, weeks and months


were very exciting. I started by designing simple variants of the illusion to extract its essence. This was before image digitization was readily available, so the designs were created by cutting shapes from black, white and mid-grey paper and gluing them together. Eventually, I isolated what I considered to be the essential effect - now commonly known as ‘White’s Illusion.’ This history is described in my PhD thesis, but is absent from my journal articles. As well as producing the designs, I read the relevant research literature, and speculated about possible explanations. I would sometimes get out of bed in the middle of the night to scribble down my latest bright idea. By the end of 1976, I had produced scores of designs, and conducted some simple experiments. Progress was satisfactory, but the remainder of my PhD research would take another five years (1977-1981) during which time I was again employed as a tutor in the Psychology Department. One of my greatest thrills was to produce a ‘dotty’ version of the illusion. It had seemed to me that the illusion essentially consisted of patterned black/grey and white/grey test regions in patterned black/white surrounds. While the pattern in the original illusion was a square-wave grating, it occurred to me that the illusion might also work with regular ‘dotty’ patterns. My father and I spent many hours creating such patterns from black, white and mid-grey paper with the help of a large leather punch to make the circular holes. The results were amazing. The different combinations of test-region and surround patterns led to very different apparent shades of grey in the objectively mid-grey test regions. The difference in lightness between two particular test regions was even stronger than in the grating version of the illusion. 
It was an amazing feeling to be the first person ever to see the strongest effect yet produced in what is one of the most basic human capacities - the ability to perceive differences in lightness."

Another wonderful and inspiring message for our students!

White, M. (1979). A new effect of pattern on perceived lightness. Perception, 8, 413-416.

(5) Contrast sensitivity, acuity, and performance: Students will be familiar with the Snellen eye chart as a measure of acuity, but will not be aware of the uses of contrast sensitivity to measure acuity. Various applications include testing jet pilots, driving, computer usage, and sports. Age-related differences (in Evans and Ginsburg) and dynamic visual acuity (in Long and Zavod) can also be discussed.

Evans, D. W., & Ginsburg, A. P. (1985). Contrast sensitivity predicts age-related differences in highway-sign discriminability. Human Factors, 27, 637-642.

Grimson, J. M., Schallhorn, S. C., & Kaupp, S. E. (2002). Contrast sensitivity: Establishing normative data for use in screening prospective naval pilots. Aviation, Space, & Environmental Medicine, 73, 28-35.

Jacko, J. A., Rosa, Jr., R. H., Scott, I. U., Pappas, C. J., & Dixon, M. (2000). Visual



impairment: The use of visual profiles in evaluations of icon use in computer-based tasks. International Journal of Human-Computer Interaction, 12, 1044-1054.

Long, G. M., & Zavod, M. J. (2002). Contrast sensitivity in a dynamic environment: Effects of target conditions and visual impairment. Human Factors, 44, 120-132.

Rabin, J. (1995). Small letter contrast sensitivity: An alternative measure of visual resolution for aviation candidates. Aviation, Space, & Environmental Medicine, 66, 56-58.

(6) Animal research ethics: The research on selective rearing will elicit strong reactions from some students, so an instructor might want to be prepared for questions about animal research. Each instructor probably covers this at some point in his/her courses, and has appropriate references. Two suggestions are:

McCarty, R. (1998). Making the case for animal research. APA Monitor, Nov., 18.

Bennett, A. J. (2012). Animal research: The bigger picture and why we need psychologists to speak out. Psychological Science Agenda, April 2012.

(7) Video suggestions: "Vision and Movement" from "The Brain" series (from WNET) has segments that show Hubel and Wiesel's work on discovering simple and complex cells in the cat visual cortex, and Russell DeValois' research on mapping the monkey visual cortex. The video also contains an example of how different spatial frequencies synthesize to form a complex visual image. "Discovering Psychology" by Zimbardo contains a chapter titled "Sensation and Perception." A segment toward the beginning of this chapter has an interview with Hubel about mapping cortical cells, as well as Misha Pavel explaining feature and edge detection.

(8) Book Recommendation: Steven Yantis's collection of important articles in visual perception is a great resource for additional readings.
The readings most appropriate for this chapter are Helmholtz's "Concerning the perceptions in general," Barlow's "Single units and perception," and Hubel and Wiesel's "Receptive fields and functional architecture of monkey striate cortex." The book contains other readings that could be assigned for later chapters as well.

Yantis, S. (Ed.). (2001). Visual perception: Essential readings. Philadelphia: Psychology Press.
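Demonstration (1) above suggests building Hermann Grid variants in PowerPoint; for instructors comfortable with Python, the same grid can be generated programmatically, which makes it easy to vary spacing and contrast systematically. This is a hedged sketch, not from the manual: the function name and the geometry parameters (square size, gap, intensities) are arbitrary choices students can change to answer their own questions.

```python
# Sketch: generate a Hermann-grid image as a 2-D numpy array.
# Start with a light canvas and stamp dark squares in a regular pattern;
# the light "streets" between the squares are where the illusory gray
# blobs appear at the intersections.
import numpy as np

def hermann_grid(n=4, square=60, gap=20, fg=0.0, bg=1.0):
    """Return a float image: an n x n array of squares (intensity fg)
    separated by streets of width `gap` on a background of intensity bg."""
    side = n * square + (n + 1) * gap
    img = np.full((side, side), bg)
    for row in range(n):
        for col in range(n):
            top = gap + row * (square + gap)
            left = gap + col * (square + gap)
            img[top:top + square, left:left + square] = fg
    return img

grid = hermann_grid()
# Display with, e.g., matplotlib:
#   plt.imshow(grid, cmap="gray", vmin=0, vmax=1); plt.axis("off")
```

Changing `gap` answers the spacing question directly, and replacing the scalar `fg`/`bg` intensities with RGB values (on a 3-channel array) handles the color question.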



Suggested Websites

ePsych: An Electronic Psychology Text
The site contains many student-friendly tutorials that are very readable, informative, and sprinkled with humor. "The Eye" (page 22 onwards) provides a brief overview of early visual processes and concepts, and continues to a primer on receptive fields.

Lightness Perception and Lightness Illusions Interactive Movies
Ted Adelson has animated versions of numerous lightness illusions at this website. Simultaneous contrast, White's Illusion, the Koffka ring, the Knill and Kersten illusion, the Vasarely illusion, and the Craik-O'Brien-Cornsweet illusion are all shown. Each illusion is followed by a brief explanation of the effect.

Sandlot Science
This popular website is geared toward a general audience, but contains an interactive version of White's illusion and several other illusions. From the homepage, select "The Mother of all Site Maps" and scroll down to "Contrast & Color."

David Hubel's Eye, Brain, and Vision
This is the website for Hubel's book Eye, Brain, and Vision. Chapter 4 (Primary Visual Cortex) and Chapter 5 (Architecture of the Visual Cortex) are excellent supplements to the Goldstein text material. Readable, with outstanding photos and figures.

The Primary Visual Cortex - Webvision
Matthew Schmolesky provides a biologically oriented piece on the visual cortex on the Webvision website. The page contains great graphics and information, and includes some historical perspectives on vision.

John Krantz Sensation and Perception Tutorials
John Krantz has created excellent tutorials for undergraduate students in Sensation & Perception. For Chapter 3, the tutorials on receptive fields are recommended. Krantz also has a model of the retina, LGN, and primary visual cortex; information is available at the "Interactive Sensation Laboratory Exercises (ISLE)" link.
The Joy of Visual Perception: A Web Book
This York University website features a web book, The Joy of Visual Perception, that is an excellent resource for receptive fields. Be sure to check out the numerous links as well.

Movie Scene – The Matrix (1999) (38:38-40:38): In this famous scene, Morpheus (Laurence Fishburne) reveals to Neo (Keanu Reeves) what the "Matrix" is. He begins by asking, "What is real?" The answer relates to materialism; that is, our perceptual experience is due to neural firing. It is also related to the "easy" and "hard" problems of consciousness. This is an excellent, familiar scene to motivate students to learn about the biological approach to perception and to



encourage them to think about what is known and unknown about our conscious experience.



CHAPTER 4: CORTICAL ORGANIZATION

Chapter Outline

I. Introduction A. Some Questions We Will Consider

II. Spatial Organization in the Visual Cortex A. The Neural Map on the Striate Cortex (Area V1) 1. Retinotopic map 2. Cortical magnification and cortical magnification factor 3. Method: Brain Imaging a. MRI b. fMRI 4. Demonstration: Cortical magnification of your finger B. The Cortex is Organized in Columns 1. Location and Orientation Columns – Hubel and Wiesel (1965) a. Location columns b. Orientation columns i. Neurons in each column respond to specific orientation ii. Adjacent columns have slight difference in preferred orientation 2. One location column: Many orientation columns a. Hypercolumns C. How Do Orientation-Sensitive Neurons Respond to a Scene? 1. Cortical response represents, not resembles, stimulus

III. Test Yourself 4.1

IV. Pathways for What, Where, and How A. Streams of Information About What and Where 1. Method: Brain Ablation 2. Ungerleider and Mishkin – ablation/lesioning a. Object discrimination problem i. Temporal ablation ii. Ventral what pathway b. Landmark discrimination problem i. Parietal ablation ii. Dorsal where pathway 3. Pathways start in retina and LGN, though not completely separate 4. Information flows forward and backward (feedback) B. Streams of Information About What and How 1. Milner and Goodale – Dorsal how stream a. Adds action component to where 2. Method: Double Dissociations in Neuropsychology 3. The Behavior of Patient D.F. 4. The Behavior of People Without Brain Damage

V. Modularity A. Face Neurons in the Monkey's IT Cortex B. The Fusiform Face Area in Humans 1. FFA in fusiform gyrus 2. Prosopagnosia: Inability to recognize faces C. Areas for Places and Bodies in Humans 1. Parahippocampal Place Area (PPA) 2. Extrastriate Body Area (EBA)

VI. Distributed Representations A. Two Experiments that Demonstrate Distributed Representations 1. Specific areas of cortex for specific stimuli 2. Wide range of stimuli distributed over wide area of cortex B. Distributed Representation of Multidimensional Stimuli 1. Brain Areas Activated by Different Aspects of Faces

VII. Where Perception Meets Memory A. Medial Temporal Lobe (MTL), hippocampus 1. Patient H.M. 2. Quiroga et al. (2005, 2008) studies a. "Halle Berry" neuron/concept neuron

VIII. Something to Consider: The Mind-Body Problem A. Electrical Signals to Perception 1. Correlations 2. How do they cause perceptual experiences?

IX. Developmental Dimension: Experience and Neural Responding A. Experience Can Shape Neural Firing 1. Experience-dependent plasticity B. The Expertise Hypothesis 1. Gauthier et al. (1999) – Greeble study C. Criticism: FFA doesn't depend on plasticity

X. Test Yourself 4.2

XI. Think About It

XII. Key Terms



Learning Objectives

At the end of this chapter, the student should be able to:
1. Describe cortical magnification and why it occurs.
2. Discuss the properties of different types of columns in the visual cortex, and how these relate to how an object is represented in the cortex.
3. Identify the logic of lesions (ablations) in studying the brain.
4. Describe the research by Ungerleider and Mishkin on the "what" and "where" pathways, and by Milner and Goodale on the "how" pathway.
5. Explain the logic involved in establishing a double dissociation and provide an example of one.
6. Discuss research on modularity, including the locations of cells specialized for specific stimuli, in both monkeys and humans.
7. Explain the connection between the hippocampus and vision in light of Quiroga and coworkers' (2005, 2008) studies.
8. Describe the method and results of research showing that neural responses can be shaped by experience in humans, as well as the expertise hypothesis in relation to the FFA.

Chapter Overview/Summary

Chapter 4 provides more information about organization in the visual system. It begins by describing how maps in the striate cortex represent the spatial layout of a stimulus. For example, a retinotopic map of the LGN shows the correspondence between each point on the retina and a point on the LGN. Retinotopic maps on the cortex demonstrate the cortical magnification of foveal input. Brain imaging techniques, such as fMRI, are used to measure brain activity. Research using fMRI has also demonstrated the cortical magnification effect, as well as the connection between acuity and cortical activity. The LGN is organized in layers, resembling a club sandwich. Cortical columns have been discovered that respond to location, orientation, and ocular dominance, and can be combined into hypercolumns. The pattern of activity in these columns represents, even if it doesn't resemble, the stimulus.
Streams are the different visual pathways, originating in the retina and projecting to extrastriate areas. Using lesion techniques in monkeys, Ungerleider and Mishkin called the ventral pathway (leading from V1 to the temporal lobe) the "what" pathway, and the dorsal pathway (leading from V1 to the parietal lobe) the "where" pathway. The ventral and dorsal pathways originate in retinal ganglion cells. There is some overlap between the streams, and the flow of information in the streams is bi-directional. Milner and Goodale proposed that the dorsal stream should be called the "how" stream, since it is important not just for locating objects, but also for performing actions on them. The dissociations experienced by their patient D.F. support this claim. Non-brain-damaged individuals demonstrate the dissociation when asked to complete length-estimation and grasping tasks. Modules are structures that are specialized to process information about a particular type of stimulus. Evidence from studies using monkeys shows that there are modules for faces in the IT cortex. Prosopagnosics and brain-scanning studies provide evidence for face modules in humans (the fusiform face area; FFA). In addition to faces, specialized areas for places (PPA) and non-face body parts (EBA) have been identified. Some information from the IT cortex travels to the hippocampus.


The case of H.M. helps to demonstrate the role of the hippocampus in long-term memory formation. Additional examinations of the hippocampus have revealed preferential activation of specific neurons to concepts (e.g., face of Jennifer Aniston, the word “Friends”, Lisa Kudrow) when exposed to corresponding visual stimuli. Similar effects have been obtained when individuals are asked to remember a concept. These findings demonstrate a clear link between vision and memory. The chapter concludes with a discussion regarding the development of the FFA. Some abilities are present from birth; others develop over time. The latter is often referred to as experience-dependent plasticity. The FFA, it has been proposed, may not be uniquely sensitive to faces, but instead is related to expertise. Gauthier and coworkers were able to demonstrate increased brain activity in the FFA when participants became experts at recognizing novel stimuli called “Greebles.”

Demonstrations, Activities, and Lecture Topics

(1) Video Suggestion: The "what" and "where" pathways are described in the WNET program (1988) The Brain: Vision and Movement, in which Mortimer Mishkin is interviewed about his research. Check your institution's holdings for this popular series.

(2) Prosopagnosia: Students tend to be very interested in what a person with agnosia experiences. First-hand accounts can be found through links on Harvard's Prosopagnosia Research Center website. The Research page contains information about prosopagnosia (a.k.a. face blindness) as well as links to recent media attention related to the disorder, including interviews with people who have prosopagnosia, research articles, and news stories. A prosopagnosic is also interviewed in the video The Mind's Eye: How the Brain Sees the World. Added bonus: This film demonstrates pathway differences using visual vs. grasping tasks. Another activity on prosopagnosia, based on Oliver Sacks' work, has been posted by Steven Lloyd on the APA Division 2 "Office of Teaching Resources in Psychology" website. The pdf is entitled "Enhancing the Physiological Psychology Course through the Development of Neuroanatomy Laboratory Experiences and Integrative Exercises."

(3) Charles Bonnet Syndrome: Another disorder that can be addressed in this chapter is Charles Bonnet Syndrome, which is characterized by visual hallucinations in people with low vision. Apparently, visual cortex activity is misinterpreted as perceptual experience. Some introductory information can be found at the Lighthouse International website under Vision Disorders. A more scholarly case study can be found in the Journal of Neuropsychiatry & Clinical Neurosciences in a letter, "Charles Bonnet Syndrome: Two Case Reports."

Gupta, R., Singhal, A., Goel, D., Srivastava, R., & Mittal, S. (2008). Charles Bonnet Syndrome: Two case reports. Journal of Neuropsychiatry & Clinical Neurosciences: Letters.
doi: 10.1176/appi.neuropsych.20.3.377


(4) Suggested Readings: Rita Carter’s Mapping the mind is a wonderful, general-audience book on brain imaging and functions. Chapter 5 (A World of One’s Own) has nice information about agnosias and face recognition. Martha Farah’s book addresses the various types of agnosia, including a dairy farmer who had trouble recognizing his cows. Oliver Sacks’s popular book, cited in Chapter 1 of Goldstein, can be discussed more thoroughly in this chapter. Carter, R. (1998). Mapping the mind. Berkeley, CA: University of California Press. Farah, M. J. (1991). Visual agnosia: Disorders of object recognition and what they tell us about normal vision. Cambridge, MA: MIT Press. Sacks, O. (1985). The man who mistook his wife for a hat. New York: Summit Books. (5) Rod and Frame Task: The rod-and-frame task, which can be used to investigate the difference between the “what” and “how” streams (at least the visual task rather than the grasping task), has a long history in psychology. Witkin (1950) introduced this task, and it has become the basis for the concept of “perceptual” (or cognitive) style. Perceptual style has been applied to a wide range of topics, including cognitive abilities, personality, and educational psychology. Tabletop versions of the rod-and-frame task can be found, but you can also program the EyeLines software to create a “rod-and-frame” task. Witkin, H. A. (1950). The perception of the upright. Scientific American, 200, 50-70. (6) Controversy in ERP Studies of Face Perception: A blog post entitled “The problem with comparing faces to other stimuli” provides an informative, yet accessible, introduction to the controversy surrounding ERP investigations of face perception (the N170). The piece discusses Thierry et al.’s (2007) finding that the N170 reflects stimulus variability, not face perception. (7) Faces in Cars: Another way of showing the importance of face perception is to cite research suggesting that we tend to perceive faces when we look at cars. (Not a big surprise, I guess, to fans of the movie “Cars”!) A 2008 New York Times article, “Perceptions: Putting a Face Value on Cars”, provides a brief summary of Sonja Windhager’s research in this area.



Suggested Websites David Hubel’s Eye, Brain, and Vision This is the website for Hubel’s book Eye, Brain, and Vision. In addition to the book – which provides an extensive look at the visual pathway up to the striate cortex – the website includes a page of illusions, Hubel’s biography, and publications. The publication page includes a copy of Hubel and Wiesel’s 1965 paper. Basic Cerebral Cortex Function with Emphasis on Vision Ben Best maintains a general science website; this specific page addresses the anatomy and connections of the visual cortex. Isabel Gauthier: Object Perception Lab The Gauthier et al. (2000) article, described in the textbook, is available online at Gauthier’s Object Perception Lab website on the Publication page. The article has great graphics, including examples of the stimuli used, activation of the FFA for participants with different expertise (e.g., bird and car experts), and clear figures of their results. The site also includes recent work in the area. Face Recognition Homepage This is the website for everything about face recognition: research groups throughout the world, recent “interesting articles,” and numerous links. Human Face Perception and Recognition – Alice O’Toole Alice O’Toole heads the University of Texas at Dallas’ face recognition lab. Numerous interesting projects are listed, with examples of stimuli she has used. Categorization of faces and developmental issues are among her research interests. MIT Face Recognition Demo Page The relationship between human perceptual capabilities and computer vision is highlighted throughout the Goldstein text. Face recognition is another ability that humans perform quite easily but that remains challenging for computers. The MIT Media Lab Vision and Modeling Group provides a short, but informative, look at face detection and recognition software. You can see a detailed description of face recognition software by selecting a specific demo.
Movie Scene – Vertigo (1958) (1:30:30-1:32:00): Alfred Hitchcock’s classic. Scottie (James Stewart) is trying to find Madeline (Kim Novak), but he has a couple of false alarms, and then sees one face that looks like it might be Madeline’s. This highlights how difficult face recognition can be when the differences between people’s faces are very subtle.



CHAPTER 5: PERCEIVING OBJECTS AND SCENES

Chapter Outline I.

Introduction A. Some Questions We Will Consider B. Demonstration: Perceptual Puzzles in a Scene C. The DARPA “Urban Challenge” Race

II.

Why Is It So Difficult to Design a Perceiving Machine? A. The Stimulus on the Receptors is Ambiguous 1. Inverse projection problem B. Objects can be Hidden or Blurred C. Objects Look Different from Different Viewpoints 1. Viewpoint invariance

III.

Perceptual Organization A. Some Basics of Perceptual Organization 1. Grouping 2. Segregation B. The Gestalt Approach to Perceptual Grouping 1. Structuralism: sensations vs. perceptions 2. Apparent movement a. Whole is different than the sum of its parts 3. Illusory contours C. Gestalt Principles of Organization 1. Good continuation: “smoothest path” 2. Pragnanz/Good figure: “simplicity” 3. Similarity: Similar things grouped together 4. Proximity/Nearness: Physically near objects grouped together 5. Common fate: Objects moving in same direction grouped together 6. Common region: Elements in the same regions grouped together 7. Uniform connectedness: Connected region of same visual property grouped together D. Perceptual Segregation 1. Figure-ground segregation 2. Properties of figure and ground a. Reversible figure-ground i. Figure more thing-like and memorable ii. Figure is perceived as in front of ground 3. Image-based factors that determine which area is figure 4. The Role of Perceptual Principles and Experience in Determining Which Area is Figure a. Demonstration: Finding faces in a landmark


IV.

Test Yourself 5.1

V.

Perceiving Scenes and Objects in Scenes A. Perceiving the Gist of a Scene 1. Masking – Fei-Fei et al. (2007) a. Method: Using a Mask to Achieve Brief Stimulus Presentations 1. Persistence of vision 2. Visual masking stimulus 2. Global image features a. Oliva and Torralba (2001, 2006) i. Degree of naturalness ii. Degree of openness iii. Degree of roughness iv. Degree of expansion v. Color b. Global image features are holistic and rapidly perceived B. Regularities in the Environment: Information for Perceiving 1. Physical Regularities a. Horizontal and vertical orientations: Oblique effect b. “Light-from-above” assumption 2. Semantic Regularities a. Demonstration: Visualizing scenes and objects b. Scene schema C. The Role of Inference in Perception 1. Helmholtz’s theory of unconscious inference a. Likelihood principle 2. Bayesian inference a. Prior probability b. Likelihood

VI.

Test Yourself 5.2

VII.

Connecting Neural Activity and Object/Scene Perception A. Brain Responses to Perceiving Faces and Places 1. Binocular Rivalry B. Spotlight on the Parahippocampal Place Area 1. Parahippocampal cortex 2. Spatial layout hypothesis 3. Space defining objects 4. Space ambiguous objects


C. Neural Mind Reading 1. Method: Neural Mind Reading 2. Kamitani and Tong (2005) a. Created an “orientation decoder” using fMRI voxel patterns 3. Naselaris et al. (2009) a. Structural encoding b. Semantic encoding VIII.

Something to Consider: Are Faces Special? A. Importance of Face perception: 1. Faces are pervasive 2. Important for social interaction B. Specialized and widespread processing of faces 1. Occipital Cortex 2. FFA 3. Amygdala 4. Superior Temporal Sulcus 5. Frontal Cortex

IX.

Developmental Dimension: Infant Face Perception A. 8 weeks - infants can detect faces B. 3-4 months – infants can discriminate expressions C. 2 day old infants recognize/prefer their mother’s face Adult-level face recognition develops in adolescence or early adulthood

X.

Test Yourself 5.3

XI.

Think About It

XII.

Key Terms

Learning Objectives At the end of the chapter, the student should be able to: 1. Discuss reasons why object perception is challenging. 2. Discuss Gestalt psychology and the laws of perceptual organization. 3. Explain figure-ground segregation, identify the properties of figure and ground, and describe the gist of a scene. 4. Describe the “oblique effect” and the “light-from-above” assumption, and discuss how these exemplify the relationship between environmental regularities, physiology, and perception. 5. Discuss research on semantic regularities in scenes. 6. Discuss Helmholtz’s theory of unconscious inference, and Bayesian inference. 7. Identify the rationale for and use of masking stimuli.


8. Describe orientation, structural, and semantic “decoders”, how they are established, and what they can be used for. 9. Analyze the evidence indicating that faces are “special” stimuli, and describe the development of face recognition in infants.

Chapter Overview/ Summary Chapter 5 introduces the topic of object perception by discussing how robotic vehicles need the ability to “recognize” objects to go through a course with various obstacles, and how difficult it is for computers to do object recognition tasks that humans can easily accomplish. There are three major reasons that object perception is challenging: (1) the stimulus on the receptors is ambiguous, as demonstrated by the inverse projection problem; (2) objects can be hidden or blurred; and (3) objects look different from different viewpoints. So, given these challenges, how do humans perceive objects? The first psychologists to formally study this were the Gestalt psychologists. Arising as a reaction to the structuralist approach of analyzing sensations, the Gestaltists believed that “the whole is different than the sum of its parts.” Gestaltists expanded on this theory by identifying grouping laws of perceptual organization, such as good continuation, Pragnanz, similarity, proximity, common fate, common region, and uniform connectedness. Another important Gestaltist issue was perceptual, or “figure-ground,” segregation: how do we segregate objects from the background? The basic properties of figure-ground are that the figure appears more “thing-like,” is more memorable, is seen as being closer, and “owns” the border between the figure and ground. Factors that determine what is seen as the figure are: orientation, upper vs. lower region, and meaningfulness. Perceiving scenes and objects in scenes has been investigated by using brief presentations of a scene to see if the observer can get the “gist” of the scene. “Global image features” have been identified that enable the observer to get the gist of the scene. Researchers have also studied the environmental regularities that our visual system is most likely to respond to. There are physical regularities and semantic regularities.
The importance of horizontal and vertical orientations as physical regularities has been shown by research by Coppola et al. (1998) and by studies of the oblique effect. Humans’ use of the “light-from-above” assumption to disambiguate changes in lightness and darkness also exemplifies physical regularities. Semantic regularities are supported by research that shows the effect of context on object perception and expectations about the scene. Helmholtz’s theory of unconscious inference is then cited to discuss the role of inference in perception, as are the likelihood principle and Bayesian inference. The physiology of object perception has also been studied: Tong et al. (1998) demonstrated preferential firing of neurons in the PPA and FFA. These findings indicate that neural activation is not driven solely by external stimuli but is affected by the interpretation of the stimulus. The connection between brain activity and perception was furthered by research aimed at associating fMRI-identified voxels with visual stimuli. These lines of research resulted in the development of orientation, structural, and semantic “decoders”. The chapter ends with a more in-depth discussion of face processing. First, evidence that face processing is “special” is presented; the evidence includes findings that faces constitute an important part of our social


interactions, that face processing is associated with specialized brain structures and preferential processing, and that face processing is sensitive to inversion effects whereas other stimuli are less affected (similar results have been found using negative images). Second, the development of face processing is discussed. Infants appear to have some ability to discriminate between faces within two days of birth. Interestingly, this discrimination may be based on the contrast between the forehead and hairline. Adult-level face recognition does not develop until adolescence or early adulthood.
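The Bayesian inference idea in the overview (posterior = prior probability × likelihood, normalized) can be made concrete for students with a short numeric sketch. The numbers and object labels below are invented purely for illustration and are not taken from the text:

```python
# Hypothetical illustration of Bayesian inference in perception.
# Two candidate interpretations of an ambiguous retinal image: a common
# object ("mug") vs. an unusual object that happens to project a similar
# image. All probabilities are invented for this classroom sketch.

priors = {"mug": 0.90, "unusual object": 0.10}       # prior probabilities
likelihoods = {"mug": 0.60, "unusual object": 0.80}  # P(image | interpretation)

# Bayes' rule: posterior is proportional to prior x likelihood; normalize
# so the posteriors sum to 1.
unnormalized = {h: priors[h] * likelihoods[h] for h in priors}
total = sum(unnormalized.values())
posterior = {h: unnormalized[h] / total for h in unnormalized}

print(posterior)  # the common interpretation wins despite a lower likelihood
```

This mirrors the textbook’s point: even when an unlikely object fits the image slightly better, the perceptual system’s strong prior for familiar objects dominates the inference.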

Demonstrations, Activities, and Lecture Topics (1) Figure/Ground and Logos: A common application of figure/ground reversals is in many company logos. For example, the FedEx logo contains an arrow between the last “E” and the “x.” Most students are unaware of this, even though they have seen the logo numerous times. Other logos that have arrows in figure/ground patterns are the logos for McLane Food Distribution and Equal Exchange Coop. Ravenswood Winery has a reversible figure/ground stimulus as its logo (the ravens are actually less likely to be seen as the figure at first glance). One of the cleverest examples was the previous Big Ten Conference logo. When Penn State was added to the conference, there were 11 teams in the Big Ten: the logo reflected this by having the white area around the letter “T” appear as an “11” when seen as the figure. In 2011 the conference expanded to include the University of Nebraska – Lincoln. At that point, the logo was redesigned so that it was just the word “BIG”, but the white area between the “I” and “G” can be seen as the number “10”; this preserves the full name without explicitly referencing the number of member institutions. (2) Camouflage as a Case of Visual Segregation: The natural world provides us with a fair share of examples in which figure-ground segregation can be difficult. To reinforce the concepts related to figure/ground issues and Gestalt principles, students could examine good examples of camouflage in light of those concepts. One web resource for animal examples of camouflage can be found at “Dr. Dave’s House of Fun” on the “Invisible Animals” page. (3) Meaningfulness, Gestalt Laws, and Perceptual Segregation: Two other examples of how meaningfulness and familiarity affect object perception can be presented. One is to show students an example of flame-painting techniques, used most often on cars. Students with more familiarity with these techniques will easily see the “flames” as the figure; less car-oriented students will most likely see “tadpole” shapes as the figure. For examples of flame-painting, go to the Kustom Flames website. A second example is to present the numbers “25” and/or “52” using a traditional LCD font (e.g., digital clock display). No matter how close you get the numbers, students will still recognize the number, showing that meaningfulness outweighs closure. You can make the point that people who were not yet familiar with LCD displays might have been more likely to perceive the stimulus as a unitary figure (like a vase in the case of “25”).


(4) Figure/Ground “Transposition” and Cognitive Therapy: Rian McMullin, a clinical psychologist, has proposed that learning to see the different percepts in figure-ground and other bi-stable stimuli can generalize to more flexible and adaptive thinking patterns. Beyond some anecdotal evidence, actual research support is not provided, but it is still an interesting application, especially for students who may have an interest in counseling or clinical psychology. McMullin, R. E. (2000). The new handbook of cognitive therapy techniques. New York: Norton. (5) Book suggestions: Chapter 4 in Block and Yuker’s “Can You Believe Your Eyes?” is a great resource for examples of figure-ground reversible figures. “The Mediterranean Sea” (Figure 4.1) example is a great way to introduce object perception to your class. Block and Yuker also devote a chapter to illusory contours, and another to perceptual organization, providing examples for many Gestalt laws. Roger Shepard’s “Mind Sights” is also an excellent reference for figure-ground stimuli. Block, J. R., & Yuker, H. (1989). Can you believe your eyes? Bristol, PA: Brunner/Mazel. Shepard, R. N. (1990). Mind sights. New York: Freeman. (6) “DISNEP” and Other Context/Priming Effects: One of the topics in the chapter discusses context effects in relation to perceptual intelligence. You can present some additional, classic examples of context effects, such as the “13”/“B” figure and “THE CAT” figure (Figures 15.1 and 15.3, respectively) in Block and Yuker (1989). A similar effect can be accomplished by downloading the “Disney font”. The “Y” will be perceived as a “P” if the context is appropriate.
For example, if you type “DEEY” in Disney font, students will read it as “DEEP,” and “SNIY” will be read as “SNIP.” If you then present “DISNEY,” most students will read it as “DISNEY,” but occasionally a priming effect will occur and they will read it as “DISNEP.” Shepard (1990) also has the unambiguous versions of the “woman’s face” and “saxophone player” that can be used to prime the ambiguous version of “Sara Nader” (Figures III-4 and III-5) to show the top-down effects on figure-ground. (7) Regions and Figure/Ground: Vecera’s discovery of the lower vs. upper region determinant of figure-ground is discussed in the textbook. If you want to expand on the text material, Vecera (2004) describes an interesting follow-up study. Vecera tested the influence of reference frame in determining figure-ground by manipulating the observer’s head position. The article also highlights the differences between theories of figure-ground. Vecera, S. P. (2004). The reference frame of figure-ground assignment.


Psychonomic Bulletin & Review, 11(5), 909-915. Suggested Websites Sandlot Science This website is geared toward a general audience, but includes numerous illusions relevant to concepts presented in this chapter. From the homepage select “The Mother of all Site Maps” and scroll down to “Ambiguous Figures” and “Camouflage”. Optical Illusions and Visual Phenomena – Michael Bach This is another great website for illusions that will be cited in multiple chapters. This website is particularly useful in that it often provides explanations and citations for original sources. Go to the heading of “Gestalt” to find examples of the Dalmatian hidden figure, the effect of context on viewing the Kanizsa triangle, and the effect of blur on perception of a figure-ground hidden message. The “Specialties with faces” section provides several examples exploiting the tendency to interpret stimuli as faces and some special properties of face perception (e.g., the inversion effect with the Thatcher Illusion). M. C. Escher The official Escher website has a gallery of works that show Escher’s use of reversible figure-ground. Click on the “Symmetry” gallery for numerous examples. Photomosaics – Robert Silvers In discussing the importance of faces and face perception, examples of our natural tendency to see faces could be discussed. In “photomosaics,” global pictures - of a face, for instance - are created from numerous smaller pictures. TV Series Scene – The X-Files: Clyde Bruckman’s Final Repose (16:24-17:10): In this scene from an episode from Season 3 (and one of the finest episodes of the series), FBI agents Mulder and Scully are by a lake, when Mulder sees a propane tank and says it looks like a “Nazi stormtrooper,” and asks Scully if she concurs. Scully, always the rational scientist, gives a brief description of how top-down processing affects object perception. Mulder then asks Scully whether she has actually answered his question.
Scully admits, “Yes, it does look like a stormtrooper,” but maintains that this supports her explanation. No description of the scene is needed, as long as students have some familiarity with the series.



CHAPTER 6: VISUAL ATTENTION

Chapter Outline I.

Introduction A. Some Questions We Will Consider

II.

Scanning a Scene A. Demonstration: Looking for a Face in the Crowd B. Mechanics of Scanning 1. Fixation 2. Saccadic eye movement 3. Overt attention 4. Covert attention

III.

What Directs Our Attention? A. Visual Salience 1. Demonstration: Attentional Capture 2. Saliency maps B. Cognitive Factors 1. Scene Schemas a. Shinoda et al. (2001) traffic sign simulator study 2. Observer Interests and Goals a. Observing eye movements 3. Task-Related Knowledge a. Eye-movements determined primarily by task b. “Just in time” strategy

IV.

What Are the Benefits of Attention? A. Attention Speeds Responding 1. Speeding Responding to Locations a. Spatial attention b. Method: Precueing i. Valid and invalid trials 2. Speeding Responding to Objects a. Same-object advantage B. Attention Can Influence Appearance 1. Perceptual response 2. Speed of responding 3. Perceived contrast (e.g., Carrasco et al., 2004) C. Attention Can Influence Physiological Responding 1. Attention to Objects Increases Activity in Specific Areas of the Brain 2. Attention to Locations Increases Activity in Specific Areas of the Brain 3. Attention Synchronizes Neural Activity Between Areas of the Brain



a. Local Field Potential (LFP) V.

Test Yourself 6.1

VI.

Attention and Experiencing a Coherent World 1. Why Is Binding Necessary? a. Binding problem 2. Feature Integration Theory (FIT) a. Preattentive stage b. Focused attention stage c. FIT and Divided Attention i. Illusory conjunctions ii. Balint’s syndrome d. FIT and Visual Search i. Conjunction search ii. Demonstration: Search for Conjunctions e. FIT and Top-Down Control

VII.

What Happens When We Don’t Attend? A. Inattentional Blindness B. Change Blindness 1. Demonstration: Change detection a. Continuity errors C. Is Attention Necessary for Perceiving Scenes? 1. Evidence That Perception Can Occur Without Attention a. Dual task procedures – Li et al. (2002) b. Central vs. peripheral task 2. Evidence That Perception Requires Attention a. Cohen et al. (2010)

VIII.

Distraction A. Distraction and Task Characteristics 1. Effect of task-irrelevant stimuli B. Attention and Perceptual Load 1. Load theory of attention a. Perceptual capacity b. Perceptual load c. Low- and high-load tasks

IX.

Something to Consider: Distracted Driving A. 100-Car Naturalistic Driving Study (Dingus et al. 2006) B. Cell-phone study (Strayer and Johnston 2001)



X.

Developmental Dimension: Attention and Perceptual Completion A. Object unity B. Method: Habituation 1. Habituation and dishabituation C. Studies 1. Kellman and Spelke (1983) 2. Slater et al. (1990) 3. Johnson and Aslin (1995) 4. Johnson et al. (2004) a. Perceivers vs. non-perceivers

XI.

Test Yourself 6.2

XII.

Think About It

XIII.

Key Terms

Learning Objectives At the end of the chapter, the student should be able to: 1. Define attention and explain why attention is necessary. 2. Identify the factors that determine where a person looks in a visual scene with supporting examples from research studies. 3. Describe Posner et al.’s (1978) precueing task and what it revealed about attention. 4. Explain the same-object advantage and the implications for attention. 5. Describe the link between neural responding and attention. 6. Define change detection, change blindness, and continuity errors. 7. Describe how the dual-task procedure has been used to determine the role of attention in scene perception; also discuss findings from those studies. 8. Describe the load theory of attention and how it accounts for the interaction between task difficulty and distraction. 9. Compare and contrast feature and conjunction searches. 10. Discuss methods and findings from developmental studies of attention and perceptual completion. Chapter Overview/ Summary Chapter 6 begins with a discussion of the need for attention to regulate the overwhelming amount of information in the environment. It is necessary to focus on “interesting” or “important” aspects of the environment. A brief overview of eye movements follows, as they are one observable measure of attention. The factors involved in determining where we look in a visual scene are then discussed. These are: (1) stimulus salience; (2) knowledge about the scene; and (3) task requirements (as exemplified by scanning when making a sandwich) and predictions about dynamic interactions involved in the task (as


exemplified by Shinoda et al.’s driving simulation study). The next section of the chapter addresses how attention enhances perception. Responses to attended locations (spatial attention; Posner et al., 1978) are faster, as are responses to locations within attended objects (e.g., Egly et al., 1994). Attending to a stimulus can also affect perceived stimulus contrast (Carrasco et al., 2004). In addition to behavioral effects, attention is reflected in physiological responses. First, when images are superimposed, attending to one will lead to increased activation in the relevant brain region. For example, “focusing” on a house that has a face superimposed on it will lead to increased activation in the PPA, but not the FFA. Second, covert “movement” of attention can be used to create an “attention map” corresponding to locations in a scene. Those maps can be used to predict where someone is attending in the scene. Finally, attention can shift the location of a neuron’s receptive field. Perception can also be affected by a lack of focused attention: this is demonstrated by “inattentional blindness” and “change detection.” Inattentional blindness occurs when observers fail to report an unattended stimulus even though they look right at it. Simons and Chabris (1999) demonstrated this by showing a film of people playing basketball, during which a person in a gorilla suit walks through. Observers rarely report seeing the “gorilla.” In the change detection task, any difference between two successively presented scenes is reported; the difficulty in doing this is termed “change blindness” (e.g., Rensink, 2002). Change blindness is related to the concept of continuity errors, evident in many Hollywood films. Even when observers are forewarned that changes in the scenes will occur, change detection is still difficult. With evidence for the importance of attention in stimulus processing, some wonder if attention is required for the perception of any complex stimulus.
The dual-task procedure has been used to investigate this question with regard to scene perception (here “scene” is loosely defined). Results of a series of studies suggest that some attention may be required for scene perception. A discussion of the interaction between task difficulty and the processing of irrelevant stimuli helps explain the mixed results found in scene perception research. When a task is difficult, irrelevant stimuli are less likely to be detected/distracting than when the primary task is easy. This finding is explained by the load theory of attention and can be related back to the scene-attention research. In addition to being necessary for basic detection and recognition, attention is needed to experience a “coherent” world. Perception seems to provide a cohesive experience, but that only comes about via the integration of many information sources; this is referred to as the binding problem. Treisman’s feature integration theory states that features are extracted in a preattentive stage and then “glued” together in the focused attention stage. Illusory conjunctions are used as support for this binding function of attention, as is the functioning of individuals with Balint’s syndrome. Visual search for “real” conjunctions also highlights the importance of binding. The “Something to Consider” section of the chapter addresses research on distracted driving and the use of cell phones while driving. The chapter concludes with a discussion of the relationship between infant movement perception, eye movements/attention, and object perception (perceptual completion). An infant’s ability to perceive objects as unified has been tested using occluded rods as stimuli. The perception of a unified rod does occur by 4 months, but movement of the rod is key to this perception. Johnson showed the importance of eye movements in developing this ability.
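The precueing logic described in the overview (valid vs. invalid trials; Posner et al., 1978) can be illustrated for students with a minimal analysis sketch. The reaction times below are hypothetical numbers chosen only to show how a validity effect is computed from trial data, not results from any study:

```python
# Hypothetical illustration of the Posner precueing analysis.
# Reaction times (ms, invented) from "valid" trials (target at the cued
# location) and "invalid" trials (target elsewhere).

valid_rts = [245, 260, 238, 252, 247, 255]
invalid_rts = [290, 305, 282, 298, 310, 287]

def mean(xs):
    return sum(xs) / len(xs)

# The classic pattern: valid-cue trials yield faster responses, and the
# invalid-minus-valid difference estimates the benefit of spatial attention.
validity_effect = mean(invalid_rts) - mean(valid_rts)
print(f"valid: {mean(valid_rts):.1f} ms, invalid: {mean(invalid_rts):.1f} ms, "
      f"benefit: {validity_effect:.1f} ms")
```

A classroom version of this could collect students’ own keypress times under valid and invalid cues and run the same two-line comparison.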



Demonstrations, Activities, and Lecture Topics (1) Video Suggestion: “The Mind’s Eye: How the Brain Sees the World” has a clever example of change blindness. Participants walk up to a counter where the researcher presents the consent form. The researcher then ducks down below the counter to get something, and a second researcher replaces the first. Very few participants notice the change! The film includes the reaction of the participants when they are informed of the change. Video is available from Films for the Humanities & Sciences (KWX11458-KS for DVD). (2) Attention Operating Characteristics (AOCs): Ronald Kinchla developed a way of analyzing the results of attention experiments that is similar to signal detection theory. If you cover Appendix D earlier in the course, you can show the similarities between ROCs and AOCs. The references below expand on Kinchla’s pioneering work. Kinchla, R. E. (1980). The measurement of attention. In Nickerson, R. S. (Ed.), Attention and Performance VIII. Hillsdale, NJ: Erlbaum. Sperling, G., & Melchner, M. J. (1978). The attention operating characteristic: Examples from visual search. Science, 202, 315-318. Sperling, G., & Dosher, B. A. (1986). Strategy and optimization in human information processing. In K. Boff, L. Kaufman, & J. Thomas (Eds.), Handbook of Perception and Performance, Vol. 1, 2-1 – 2-65. New York: Wiley. Sperling, G., Wurst, S. A., & Lu, Z-L. (1993). Using repetition detection to define and localize the processes in selective attention. In Meyer, D. E., & Kornblum, S. (Eds.), Attention and Performance XIV. Cambridge, MA: MIT Press. (3) How Did They Do That?: Attentional Blink: An additional phenomenon that has generated much research is the attentional blink, in which a second target stimulus is not reported in RSVP (rapid serial visual presentation) tasks.
The idea is that attention directed to the first target stimulus uses up attentional resources for about 500 ms, in which time the second target is presented. Jane Raymond is a prominent researcher on the attentional blink. She has graciously submitted a narrative of her discovery of the effect: “At about the time we discovered the ‘attentional blink’ in the lab, I was actually doing a lot of work on visual motion perception, thinking about visual information that changes in both space and time and is perceived in episodes. Attention research at this time was tightly focused on spatial attention, the spotlight metaphor, and how we select information from static scenes. I wondered if attention, the spotlight, could also operate in a space-time framework like
motion. Could the spotlight work in episodes, switching on and off? These ideas were the beginning of the conceptual groundwork for the attentional blink.

“During this time period, I was becoming too busy to do my own computer programming in the lab. I had little kids at home, too many students, university committees, etc. My life just wasn’t compatible with zoning out for three or four days at a time writing computer code. So I decided to try one of the new (at that time) high-level software packages that could generate behavioural experiments more easily. As a test, I decided to re-do a simple experiment reported by Weichselgartner and Sperling (1987) that involved presenting a rapid sequence of letters. Moreover, their procedure seemed well suited for looking at how attention is used across time, something I was developing a curiosity about. However, their procedure seemed overly complicated to me, so I slimmed it down, devising a simple dual-task RSVP procedure optimal for probing attention in time. I left it with my student, Karen Arnell, and went off for a holiday. When I came back, she had collected a stunning set of data that we realised very quickly was something important. It turned out that the software package was terrible (its timing was hopeless); we had to redo the programming (back to creating code from scratch) and run the experiments all over again. So much for shortcuts! After many hours of discussion with Kim Shapiro and Karen Arnell, we hammered out the critical control experiments and developed an interpretation of what we had found. The name, the attentional blink, came to me because we were also doing some eye movement studies of a visual perception task … and noticed that people habitually blink just when a trial ends. It seemed to me that when an attentional episode is over, the attentional system does just what the eyes do—blink!”

Raymond, J. E., Shapiro, K. L., & Arnell, K. M. (1992).
Temporary suppression of visual processing in an RSVP task: An attentional blink? Journal of Experimental Psychology: Human Perception and Performance, 18, 849-860.

Raymond, J. E. (2003). New objects, not new features, trigger the attentional blink. Psychological Science, 14, 54-59.

Shapiro, K. L., Arnell, K. M., & Raymond, J. E. (1997). The attentional blink. Trends in Cognitive Sciences, 1, 291-296.

(4) Feature Integration Theory: The best way for students to understand Treisman’s feature integration theory is to be an observer in a visual search task. The CD-ROM has a module in which the student can search for different targets to demonstrate the difference between pre-attentive processes and focused attention.

(5) Brain Scanning Studies of Attention: For a great reference on the physiology of attention, get Attention & Performance XX. Part 3 includes many reviews in this area, including a chapter by Hillyard, who is a leading researcher in this field.



Hillyard, S. (2004). Imaging of visual attention. In Kanwisher, N., & Duncan, J. (Eds.), Attention & Performance XX. Oxford: Oxford University Press.

Osman also has a great chapter on the physiology of attention in a cognitive science book.

Osman, A. (1998). Brainwaves and mental processes: Electrical evidence of attention, perception, and intention. In Scarborough, D., & Sternberg, S. (Eds.), An invitation to cognitive science, Volume 4: Methods, models, and conceptual issues (2nd edition). Cambridge, MA: MIT Press.

(6) Annual Review of Psychology: A great place to get an overview of the major issues in any topic area is the Annual Review of Psychology. Several chapters have been devoted to attention. Some of the topics included in these chapters match the text material, such as early vs. late selection, theories of attention, and processing of non-attended stimuli. Some specific resources are provided below.

Posner, M. I., & Rothbart, M. K. (2007). Research on attention networks as a model for the integration of psychological science. Annual Review of Psychology, 58, 1-23.

Pashler, H., Johnston, J. C., & Ruthruff, E. (2001). Attention and performance. Annual Review of Psychology, 52, 629-651.

Egeth, H. E., & Yantis, S. (1997). Visual attention: Control, representation, and time course. Annual Review of Psychology, 48, 269-297.

Johnston, W. A., & Dark, V. J. (1986). Selective attention. Annual Review of Psychology, 37, 43-75.

Kinchla, R. E. (1992). Attention. Annual Review of Psychology, 43, 711-742.

(7) Eye Movements and Attention: The textbook reviews some research on scanning and attention, but instructors may want to expand on this fascinating area. The classic work in this area is by Yarbus (as cited in Chapter 6). Noton and Stark are credited with coining the term “scan paths,” a concept mentioned in Chapter 6. Another major researcher in this area is Eileen Kowler of Rutgers University; a sample of her work is also listed below.

Gersch, T. M., Kowler, E., & Dosher, B. A. (2004). Dynamic allocation of visual attention during execution of saccades. Vision Research, 44, 1469-1483.

Kowler, E. (1995). Eye movements in visual cognition. In Kosslyn, S. M., & Osherson, D. N. (Eds.), An invitation to cognitive science: Volume 2 (2nd ed.). Cambridge, MA: MIT Press.


Noton, D., & Stark, L. (1971). Scanpaths in saccadic eye movements while viewing and recognizing patterns. Vision Research, 11, 929-942. Yarbus, A. L. (1967). Eye movements and vision. New York: Plenum. (8) Pop Culture and Change Blindness: Similar to the concept of continuity errors discussed in the chapter, there are many videos depicting how changes can go undetected. YouTube provides some great examples. The “colour changing card trick,” by “Quirkology” and “Test Your Awareness: Whodunnit?” by “dothetest” are excellent examples with many changes that easily go undetected. Also, puzzle books that require the observer to “spot the differences” between two (or more) pictures are very popular. “Life” magazine’s “Picture Puzzle” books are probably the most popular, and can easily be purchased at amazon.com. Many grocery store check-outs will also carry “Brain Games Picture Puzzles,” published by Publications International Ltd. The underlying challenge here is directly related to change blindness. And lastly, a 2008 New York Times article by Natalie Angier entitled “Blind to Change, Even as It Stares Us in the Face” provides an accessible summary of change blindness research.

Suggested Websites

Arrington Research: Eye Tracking Systems
If you’re interested in the technical aspects of different types of eye-tracking devices, Arrington Research is one manufacturer that gives details about its products.

Bottom-Up Visual Attention Home Page
This website provides details about the saliency model, movies demonstrating eye movements in a display, images depicting model results for predicting eye movements in scenes, an interactive version of the model, and relevant research. It is an excellent resource.

Daniel Simons – Home Page
Simons’ website contains brief clips of his inattentional blindness research as well as ordering information for DVDs (“Surprising Studies of Visual Awareness”) from Viscog Productions Inc. A short description of Simons’ research and a personal blog are also provided on the site.

Viewable/Downloadable Examples of Change Blindness
Ronald Rensink’s website has numerous examples of change blindness. The site also contains descriptions of his research projects and publications.



CHAPTER 7: TAKING ACTION

Chapter Outline

I. Introduction
   A. Some Questions We Will Consider

II. The Ecological Approach to Perception
   A. Ecological validity
      1. J. J. Gibson: study perception as people move through the environment
   B. The Moving Observer Creates Information in the Environment
      1. Optic flow: movement of elements in a scene relative to the observer
         a. Gradient of flow
         b. Focus of expansion (FOE)
      2. Invariant information
   C. Self-Produced Information
      1. Bardy and Laurent (1998): somersault study
   D. The Senses Do Not Work in Isolation
      1. Demonstration: Keeping Your Balance

III. Staying on Course: Walking and Driving
   A. Walking
      1. Visual direction strategy
      2. Spatial updating
   B. Driving a Car
      1. Land and Lee (1994)

IV. Wayfinding
   A. The Importance of Landmarks
      1. Decision-point landmarks vs. non-decision-point landmarks
   B. The Brain’s “GPS”
      1. Place cells and place fields
      2. Head direction cells
      3. Border cells
   C. Individual Differences in Wayfinding
      1. Maguire et al. (2006): London bus and taxi drivers study

V. Test Yourself 7.1

VI. Acting on Objects
   A. Affordances: What Objects Are Used For
      1. Humphreys and Riddoch (2001): patient M.P. with temporal lobe damage
   B. The Physiology of Reaching and Grasping
      1. The dorsal and ventral pathways
      2. The parietal reach region (PRR)
         a. Visuomotor grip cells

VII. Observing Other People’s Actions
   A. Mirroring Others’ Actions in the Brain
      1. Mirror neurons
      2. Audiovisual mirror neurons
      3. Mirror neuron system
   B. Predicting People’s Intentions
      1. Iacoboni et al. (2005): fMRI study, action vs. intention
      2. Mirror neurons play an important role in social interactions (e.g., Yoshida et al., 2011)

VIII. Something to Consider: Action-Based Accounts of Perception
   A. Action-specific perception hypothesis (Witt, 2011)
      1. Judgment bias: expectation vs. perception
      2. The Ecological Approach to Visual Perception (1979), J. J. Gibson

IX. Developmental Dimension: Imitating Actions
   A. Meltzoff studies: children imitate adult facial expressions and actions

X. Test Yourself 7.2

XI. Think About It

XII. Key Terms

Learning Objectives

At the end of the chapter, the student should be able to:

1. Discuss the historical background of the ecological approach to perception.
2. Apply the concepts of optic flow (such as invariance and the FOE) to a person driving a car.
3. Describe how people use flow information when driving, and the research showing that people do not need flow information when walking.
4. Discuss the role of landmarks in wayfinding, and the Gibsonian term “affordances.”
5. Identify the neurons and brain areas believed to be responsible for navigation, and discuss findings from patients with brain damage in relation to wayfinding and affordances.
6. Describe the procedure, results, and conclusions of studies on the physiology of reaching and grasping in both monkeys and humans.
7. Explain what mirror neurons are and how they were discovered.



8. Debate why many researchers focus on perception as a tool for action and survival instead of focusing only on perception as a tool for developing mental representations.
9. Discuss the ways in which ability and expectations can affect perception.

Chapter Summary / Overview

The topic of Chapter 7 is “Taking Action”; it details the interaction between perception and action, primarily in the context of an ecological approach to perception. J. J. Gibson, in his early work on pilots’ ability to land planes, believed that laboratory studies of vision were too artificial, and that the question should be “What information do we use as we move through the environment?” He answered this by identifying the optic flow patterns humans use as they move. The characteristics of optic flow are described in relation to the gradient of flow and the focus of expansion. Optic flow produces invariant information as the observer moves through the environment that can be used to judge speed and heading.

Our own movements create changing sources of information that must be responded to. This can be appreciated in the context of a gymnast doing somersaults; updated visual information gathered in the midst of the somersault can be used to make adjustments and complete the maneuver. In short, the senses do not work in isolation. This concept is also illustrated by Lee and Aronson’s “swinging room” studies, first done with children and then with adults. The manipulation demonstrated that flow patterns can be created in the swinging room, and that participants compensated based on that flow information.

There are also situations where we don’t necessarily need flow information. “Blind-walking” studies show that we can make adjustments while walking in the environment with our eyes closed. Thus, flow information appears to be important, but it is not used for all locomotion tasks. Landmarks provide another information source used in wayfinding. Evidence suggests that the landmarks most vital for wayfinding are looked at more often and are more likely to be remembered. Case studies reveal that directional ability and landmark identification may be dissociable.

From here, the discussion transitions to acting on objects. Gibson’s concept of “affordances” suggests that stimuli contain information that indicates how a stimulus can be used. In line with the focus on acting on objects, the physiology of reaching and grasping is discussed. The parietal reach region (PRR) in monkeys has demonstrated sensitivity not only to reaching behavior, but also to grasping. Interestingly, another class of neurons, visuomotor grip cells, respond both to the sight of a graspable object and when actually moving to grasp the object. Research with human participants has demonstrated that the parietal region is used not only for directing movement, but also for avoiding collisions.

The discovery of mirror neurons has contributed considerably to our knowledge of how we watch other people take action. Mirror neurons are not simply motion detectors; they have been shown to reflect intentions and expectations (e.g., what actions other people may take). The “Something to Consider” section addresses why the approach to the study of perception has shifted from focusing primarily on the formation of mental representations to the study of perception and action. Several experiments are discussed that demonstrate how perception is affected by our abilities and by our expectations about the task itself.
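The optic-flow ideas in this summary (focus of expansion, gradient of flow) can also be demonstrated numerically for programming-inclined students. The sketch below is a minimal pinhole-camera simulation of my own devising, not from the text: for an observer translating straight ahead, every projected point streams radially away from the FOE at the image center, and nearer points stream faster (the gradient of flow).

```python
# Toy optic-flow demo: an observer translates along the z-axis toward a
# set of scene points. Under pinhole projection (image coords x = X/Z,
# y = Y/Z), every point's image streams radially away from the focus of
# expansion, which for straight-ahead motion sits at the image center (0, 0).

def project(X, Y, Z):
    """Pinhole projection with focal length 1."""
    return X / Z, Y / Z

def flow_vector(X, Y, Z, dz=0.1):
    """Image displacement of a scene point after the observer moves
    forward by dz (equivalently, the point's depth Z shrinks by dz)."""
    x0, y0 = project(X, Y, Z)
    x1, y1 = project(X, Y, Z - dz)
    return x1 - x0, y1 - y0

points = [(-2, 1, 10), (2, 1, 10), (-2, -1, 10), (2, -1, 10), (0.5, 0.5, 5)]
for X, Y, Z in points:
    u, v = flow_vector(X, Y, Z)
    x, y = project(X, Y, Z)
    # (u, v) is proportional to (x, y): the pattern is radial expansion
    # away from the FOE, and the nearer point (Z = 5) flows fastest.
    print(f"point at image ({x:+.2f}, {y:+.2f}) flows ({u:+.4f}, {v:+.4f})")
```

Printing the vectors (or plotting them, if matplotlib is available) makes a quick lecture slide: the flow field expands radially from the FOE, exactly the pattern Gibson described for a moving observer.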



Demonstrations, Activities, and Lecture Topics

(1) J. J. Gibson and “Big Bangs in Perception”: Murray White (1987) identified the most cited authors and publications in perception, taking citations from numerous books and from the Social Science Citation Index and Science Citation Index. Although his results could be used for practically any chapter in the textbook, it seems appropriate to include them here to emphasize J. J. Gibson’s contribution to perception. Gibson had 249 citations (fourth behind Hubel at 532, Neisser at 261, and Stevens at 250), which exceeded Gregory (99), Boring (84), and Helmholtz (67). Three of Gibson’s books were also among the most cited publications; “The Senses Considered as Perceptual Systems,” for example, had 56 citations, placing it in a similar position to Julesz’s (1971) “Foundations of Cyclopean Perception” (52 citations) and Hubel and Wiesel’s Journal of Physiology article on receptive fields in the monkey striate cortex (54 citations).

White, M. J. (1987). Big bangs in perception: The most cited authors and publications. Bulletin of the Psychonomic Society, 25, 458-461.

(2) Strobe and Balance: An embellishment to the “Keeping Your Balance” demonstration is to test not only “eyes closed” vs. “eyes open” conditions, but also to use a strobe light (with the room lights off) to test intermediate levels of visual input. I typically set the strobe at 2.25 flashes/second, using a strobe from Electronic Brazing Company (Model #510-AL), which typically yields a significant difference in balance time between the “eyes closed” and “strobe” conditions, and between the “strobe” and “eyes open” conditions. I also impose a 2-minute limit on balancing time. Run as a repeated-measures experiment (if you have a lab section), it is also a nice example of the logic of repeated-measures designs and counterbalancing of treatment order. Note: BE CAREFUL! Alert anyone with possible sensitivity to strobe lights (e.g., epilepsy) not to do that condition! Also, keep a penlight handy for the strobe condition. Lastly, make sure that students do not struggle to keep their balance: hopping around on one foot to maintain balance can lead to injury. This risk is minimized if students simply put a foot down when they start to lose balance.

(3) Humans Catching a Fly Ball: Chapter 7 discusses research on skilled actions like somersaulting. Another example from sports is how baseball players catch fly balls. The following journal articles can be used to provide information for any baseball fans in your class. The Marken article also has good background information on many of the issues involved in catching.

Gray, R., & Sieffert, R. (2005). Different strategies for using motion-in-depth information in catching. Journal of Experimental Psychology: Human Perception & Performance, 31, 1004-1022.

Marken, R. S. (2001). Controlled variables: Psychology as the center fielder views it. American Journal of Psychology, 114, 259-281.


(4) Dogs Catching a Frisbee: In the previous edition of the textbook, Goldstein briefly discussed research that relates the strategies humans use in catching balls to how dogs catch Frisbees. This research is still relevant to this chapter. The two dogs in the study (a springer spaniel named Romeo and a border collie named Lilly) use the linear optical trajectory (LOT) strategy and the optical acceleration cancellation (OAC) strategy. Depending on the level of your students, the information in this article may be difficult, so you may want to lecture about the study rather than assigning it as a reading.

Shaffer, D. M., Krauchunas, S. M., Eddy, M., & McBeath, M. K. (2004). How dogs navigate to catch Frisbees. Psychological Science, 15, 437-441.

(5) Watching a Pitched Baseball: In addition to catching, “perception and action” research can be applied to watching a pitched ball. Obviously, batting is one situation where this occurs, but there is also research on how umpires watch the pitch. In regard to batting, a nice example of science and practice is the work of Tony Abbatine, who has worked with the New York Mets and high-caliber players such as Manny Ramirez. Articles about Abbatine have appeared in the New York Times (“2001 Baseball Preview; Mets Set Sights on New Techniques”), USA Today, and Baseball Weekly (“Vision specialist schools hitters on pitch tracking”). He is the founder of Frozen Ropes, a training program for batting that includes a visual component. Part of the program involves using cues from the pitcher’s body and characteristic seam spins on the baseball to recognize the type of pitch. Research on umpires by Ford (full citation below) is interesting for its methodology, as well as for the definitive application of its results to positioning umpires.

Ford, G. G., Gallagher, S. H., Lacy, B. A., Bridwell, A. M., & Goodwin, F. (1999). Repositioning the home plate umpire to provide enhanced perceptual cues and more accurate ball-strike judgments. Journal of Sport Behavior, 22, 2838.

(6) “Sport Vision”: The link between sports and perception is highlighted in this chapter, but a more debatable aspect of this topic is whether vision “exercises” can actually improve sports performance. The New York Times article “A little flabby around the eyeballs” addresses some of these issues, with anecdotal evidence and links to some practitioners’ websites. The article also describes four of these exercises that you can do in class to get students up and moving. The “apparatus” needed is easily obtainable (e.g., spaghetti, straws, beanbags, etc.).

(7) Mirror Tracing: Another classic demonstration of the role of vision in action is mirror tracing. A “mirror tracing”-type task is found on Walter Beagley’s Eye Lines software webpage. The participant tries to trace a star pattern, but the mouse’s lateral direction is opposite the cursor’s direction (e.g., you move the mouse to the left, and the cursor goes right). The difficulty of this task shows how disrupting visual information leads to motor difficulties.


(8) Video Suggestion: “Seeing beyond the obvious: Understanding perception in everyday and novel environments [#AAV-1343]” (1990), the NASA/University of Virginia collaboration cited in the depth perception chapter, also has nice segments about optic flow and “perception in novel environments,” such as hypergravity (highlighting the vision/vestibular interaction) and zero gravity.

(9) Inverted Goggles: Speaking of disrupting visual information, another famous study of sensorimotor processes is the “inverted vision” study first done by Stratton (1897). You can have your students experience the effect first-hand by getting inverting goggles. Jim Matiya, a psychology teacher in San Francisco, is a great source for these goggles. They can also be obtained from PsychKits, an online distributor.

(10) Proprioception Reading: A thorough treatment of proprioception was written by John Dickinson (1976). Among the interesting information is a review of research on the balancing ability of various athletes, including swimmers and wrestlers.

Dickinson, J. (1976). Proprioceptive control of human movement. Princeton, NJ: Princeton Book Company.

Suggested Websites

UFO World Cup Frisbee Dog Series
To introduce the Shaffer et al. (2004) study on the similarities between humans catching a fly ball and dogs catching a Frisbee, use this website or video clips of champion Frisbee-catching dogs. It doesn’t discuss how the dogs do it, but it’s a fun site to get students’ interest.

Centeye Optical Flow
Centeye is a company that makes sensors for flying robots. Their “Technology” page provides a tutorial on various aspects of optic flow that has applications in computer vision.

Pedestrian Detection: Mobileye
Another application of optic flow can be found in automobile systems. Mobileye’s Pedestrian Collision Warning (PCW) system uses optic flow as a secondary system intended to detect nearby stationary objects. Optic flow can also provide information for vehicle detection. In addition to discussing their applications and the principles fundamental to those processes, the site has a brief video clip that highlights their systems in action. This information can be found in the “Artificial Vision Technology” section of their website.

Action Research Laboratory
The “Action Research Lab,” located in England, has a wealth of information. Click on “Demonstrations of ARL Displays” for video clips of optic flow displays and applications to steering and braking.


VENlab – Brown University
This is the more general website for the Virtual Environment Navigation (VEN) Lab at Brown. Great mix of human and animal studies and robotics. On the “Research” page you can find more information about wayfinding, obstacle avoidance, and navigational strategies.

PandA Lab – Perception and Action Lab
The “PandA” Lab at Rensselaer has numerous ongoing research projects on perception and action, including vision and locomotion, affordance perception in actions, and the dynamics of steering.

Hank Virtual Environments Lab
This University of Iowa laboratory studies the connection between perception and action in real and virtual environments. Studies range from those testing children’s ability to safely cross a street on a bicycle to those investigating the connection between actual walking speed and perceived walking speed.

Center for the Ecological Study of Perception & Action
CESPA is the “Center for the Ecological Study of Perception and Action” at the University of Connecticut. This website describes the projects and researchers at this leading lab on the ecological approach to perception, and provides numerous links to other websites.

Movie Scene – Dr. Dolittle (1998) (Scene 8: “Hair of the Dog”; it is easier to get to the scene by going to Scene 9: “Animal Assembly” and going back a minute): In this scene, Eddie Murphy as Dr. Dolittle is reluctantly taking “Lucky” the dog to meet his family. Lucky has his head out the window and is watching the lines go by fast (and getting a little nauseated). Dolittle tells Lucky to look in the distance at the trees going by. This funny scene can then launch you into the concept of gradient of flow.



CHAPTER 8: PERCEIVING MOTION

Chapter Outline

I. Introduction
   A. Some Questions We Will Consider

II. Functions of Motion Perception
   A. Motion Provides Information About Objects
      1. Examples of animal survival
   B. Motion Attracts Attention
      1. Attentional capture
   C. Motion Helps Us Understand Events in Our Environment
      1. Event and event boundary
   D. Life Without Motion Perception
      1. Akinetopsia or “motion blindness”
         a. Patient L.M. (Zihl et al., 1983, 1991)

III. Studying Motion Perception
   A. When Do We Perceive Motion?
      1. Real motion
      2. Illusory motion
         a. Apparent motion
      3. Induced motion
      4. Motion aftereffects (e.g., waterfall illusion)
   B. Comparing Real and Apparent Motion
      1. Larsen et al. (2006) fMRI study
      2. Similar areas activated by real and apparent motion
   C. What We Want to Explain
      1. Motion perception is more than a stimulus crossing the retina

IV. Motion Perception: Information in the Environment
   A. J. J. Gibson and the “optic array”
      1. Local disturbances in the optic array
      2. Global optic flow

V. Motion Perception: Retina/Eye Information
   A. The Reichardt Detector (Reichardt, 1969)
      1. Output unit
      2. Delay unit
   B. Corollary Discharge Theory
      1. Signals From the Retina and the Eye Muscles
         a. Image displacement signal (IDS)
         b. Motor signal (MS)
         c. Corollary discharge signal (CDS)
      2. Behavioral Evidence for Corollary Discharge Theory
         a. Demonstration: Eliminating the Image Displacement Signal With an Afterimage
         b. Demonstration: Seeing Motion by Pushing on Your Eyelid
      3. Physiological Evidence for Corollary Discharge Theory
         a. Patient R.W.: lesions in the medial superior temporal (MST) area
         b. “Real motion neurons”

VI. Test Yourself 8.1

VII. Motion Perception and the Brain
   A. The Movement Area of the Brain
      1. Middle temporal (MT) area
      2. Coherence
   B. Effects of Lesioning, Deactivating, and Stimulating
      1. Method: Transcranial Magnetic Stimulation (TMS)
      2. Method: Microstimulation
   C. Motion from a Single Neuron’s Point of View
      1. The aperture problem
      2. Demonstration: Movement of a Bar Across an Aperture
      3. Solving the aperture problem

VIII. Motion and the Human Body
   A. Apparent Motion of the Body
      1. Shortest-path constraint
   B. Motion of Point-Light Walkers
      1. Perceptual organization
         a. Biological motion
      2. Brain mechanisms

IX. Something to Consider: Motion Responses to Still Pictures
   A. Implied motion
   B. Representational momentum

X. Developmental Dimension: Motion Preferences Among Newborn Babies
   A. Simion et al. (2008): innate ability to detect biological motion

XI. Test Yourself 8.2

XII. Think About It

XIII. Key Terms



Learning Objectives

By the end of the chapter, the student should be able to:

1. Discuss the major functions of motion perception and the ways in which we can perceive motion.
2. Describe real and apparent motion, what the aperture problem is, and how the visual system “solves” this problem.
3. Detail how lesioning, transcranial magnetic stimulation, and microstimulation have been used to study how neurons signal motion.
4. State the major principles of the corollary discharge theory of movement perception, and summarize the behavioral and physiological support for the theory.
5. Discuss behavioral and physiological research on implied motion, and relate this to the concept of representational momentum.
6. Discuss developmental research on motion preference in newborn babies.
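The corollary discharge comparator (objective 4 above) can be turned into a one-function classroom demo. The sketch below is my own toy formalization, not from the text: the comparator signals motion whenever exactly one of the two signals reaches it, and stays silent when the two cancel.

```python
# Toy corollary discharge comparator: motion is signaled when an image
# displacement signal (IDS) or a corollary discharge signal (CDS) reaches
# the comparator alone; when both arrive together (a normal eye movement
# across a stationary scene) they cancel and no motion is perceived.

def motion_perceived(image_displacement, corollary_discharge):
    """Return True when exactly one signal reaches the comparator."""
    return image_displacement != corollary_discharge

# Object moves while the eye is still: IDS only -> motion is seen.
print(motion_perceived(True, False))   # True
# Eye scans a stationary scene: both signals arrive and cancel -> no motion.
print(motion_perceived(True, True))    # False
# Afterimage during an eye movement: no image displacement, CDS only -> the
# afterimage appears to move, matching the chapter's demonstration.
print(motion_perceived(False, True))   # True
```

Pushing on your eyelid maps onto the first case (the image is displaced with no corollary discharge), and the afterimage demonstration onto the last, which is why both produce illusory motion.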

Chapter Summary / Overview

Perceiving motion is vital to daily activities such as playing sports, crossing the street, and understanding events in our environment. Considering the case of a woman who has akinetopsia can help us understand the importance of perceiving motion. Motion also attracts attention and provides information for form perception (camouflage exemplifies this concept). The challenge of dividing continuous streams of perceptual information into meaningful events is discussed. Although event boundaries can be difficult to determine, one common boundary marker is a change in object speed.

When do we perceive motion? There are four major instances: (1) real motion; (2) apparent motion (the basis for movies and message boards); (3) induced motion (a stationary object appears to move if another object moves); and (4) motion aftereffects (as demonstrated by the waterfall illusion). Larsen et al.’s (2006) research addressed the similarities between the perception of real and apparent motion using fMRI. Although motion perception seems like a simple process, even commonplace situations can be used to demonstrate the complexity of the task. Perception of motion is more than detecting movement across the retina.

J. J. Gibson’s ecological approach emphasizes that observers and objects move in the environment. Movement is perceived through changes in the “optic array.” These changes can be local disturbances or global optic flow. Another approach is to explain motion perception from a neurological standpoint. A relatively early proposal was the Reichardt detector. This detector can explain how we detect motion in a specific direction, but it does not explain why, for instance, you can look around at stationary objects without perceiving those objects as moving. A more complex explanation that takes eye movements into account is Corollary Discharge Theory. According to this theory, a “comparator” compares image movement on the retina to a motor command sent to the eye muscles. Movement is perceived only when one of these signals occurs by itself. Thus, perception of movement is better accounted for with multiple information sources. Behavioral evidence that supports this theory includes the perceived movement of an afterimage and the apparent motion produced by jiggling your eyeball with your finger. Patients with vertigo and the discovery of real-motion neurons provide physiological support as well.

From this point, the chapter transitions to a discussion of the brain areas associated with movement perception and special cases of movement perception. The brain area most often associated with movement perception is MT. One way it has been tested is by manipulating the coherence of moving dots on a screen; the more sensitive an area is to movement, the less coherence should be required to detect that movement. Research shows that as coherence is increased, MT cell firing increases, and that lesioning MT cells affects the level of coherence needed to determine the direction of movement. Microstimulation of MT also greatly affects movement perception and can even cause misperceptions.

Within MT, individual neurons code movement information. The aperture problem is related to the restricted “view” any individual neuron has of a scene. More specifically, when only a portion of a bar falls within a neuron’s receptive field, the full scope of the bar’s movement may not be apparent (e.g., a bar moving right while also moving up and down may appear to be moving only to the right if the top and/or bottom of the bar is outside the “aperture”). This problem is “solved” by the firing of MT neurons that, over a time course of about 140 ms, combine the signals from V1 neurons to correctly determine the direction of movement.

Next, the importance of being able to perceive the motion of people is discussed in terms of biological motion. In apparent motion, it has been demonstrated that the MT cortex is sensitive to possible human actions but not impossible actions (e.g., a fist passing seamlessly through a head). This bias does not occur for objects, such as boards. Johansson first showed that observers can perceive “biological motion” by watching point-light walkers. The superior temporal sulcus (STS) is the brain area that has been identified in perceiving biological motion. Newsome, using monkeys, showed that the MT cortex is vital to perceiving motion, and Grossman et al. (2005) used transcranial magnetic stimulation to support the role of the STS in perceiving biological motion.

The next section of the chapter addresses implied motion and motion responses to still pictures. Still pictures can imply motion that is about to occur, and this information can affect memory for the original stimulus. For example, a picture of a boy jumping from a fence is later remembered with the boy closer to the ground. This is an instance of representational momentum. The brain areas activated in perceiving implied movement are MT and MST. Finally, research on motion preferences among newborn babies is discussed. Findings support the idea that humans are born with the ability to detect biological motion.

Demonstrations, Activities, and Lecture Topics

(1) Video Suggestions: “The Mind’s Eye: How the Brain Sees the World” has a segment on Zihl et al.’s motion agnosia patient. This is highly recommended; students tend to have many questions about her situation, and the film answers most of them. The application of apparent movement to the creation of movies is explored in “The Movies Begin: A Treasury of Early Cinema 1894-1914” (Kino Video, 1994). “The Great Train Robbery” and “A Trip to the Moon” are included, along with a description of Muybridge’s, Lumière’s, and Edison’s early motion work. Preview the video first for anything you might consider objectionable (e.g., nudity, cockfights).
Newsome, using monkeys, showed that MT cortex is vital to perceiving the direction of motion, and Grossman et al. (2005) used transcranial magnetic stimulation to support the role of the STS in perceiving biological motion.

The next section of the chapter addresses implied motion and motion responses to still pictures. Still pictures can imply motion that is about to occur, and this information can affect memory for the original stimulus. For example, a picture of a boy jumping from a fence is later remembered with the boy closer to the ground. This is an instance of representational momentum. The brain areas activated in perceiving implied movement are MT and MST. Finally, research on motion preferences among newborn babies is discussed. Findings support the view that humans are born with the ability to detect biological motion.

Demonstrations, Activities, and Lecture Topics

(1) Video Suggestions: The Mind's Eye: How the Brain Sees the World has a segment on Zihl et al.'s motion agnosia patient. This is highly recommended; students tend to have many questions about her situation, and the film answers most of them. The application of apparent movement to the creation of movies is explored in The Movies Begin: A Treasury of Early Cinema 1894-1914 (Kino Video, 1994). "The Great Train Robbery" and "A Trip to the Moon" are included, along with a description of Muybridge's, Lumiere's, and Edison's early motion work. Preview the video first for anything you might consider objectionable (e.g., nudity, cockfights).
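As a supplemental activity for quantitatively inclined students, the Reichardt detector summarized in the chapter overview can be sketched in a few lines of code. This is a deliberately simplified illustration, not the textbook's formal model; the receptor responses and the one-step delay below are hypothetical values chosen for clarity.

```python
# Minimal Reichardt-style motion detector (simplified classroom sketch).
# Two receptors are separated in space; the signal from one is delayed and
# multiplied with the signal from the other. A rightward-moving stimulus
# reaches the left receptor first, so the delayed left signal coincides
# with the right receptor's response and the rightward subunit fires.

def reichardt_output(left, right, delay=1):
    """Correlate the delayed left signal with the right signal (rightward
    subunit) and vice versa (leftward subunit). A positive result signals
    rightward motion; a negative result signals leftward motion."""
    rightward = sum(l * r for l, r in zip(left[:-delay], right[delay:]))
    leftward = sum(r * l for r, l in zip(right[:-delay], left[delay:]))
    return rightward - leftward

# A bright spot moving rightward: it stimulates the left receptor at t=1,
# then the right receptor at t=2.
left_responses = [0, 1, 0, 0]
right_responses = [0, 0, 1, 0]

print(reichardt_output(left_responses, right_responses))   # positive: rightward
print(reichardt_output(right_responses, left_responses))   # negative: leftward
```

Running the sketch gives a positive output for the spot that reaches the left receptor first and a negative output for the reverse sequence, which is the directional selectivity the summary describes; note that a uniform shift of both signals (as produced by an eye movement over a stationary scene) would also drive the detector, which is exactly the limitation corollary discharge theory addresses.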



(2) Attention and Apparent Motion: This lecture topic links top-down processes with illusory motion. The following references investigate different stimuli but are related by their focus on attentional processes. Both works provide good background information on the mechanisms that might underlie the phenomenon of interest.

Downing, P. E., & Treisman, A. M. (1997). The line-motion illusion: Attention or impletion? Journal of Experimental Psychology: Human Perception and Performance, 23, 768-779.

Lu, Z.-L., & Sperling, G. (1995). Attention-generated apparent motion. Nature, 377, 237-239.

(3) Waterfall Illusion in Video Games: Since 2005, when the video game "Guitar Hero" became widely available, many people have firsthand experience with the waterfall illusion. A search for "Guitar Hero Illusion" or "Guitar Hero Vision Problem" yields numerous reports of individuals experiencing the illusion. The "Perceptionsense" blog, under the topic "Guitar Hero Illusion," provides a nice description of the phenomenon and a video of someone playing the game that can be used to induce the effect in the classroom.

(4) Biological Motion LIVE!: The computer-generated examples of biological motion are interesting, but nothing is better than a live performance! Purchase some "glow-in-the-dark" stars (available from many stores or amazon.com) and some extra tacky stuff ("poster putty"), recruit 2-4 student volunteers (who are not in the class), find some appropriate music and a dark room, and voilà! You have a cheap biological motion demonstration. In a lighted room, put 12 stars on each person (à la Johansson). Have the class close their eyes, and have them keep their eyes closed until all the "point-light dancers" are in the room. Turn on the music, have the dancers stay stationary, and then tell the class to open their eyes. Keep the dancers stationary for about 5 seconds, and then start the dancing.
Students will see that no "structure" is apparent in the stars until movement occurs, but once it does, students easily identify the dancers' movements, and sometimes the gender of the dancers as well. Music I've used for this exercise includes Springsteen's Dancing in the Dark, Led Zeppelin's In the Light, the Village People's YMCA, the Ramones' Do You Wanna Dance, Moby's We Are All Made of Stars, and, perhaps most appropriately, Maria Muldaur's It Ain't the Meat (It's the Motion). A word of warning: Be careful! You are in a dark room! Have the point-light dancers stay in the same location and avoid bumping into each other. You may want a practice session to work out any problems, and bring a penlight to use.

(5) "Motion Capture": Video Games, Movies and Sports: Johansson's research on biological motion is really the precursor to what computer-graphics professionals call "motion capture" (the term has a different meaning in perception research). Points of light are placed on individuals (usually many more than 12!)


to create characters in video games and movies (e.g., "The Polar Express" used point-lights on Tom Hanks' face to create the conductor). This technique has also been used to analyze the performance of athletes, such as golfers and baseball pitchers. The American Sports Medicine Institute's "Biomechanical Evaluations" page provides a detailed discussion and video of motion analysis for pitchers.

(6) Spiral Aftereffects and Interocular Transfer: As you present the waterfall illusion, you may also discuss the spiral aftereffect. With that introduction, you can lead into the logic of interocular transfer. Specifically, for the spiral aftereffect, if you adapt to the spiral with just the right eye and then switch to just the left eye when viewing a test stimulus, you still get the effect. Have students discuss what this means for the locus of the effect, and what other phenomena might be tested with this procedure.

(7) Induced Movement: A couple of examples of induced movement can be easily achieved. One example, found in previous editions of Goldstein, is to record a basketball or hockey game and select a segment that has a lot of end-to-end action. Put a dot on the screen while you play that segment. The dot will appear to fly around, even though observers know that it is not physically moving. A second example is the opening credits of the film Star Trek: Insurrection. The names stay in the same position on the screen but appear to move as the camera pans over a scene of a village.

(8) Apparent Motion and Scanimation Books: Rufus Butler Seder has published two immensely popular children's books that make use of apparent motion: Gallop! and Swing!. Although aimed at children, the technique he devised for these books is amazing; you can take one apart to reveal how it is done (or, if you have a problem with "defacing" a book, there are now "scanimation" note cards!).
(9) Camouflage and Motion Perception: This chapter addresses how motion can affect form perception in regard to how different species use camouflage and "freezing" to survive. Another interesting link to vision is how an animal uses its vision to determine how it should camouflage itself. A nice video demonstration can be found at The New York Times in "Revealed: Secrets of the Camouflage Masters," by Carl Zimmer. The article presents Roger Hanlon's research on cuttlefish camouflage.

Suggested Websites

Counter-Rotating Spirals Illusion (by "Dogfeathers"): This is a stimulus for the spiral aftereffect that has the inner segment moving in one direction and the outer segment moving in the opposite direction. You can control the speed of the spirals and reverse their direction in this version.



NYU Movement Lab: Christoph Bregler, now at NYU's Courant Institute, is a leader in research on applications of motion capture. He, along with the NYU Movement Group, has developed a website containing many examples of motion capture and the group's current projects, which have included motion capture and retargeting in classic cartoons, capture of swimming/diving at the 2012 Olympics, and capture of facial expressions.

Bio Motion Lab: This website offers answers to your questions about biological motion, along with excellent demonstrations. At this Queen's University (Canada) laboratory, researchers are studying gender, gait, and the "emotional" aspects of gait. One of the demonstrations on this site allows you to manipulate points of light by changing the subject (cat, human, pigeon), inverting, scrambling, and masking the lights, and changing the direction the lights are "moving" in.

Simi – Reality Motion Systems: This website supplies nice applications of "motion capture" to improving athletic performance in gymnasts, figure skaters, ski jumpers, and runners.

Movie Scene – Flashdance (1983) (DVD scene 11: "Imagination": 2:19-3:17): In this scene, a dancer in a club performs with a strobe light on. This is an example of apparent (or stroboscopic) movement. The strobe rate is at a level that results in apparent, although smooth, movement. You can also relate the "jerkiness" of this apparent movement to the frame rate in early movies.
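The frame-rate point can be made concrete with a short computational sketch; the object speed and frame rates below are arbitrary illustrative values, not figures from the textbook.

```python
# Sketch relating frame rate to the "jerkiness" of apparent motion.
# A strobe (or movie camera) samples a moving object at discrete moments;
# lower frame rates produce larger jumps between successive positions.

speed = 100.0  # object speed in cm per second (arbitrary)

for frames_per_second in (24, 12, 6):
    jump = speed / frames_per_second  # displacement between successive frames
    print(frames_per_second, "fps ->", round(jump, 1), "cm per frame")
```

The displacement between successive frames grows as the frame rate drops (about 4.2 cm per frame at 24 fps versus about 16.7 cm at 6 fps here), which is one way to explain why low-frame-rate apparent motion looks jerky while higher rates look smooth.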



CHAPTER 9: PERCEIVING COLOR

Chapter Outline I.

Introduction A. Some Questions We Will Consider B. Cerebral achromatopsia

II.

Functions of Color Perception A. Signaling function B. Perceptual organization

III.

Color and Light A. Reflectance and Transmission 1. Chromatic colors: Selective reflection 2. Achromatic colors: equal reflection across spectrum 3. Reflectance curves 4. Selective transmission – transparent objects 5. Transmission curves B. Color Mixing 1. Mixing paints a. Subtractive color mixture 2. Mixing lights a. Additive color mixture

IV.

Perceptual Dimensions of Color A. Spectral colors B. Nonspectral colors C. Chromatic colors or hues 1. Saturation and desaturation 2. Value: light-to-dark dimension of color D. Color solid: 3D color space 1. HSV color solid

V.

Test Yourself 9.1

VI.

The Trichromatic Theory of Color Vision A. Color-Matching Evidence for Trichromatic Theory 1. Young-Helmholtz theory 2. Method: Color Matching B. Physiological Evidence for Trichromatic Theory 1. Cone pigments a. Opsin and Retinal 2. Cone responding and Color Perception



a. Metamers: Physically different stimuli that produce the same percept (look identical) b. Metamerism occurs because the stimuli result in the same firing pattern C. Are Three Receptor Mechanisms Necessary for Color Vision? 1. Vision With One Receptor Type a. Isomerization b. Principle of univariance 2. Vision with Two Receptor Types: a. Ratios of responses between pigments b. Trichromatic: Pattern of activity c. Monochromats, dichromats, and trichromats VII.

Opponent-Process Theory of Color Vision A. Hering’s Phenomenological Evidence for Opponent-Process Theory 1. Color circle 2. Hering’s primary colors B. Hurvich and Jameson’s Psychophysical Measurements of the Opponent Mechanisms 1. Demonstration: Afterimages 2. Complementary afterimages 3. Method: Hue Cancellation C. Physiological Evidence for Opponent-Process Theory 1. Opponent Neurons D. How Opponent Responding Can Be Created by Three Types of Receptors 1. Difference information

VIII.

Color in the Cortex A. Is There a Single Color Center in the Cortex? 1. Distributed processing B. Types of Opponent Neurons in the Cortex 1. Single-opponent neurons 2. Double-opponent neurons

IX.

Color Deficiency A. Monochromatism 1. Color blind B. Dichromatism 1. Color deficient 2. Color vision test: Ishihara plates 3. Unilateral dichromat 4. Three forms of dichromatism: a. Protanopia b. Deuteranopia c. Tritanopia



X.

Test Yourself 9.2

XI.

Color in A Dynamic World A. Color Constancy 1. Chromatic Adaptation a. Demonstration: Adapting to Red 2. The Effects of the Surroundings a. Demonstration: Color and the Surroundings 3. Memory and Color a. Memory Color B. Lightness Constancy 1. Lightness and reflectance 2. The Ratio Principle 3. Lightness Perception Under Uneven Illumination a. Reflectance edge and illumination edge 4. The Information in Shadows a. Demonstration: The Penumbra and Lightness Perception 5. The Orientation of Surfaces a. Demonstration: Perceiving Lightness at a Corner

XII.

Something to Consider: Color is a Creation of the Nervous System A. Wavelengths do not have color B. Examples from other modalities

XIII.

Developmental Dimension: Infant Color Vision A. Cones underdeveloped at birth 1. Bornstein et al. (1976) a. 4-month-olds can categorize colors like adults do

XIV. Test Yourself 9.3 XV.

Think About It

XVI. Key Terms

Learning Objectives

By the end of the chapter, the student should be able to:

1. Outline how wavelength, intensity, and saturation affect our color experience.
2. Describe how additive and subtractive color mixing works.
3. Describe and draw reflectance curves for different pigments.
4. Explain the basic principles of and the behavioral and physiological support for the trichromatic theory.


5. Explain the basic principles of and the behavioral and physiological support for the opponent-process theory of color.
6. Discuss the issues regarding color processing in the cortex.
7. Define color constancy and lightness constancy.
8. Discuss the color perception capabilities of human infants.

Chapter Overview/Summary

Chapter 9 opens with a description of a person with cortical color blindness and the challenges he faced. This case introduces the reader to the functions of color vision, such as beauty, signaling functions, perceptual organization, and identification of objects. The relationship between wavelength and color is then specified, including how reflectance curves are plotted to show how differently colored surfaces reflect different wavelengths. This includes a discussion of chromatic and achromatic colors. Related to this topic of "color" composition is color mixing. The firing patterns of the cones are related to color mixing. Mixing lights of different wavelengths involves additive color mixing: the wavelengths in the sources are added together. Mixing paints results in subtractive color mixing: when pigments are mixed, the wavelengths absorbed by each pigment are both "subtracted" (absorbed) in the mixture, and whatever remains is reflected. Perceptual dimensions of color are next discussed, elaborating on spectral colors, nonspectral colors, chromatic colors and properties such as saturation, and color solids.

The two major theories of color vision are then discussed. The Young-Helmholtz "trichromatic theory" proposes that our color experience is based on having three different cone receptors. Behavioral support comes from color-matching experiments. Physiological support comes from the discovery of three different cone pigments in the retina and the finding that different patterns of firing occur for different colored stimuli. An interesting aspect of color perception is then described.
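The additive/subtractive distinction can also be illustrated with a minimal computational sketch suitable for a lecture slide. The three-band "spectra" below are hypothetical values chosen only to make the logic visible, not measured data.

```python
# Illustrative sketch of additive vs. subtractive color mixture using
# simplified three-band "spectra": [long, medium, short] wavelength bands.

# Lights: mixing ADDS the wavelengths each source contributes.
red_light = [1.0, 0.0, 0.0]    # energy per band
green_light = [0.0, 1.0, 0.0]

additive_mix = [a + b for a, b in zip(red_light, green_light)]
print(additive_mix)  # energy in the long and medium bands: seen as yellow

# Paints: each pigment ABSORBS (subtracts) part of the spectrum; the mixture
# reflects only what neither pigment absorbs, so reflectances multiply.
blue_paint = [0.1, 0.5, 0.9]    # reflectance per band
yellow_paint = [0.9, 0.9, 0.1]

subtractive_mix = [a * b for a, b in zip(blue_paint, yellow_paint)]
print(subtractive_mix)  # the medium band survives best: mixture looks green
```

The sketch captures the chapter's point that adding lights combines wavelengths (red + green lights look yellow), whereas mixing paints leaves only the wavelengths that neither pigment absorbs (blue + yellow paints look green).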
Two stimuli can have different physical properties but appear to be the same color; these are called metamers. One additional issue associated with the trichromatic theory is why three receptor mechanisms are necessary; the chapter explains why one mechanism would not be sufficient, and then why two mechanisms would not be sufficient. The second major theory of color vision is Hering's "opponent-process theory," which states that color experience depends on "black/white," "red/green," and "blue/yellow" opponent mechanisms. Behavioral evidence for the theory includes color afterimages. The discovery of opponent neurons in the LGN and retina provides physiological evidence. How color is represented in the cortex is then discussed. Although some evidence suggests a cortical "color center," the evidence overall supports distributed processing.

Color deficiency is elaborated upon in the next topic. The retinal physiology of color also has implications for color deficiencies. Monochromats have no functioning cones, dichromats have only two functioning cone mechanisms, and anomalous trichromats mix wavelengths in different proportions than trichromats. Unilateral dichromats have been used to further investigate color deficiencies. Protanopes are missing the long-wavelength cones, deuteranopes are missing the medium-wavelength cones, and tritanopes are probably missing the short-wavelength cones. The "neutral point" is also introduced to describe the types of dichromacy.

The remaining topics in the chapter address constancy. First, color constancy is defined, along with an example and a demonstration. Second, the role of chromatic


adaptation in color constancy is discussed; chromatic adaptation "tones down" the dominant colors in a scene. Third, the effects of surrounding colors are related to color constancy. Finally, a brief discussion of the interaction between memory and color is presented. The other type of constancy, lightness constancy, deals with achromatic colors. How do we know whether differences in lightness are due to illumination or reflectance? Possible bases for lightness constancy are intensity relationships (described by the ratio principle); distinguishing between illumination edges and reflectance edges under uneven illumination (including the effect of shadows); the orientation of surfaces; and perceptual organization.

"Something to Consider" examines how the experience of color is created. As Newton stated, "The Rays… are not coloured," meaning that colors are created by the perceptual system. This issue is also related to analogous situations in audition and olfaction. The chapter concludes with a discussion of infant color vision. Bornstein et al. used the habituation technique to show that 4-month-old infants can categorize colors the same way adult humans do.
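The ratio principle lends itself to a quick numerical illustration for lecture; the reflectances and illumination levels below are assumed values chosen for clarity.

```python
# Numerical sketch of the ratio principle: perceived lightness tracks the
# RATIO of light reflected by a surface to light reflected by its surround,
# which stays constant when illumination changes on both equally.

paper_reflectance = 0.8     # white paper reflects 80% of the light
surround_reflectance = 0.4  # gray surround reflects 40%

for illumination in (100, 1000):  # e.g., indoors vs. sunlight (arbitrary units)
    paper_luminance = paper_reflectance * illumination
    surround_luminance = surround_reflectance * illumination
    print(illumination, paper_luminance / surround_luminance)
```

Although the absolute amount of light reaching the eye differs tenfold across the two illuminations, the paper-to-surround ratio is 2.0 in both cases, so the paper's perceived lightness stays constant, which is the heart of lightness constancy under even illumination.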

Demonstrations, Activities, and Lecture Topics

(1) Christine Ladd-Franklin: Experimental psychology has historically been a male-dominated field. One important exception is Ladd-Franklin, who was a pioneer in color vision research, including an evolutionary approach. Listed below is a small sample of her work, along with an important article by Furumoto that focuses more on her struggles against the gender discrimination of the 1890s than on her research.

Furumoto, L. (1992). Joining separate spheres – Christine Ladd-Franklin, woman-scientist. American Psychologist, 47, 175-182.

Ladd-Franklin, C. (1922). Tetrachromatic vision and the development of vision. Science, 55, 555-560.

Ladd-Franklin, C. (1922). Practical logic and color theories. Psychological Review, 29, 180-201.

(2) Color and Design: A major application of color research is design. The two books below have extensive information and great examples of how color is used in paintings, print, and websites. These resources include additional information on color mixing, historical background, and the emotional aspects and symbolism of colors.

Lauer, D. A., & Pentak, S. (2005). Design basics. Belmont, CA: Wadsworth.

Pipes, A. (2004). Introduction to design. Upper Saddle River, NJ: Pearson-Prentice Hall.

(3) Pink Locker Rooms: Controversy at the University of Iowa concerned the pink color of the visiting football team's locker room. Critics maintained that painting everything


pink was homophobic and sexist; painting it pink was akin to writing "sissy" on the wall. Supporters claim that the color has nothing to do with these issues: the tradition originated with former Iowa coach Hayden Fry, who drew on his undergraduate psychology background in an attempt to create a more passive ambience for the visiting team. The pink color scheme was preserved in the 2004 stadium remodel. The reason for highlighting this story is to show how the use of color can generate such emotional controversy. ESPN presented a short report on the locker room in 2005 entitled "Opponents seeing red over Iowa's pink locker room." Following suit, Bondurant-Farrar High School received a substantial gift to renovate its football facilities. There was one catch: the opponents' locker room had to be designed after the University of Iowa's. The story is entitled "Iowa stadium forced to include pink visitors' locker room as part of $3 million donation."

(4) Retinal Color Mapping: An oldie but goodie that emphasizes the link between perception and physiology. Chapter 1 in Power, Hausfeld, and Gorta (1981) details how students can use a perimeter to determine the retinal areas where colors can be detected. Using small color patches of red, yellow, green, and blue, students can map out the regions where these colors can be seen. Lafayette Instruments is a good source for the apparatus needed, including the perimeter, retinal maps, and the color disks. There is also a white disk, which you can use to introduce the concept of "catch trials."

Power, R. P., Hausfeld, S., & Gorta, A. (1981). Workshops in perception. Boston: Routledge & Kegan Paul.

(5) Classic Readings: Highly recommended is an "instant classic": Margaret Livingstone's Vision and Art. Although depth and object perception are also covered, it's the information on color and the stunning examples that make this book such an amazing resource.
The Mollon chapter is another great resource for the physiological aspects of color vision.

Livingstone, M. (2002). Vision and art: The biology of seeing. New York: Abrams.

Mollon, J. D. (1982). Colour vision and colour blindness. In H. B. Barlow & J. D. Mollon (Eds.), The senses. Cambridge, UK: Cambridge University Press.

(6) Video Suggestions: A great video that covers the vision of different species is Through Animal Eyes (1985), from the PBS series Nature. The film simulates what dogs, cats, birds, insects, and fish might be experiencing. Much of the video is about color vision, but it also addresses other issues, such as eye movements and compound eyes. Another video that my students have enjoyed is "The Island of the Colorblind," a documentary by Oliver Sacks from his "Mind Traveler" BBC/WNET series. It is a fascinating look at a "perfect storm" (somewhat literally!) of conditions that led to widespread color deficiency on the island of Pingelap. Sacks also has a conversation with Knut Nordby, a Norwegian scientist who was born colorblind. As an added bonus, students get to see the engaging Dr. Sacks after reading about him in


the textbook a couple of times. (Sacks also has a book of the same title.) Additional information can be obtained at Oliver Sacks' website.

(7) Fish Color Vision and Evolution: Another animal-related article on color vision is a 2008 New York Times piece, "Seeing Red and Blue Can Divide a Species – of Fish." This article, which links to the original journal article, discusses how the depth of the water in which cichlid fish live in Lake Victoria is related to the fishes' coloration.

(8) Color Across Disciplines: The importance of color in marketing, emotion, and therapy can be highlighted as a way of introducing this chapter. For example, the colors of the iMacs have been identified as a reason for the early success of those computers. Leslie Harrington is one of the highest-profile "color consultants." She has appeared on the Today Show and has a professional website containing more information about her work.

Suggested Websites

Designing a Color Graphics Page (Checklist): This phenomenal NASA website covers anything you might want to know about colors in graphic design. A color tool is also accessible on the site for demonstrating the different characteristics of color.

Color Matters: There is extensive information on this website on a wide range of color issues (it even discusses metamerism!). A multidisciplinary site with accessible information for all students, great links, and other resources. An added feature is that you can get an eBook copy of the website to use offline (go to the "Color & Design" tab and click on "Color Design eBooks").

Perceptual Science Group at MIT: The website for the Perceptual Science Group at MIT has many illusions pertinent to this chapter, especially lightness illusions. Included in the "Gallery" are the checkerboard illusion, a simultaneous contrast illusion, Knill and Kersten's illusion (lightness with 3-D shapes), and White's illusion. The explanations for the illusions are excellent.

Optical Illusions and Visual Phenomena: Michael Bach has a couple of color illusions on his website: Benham's top, neon color spreading, and the "lilac chaser." The last one, which has been making the rounds on the internet, combines color afterimages and the Troxler effect. You can also find a color-mixing example in the "Colour" section of the site.

Visual Perception Library: The Visual Perception Library (viperlib) website, directed by Peter Thompson and Rob Stone at the University of York Psychology Department, has numerous resources related


to this chapter. One particularly relevant offering comes from the "Lightness Brightness" section, which contains examples of lightness illusions and explanations.

Amos Storkey – Visual Illusions: Amos Storkey also has many illusions on his website. The best one for color is Daniel White's cyan color afterimage effect.

Optical Illusions Etc: South Pole After Image: This Walt Anthony website contains many examples of optical illusions, including some fantastic afterimage examples and even instructions on how to create an afterimage illusion using your own photographs.

Behr Paint: The Behr paint company has a color tool (ColorSmart) for shopping for its interior and exterior paints. You can select a hue and then change the brightness and saturation ("mutedness") to find paint colors. An added bonus here is the creativity used in naming the colors. Ralph Lauren paints also have very creative names (some favorites: "cougar," "bungee pink," "painted stork," and "golden retriever").

Vischeck: This website, developed by Bob Dougherty and Alex Wade while at Stanford University, simulates what people with color deficiencies may perceive.

Movie Scene – Pleasantville (1998) (1:00:15 – 1:02:37): The main plot of Pleasantville is that a teen (David, played by Tobey Maguire) and his sister from the 1990s enter the world of a 1950s television program, "Pleasantville." All is black and white in the world of Pleasantville until the repressed attitudes of the '50s are loosened by the 1990s influence. Although the whole movie exemplifies the importance of color, this particular scene sums up the idea. In it, David brings an art book to Mr. Johnson (Jeff Daniels), a soda jerk and aspiring artist. As Mr. Johnson flips through the book of classic artwork (all in color), his amazement and inspiration grow. After looking at the book, Johnson poignantly remarks to David that people who are able to see colors don't know how lucky they are.
(Note: Some of the art work contains nude paintings).



CHAPTER 10: PERCEIVING DEPTH AND SIZE

Chapter Outline I. II.

Introduction A. Some Questions We Will Consider Perceiving Depth A. Cue approach to depth perception 1. Example: Occlusion

III.

Oculomotor Cues A. convergence and accommodation B. Demonstration: Feelings in Your Eyes

IV.

Monocular Cues A. Accommodation and pictorial cues B. Pictorial Cues 1. Occlusion 2. Relative Height 3. Familiar and Relative Size 4. Perspective Convergence 5. Atmospheric Perspective 6. Texture Gradient 7. Shadows C. Motion-Produced Cues 1. Motion parallax 2. Deletion and accretion a. Demonstration: Deletion and Accretion 3. Integrating Monocular Depth Cues

V.

Binocular Depth Information A. Demonstration: Two eyes: two viewpoints B. Stereoscopic depth perception C. Seeing Depth With Two Eyes 1. Strabismus D. Binocular Disparity 1. Corresponding Retinal Points a. Horopter 2. Noncorresponding Points and Absolute disparity a. Angle of disparity b. Crossed disparity c. Uncrossed disparity 3. Absolute Disparity Includes Distance From the Horopter 4. Relative Disparity Is Related to Objects’ Positions Relative to Each Other



   E. Disparity (Geometrical) Creates Stereopsis (Perceptual)
      1. Stereopsis: depth perception from disparity
      2. Random-dot stereogram
      3. Stereoscope
   F. The Correspondence Problem

VI. The Physiology of Binocular Depth Perception
   A. Binocular depth cells
   B. Disparity tuning curves

VII. Test Yourself 10.1

VIII. Perceiving Size
   A. The Holway and Boring Experiment
      1. What Is Visual Angle?
      2. How Holway and Boring Tested Size Perception in a Hallway
   B. Size Constancy
      1. Demonstration: Perceiving Size at a Distance
      2. Size Constancy as a Calculation
         a. Size-Distance Scaling: S = K(R x D)
         b. Demonstration: Size-Distance Scaling and Emmert’s Law
      3. Other Information for Size Perception

IX. Illusions of Depth and Size
   A. The Müller-Lyer Illusion
      1. Misapplied Size Constancy Scaling
      2. Demonstration: The Müller-Lyer Illusion With Books
      3. Conflicting Cues Theory
   B. The Ponzo Illusion
      1. Railroad track example
   C. The Ames Room
      1. Size-distance scaling vs. relative size
   D. The Moon Illusion
      1. Apparent-distance theory
      2. Angular size contrast theory

X. Something to Consider: Depth Information Across Species
   A. Frontal eyes vs. lateral eyes
   B. Movement Parallax and Insects
   C. Sonar and Echolocation

XI. Developmental Dimension: Infant Depth Perception
   A. Binocular Disparity
      1. Aslin (1977): Binocular fixation by 3 months old
      2. Fox et al. (1980): disparity information at about 3.5–6 months



   B. Pictorial Cues
      1. Depth from Familiar Size
         a. Granrud et al. (1985): familiarization period and test period
         b. Method: Preferential Reaching
      2. Depth from Cast Shadows
         a. Yonas and Granrud (2006): emerges at about 7 months

XII. Test Yourself 10.2

XIII. Think About It

XIV. Key Terms

Learning Objectives

At the end of the chapter, the student should be able to:
1. Compare and contrast the different cues that signal depth perception.
2. List and define the different pictorial cues with examples.
3. Explain how binocular disparity results in depth perception.
4. Discuss how random-dot stereograms are created and why they are important, and explain what the correspondence problem is.
5. Outline the anatomy and physiology of depth perception.
6. Discuss the method, results, and implications of Holway and Boring’s “hallway” research.
7. Define size constancy and state the size-distance scaling equation and its components.
8. Debate the proposed theoretical explanations for the Müller-Lyer, Ponzo, Ames Room, and Moon illusions.
9. Discuss the differences in depth perception across species and how these variances reflect the needs of specific species.
10. Describe infant development of depth perception.

Chapter Overview/Summary

Chapter 10 begins with the mystery of how we can perceive objects at different distances when the information on our retinas is two-dimensional. The major answer to this problem is the cue approach to depth perception. Cues can be classified as oculomotor, monocular, and binocular. The oculomotor cues of convergence (the eyes turn inward to focus on a near object) and accommodation (eye muscles increase the power of the lens) can be demonstrated by keeping your finger in focus as you bring it closer to your nose. A major sub-classification of monocular cues is pictorial cues, those that can be depicted in a picture. These cues include occlusion, relative height, relative size, familiar size, perspective convergence, atmospheric perspective, texture gradient, and shadows. Some monocular cues are motion-related, such as motion parallax (images of near objects


move faster than distant objects) and accretion and deletion (a combination of occlusion and movement). The range of effectiveness of the monocular cues is summarized; for example, accommodation and convergence are most effective at close range, atmospheric perspective at long range, and occlusion and relative size across all ranges. The mechanics and importance of binocular depth cues are discussed next. Stereoscopic depth perception plays a large role in daily life and is perhaps best appreciated through the experiences of individuals with deficiencies; Susan Barry, who suffered from strabismus, presents one such case. The binocular depth cue of binocular disparity is then discussed in detail. Images of objects that are the same distance from the observer fall on corresponding retinal points; images of objects at different distances fall on noncorresponding (or disparate) points. A measure of the amount of disparity is the angle of disparity. Binocular disparity is then shown to be the basis of stereoscopes. A demonstration in the text shows that stereopsis can be achieved even without a stereoscopic device. The same principles also underlie the different types of “3-D” movies and television sets. To isolate binocular disparity from pictorial depth cues, Julesz developed random-dot stereograms. Random-dot stereograms highlight another issue in binocular disparity: the correspondence problem, which concerns how the visual system determines which image in one eye corresponds to which image in the other eye. The physiology of depth perception is then discussed. Research shows that some neurons in the parietal lobe respond to texture gradient information, and that disparity-selective cells exist. Selective rearing studies in kittens and microstimulation of disparity-selective cells in monkeys support the hypothesis that disparity-selective cells are responsible for stereopsis. Size perception is related to depth perception.
The Holway and Boring (1941) “hallway” study is a classic example of this. To understand the study, the concept of visual angle is first described. The major result of Holway and Boring is that size perception is more accurate when depth information is available; when depth cues are lacking, we judge object size based on visual angle. This study also leads to another important concept: size constancy. Size constancy is the perception of an object as roughly the same size even though its visual angle varies as its distance changes. Size constancy can be understood as a calculation: S = K(R x D), where S is the object’s perceived size, K is a constant, R is the size of the retinal image, and D is the perceived distance of the object. Emmert’s law is used as support for this equation. The roles of relative size and texture gradient in size constancy are also discussed. Misapplying size constancy can lead to visual illusions, such as the Ponzo illusion, the Ames Room illusion, the moon illusion, and the Müller-Lyer illusion. Alternative explanations for the Müller-Lyer illusion (conflicting cues theory) and the moon illusion (angular size contrast) have also been proposed to account for phenomena that misapplied size constancy cannot explain. A discussion of the differences between species in depth perception follows. Animals with frontally located eyes can make use of disparity information; insects make use of movement parallax, while bats use echolocation. The chapter concludes by describing how depth perception develops over the first months of life. Binocular fixation develops first (within the first 3 months), then the use of disparity information emerges between 3½ and 6 months. Use of familiar size as a depth cue, which involves not only visual capabilities but also memory, emerges between 5 and 7 months.
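The size-distance scaling calculation above lends itself to a short numeric illustration. This is a minimal sketch (Python; the function name, the arbitrary units, and the choice K = 1 are mine, not from the text) showing how the equation captures Emmert’s law:

```python
def perceived_size(retinal_image_size, perceived_distance, k=1.0):
    """Size-distance scaling, S = K(R x D): for a fixed retinal image
    size R, perceived size S grows with perceived distance D."""
    return k * retinal_image_size * perceived_distance

# Emmert's law: an afterimage has a fixed retinal size, so it appears
# larger when projected onto a more distant surface.
R = 2.0                                             # arbitrary retinal-size units
near = perceived_size(R, perceived_distance=1.0)    # surface at 1 distance unit
far = perceived_size(R, perceived_distance=3.0)     # surface three times farther
print(far / near)   # 3.0 -- the afterimage looks three times larger
```

The same arithmetic explains why size constancy holds for real objects: as an object recedes, R shrinks in proportion to the increase in D, so S stays roughly constant.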



Demonstrations, Activities, and Lecture Topics

(1) Field Trip: This chapter affords a great opportunity to get out of the classroom and into a museum. Most campuses have an art gallery. If the exhibit is appropriate, you can take your class to the gallery and ask them to find examples of pictorial depth cues in the paintings or photographs. Off-campus possibilities might be any exhibit on depth illusions (I took my class to such a traveling exhibit at the Museum of Science and Technology in Syracuse one year that had a full-size Ames room and many other examples). Students were thrilled by the experience.

(2) Familiar Size with Oversized Coins: A demonstration of familiar size that is similar to the Epstein study cited in the text can be easily accomplished. You can get 3” diameter versions of American coins. I got mine at Wall Drug in South Dakota, but they can also be found online at the House of Chuckles (under “Novelties: Money Fun”). If you have a document camera (e.g., an ELMO) in your classroom, you can demonstrate the effect of familiar size by placing a real quarter on the ELMO and holding the “jumbo” quarter just slightly higher. (Be sure that the students cannot see the document camera as you’re doing this.) Students will report that the “jumbo” quarter is much closer to them than the real quarter, and won’t suspect the fake coin is oversized. One exception: occasionally, a particularly observant student will notice the size of your fingers compared to the coin and realize the coin is oversized.

(3) Book Suggestions: There are numerous books on depth perception and depth illusions, but I’ll suggest just a couple of my favorites.

Frisby, J. P. (1979). Seeing: Illusion, brain, and mind. New York: Oxford University Press. (Great information and numerous anaglyphs.)

Horibuchi, S. (1994). Stereogram. San Francisco: Cadence Books. (The best book I know of for explaining stereograms and techniques for free-fusing stereoviews and autostereograms. Includes a chapter on Christopher Tyler’s invention of the autostereogram.)

Ittelson, W. H. (1968). Ames demonstrations in perception. New York: Hafner Press. (A classic book that reveals the “secrets” to constructing the Ames room and other similar illusions.)

(4) Cues in Conflict: An excellent paradigm for investigating the strength of individual depth cues is to put them in conflict with each other and see which one “wins out.” There is parametric research in this area (e.g., Dosher, Sperling, & Wurst, 1986), but this concept can be demonstrated easily with 3-D glasses and the photographs on the Virtual Lab CD-ROM. If students look at “Lincoln Park” with the glasses on in a way consistent with the scene (e.g., blue on right eye, red on left eye), strong depth is perceived. When the glasses are turned around (e.g.,


blue on left eye, red on right eye), the pictorial cues seem to be weighted more than stereopsis, so the scene looks “normal” but with less depth. If students look at the “Ballerina” photograph, consistent disparity and pictorial cues will lead them to see the ballerina’s leg coming out toward them; by turning the glasses around (putting the disparity information in conflict with the pictorial cues), most students see a bizarre perception of the ballerina’s leg being sucked into her body!

Dosher, B. A., Sperling, G., & Wurst, S. A. (1986). Tradeoffs between stereopsis and proximity luminance covariance as determinants of perceived 3-D structure. Vision Research, 26, 973-990.

(5) 3-D Movies and Books: A good way to preview stereopsis is to show a scene from a 3-D movie in class. My current favorite is Spy Kids 3-D, and I use a segment from the “Mega-Racer” scene (DVD scene 14). The DVD comes with 4 glasses; additional glasses can be purchased from American Paper Optics (website is given below). Since this scene is really a combination of video game and film, you can also highlight how movement parallax and exaggerated pictorial cues are used to increase the perception of depth. You can be even more current by watching out for TV shows and other films that use 3-D. For example, the NFL is presenting some games in 3-D, and the 2008 version of Journey to the Center of the Earth is in 3-D, as are Avatar, Frankenweenie, and Life of Pi. ViewMasters, Wheatstone-type stereoscopes (available in most antique stores), and 3-D magazines (October 2013 had a “Sports Illustrated Kids Wheels” 3-D issue; May 2011, a “Sports Illustrated Kids Sports Blast!”) can be educational, especially if you point out the disparity in the magazines or the stereoview cards. One card in particular I use is The Keystone Eye Comfort and Depth-Perception Series (#E.C. 2 – V32930), which was created to help people see depth by labeling the different objects in the scene. Another interesting book is Masterpieces in 3D: M.C. Escher and the Art of Illusion, a book of M.C. Escher prints that have disparity; the book comes with a built-in 3-D viewer.

(6) Video Suggestions: The first video suggestion is an educational video created by the Ames Research Lab of NASA, titled “Seeing Beyond the Obvious: Understanding Perception in Everyday and Novel Environments.” Hosted by Dennis Proffitt (University of Virginia), it presents and explains pictorial, oculomotor, binocular, and motion-related depth cues. NASA’s interest in this area is also revealed. The other video suggestions are less scholarly. The most easily accessible of these is the “Men in Black” music video by Will Smith, which is included on the Men in Black DVD and VHS. The hallway in the beginning of the video is a nice example of perspective convergence and relative size, and the background in the dance sequence is a classic stimulus to illustrate shading as a depth cue! Another music video that contains numerous size and depth illusions, including the Ames Room and other “construction-related” illusions, is


“Hourglass” by Squeeze, available in the “Squeeze Play” VHS compilation of their videos or on YouTube. Clips from “The Computer That Ate Hollywood” documentary can be found online; in one segment, V. S. Ramachandran explains the Ames Room. One other entertaining video is the Saturday Night Live skit “Mr. No-Depth Perception” (from the 1990-1991 season), in which Kevin Nealon demonstrates just how important depth perception is for everyday tasks, sports, and social interactions. It can be found online if you search for “Saturday Night Live Skit Mr. No-Depth Perception”.

(7) Eyelines and Stereoscopic Images: The software Eyelines by Wally Beagley (available online) can also be used to create stereograms, and experiments that use stereopsis, with little training. The following article details the instructional and research applications:

Wurst, S. A. (1994). Generating stereoscopic displays with Eye Lines: Applications in instruction and research. Behavior Research Methods, Instruments, & Computers, 26(2), 148-150.

(8) A Newer Variant of the Müller-Lyer Illusion: Pennell and Mershon reported an “unexpected variation” of the Müller-Lyer illusion that is easy to replicate: the length of diagonal lines placed below a square is misperceived. The other interesting aspect of this illusion is how it was discovered: the authors were creating a paper-folding tutorial using a computer-assisted drawing program. This helps show students an application of the illusion, rather than just a curiosity.

Pennell, T. K., & Mershon, D. H. (1998). An unexpected variation of the Müller-Lyer illusion. Insight: The Visual Performance Technical Group Newsletter, 20, 1-3.

(9) “The Headcrusher” and Visual Angle: One way I have introduced visual angle is by presenting a video of the “headcrusher” character from “The Kids in the Hall” TV series (it can be found by searching online). In this series of skits, the “headcrusher” separates his thumb and index finger by about an inch at arm’s length to “crush” the image of a passerby’s head. The visual angle of the separation of the fingers needs to match the visual angle of the person’s head. For example, a finger separation of 1 inch, at a distance of 18 inches, subtends a 3.18-degree visual angle. You can ask the students to determine how far away the person should be for his/her head to match that visual angle, assuming the length of a person’s head is about 11 inches. It’s a great, entertaining way to get the students involved in this topic.

(10) The Visual Cliff: Another classic method for determining an infant’s depth perception is Eleanor Gibson’s visual cliff. Although it does not isolate depth cues the way random-dot stereograms do, the method is interesting from a historical perspective, and videos are available (e.g., on Vimeo; search for “Visual Cliff Psychology Video”) where seeing the infants can


pique students’ attention. You can also use this as a preview of the chapter, and have students critique the method.
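The visual-angle arithmetic behind Demonstration 9 (the 3.18-degree figure and the answer to the question posed to students) can be checked with a short sketch. The function names below are my own, not from the text:

```python
import math

def visual_angle_deg(size, distance):
    """Visual angle (degrees) subtended by an object of a given size at a
    given distance (any unit, as long as both use the same one)."""
    return math.degrees(2 * math.atan((size / 2) / distance))

def distance_for_angle(size, angle_deg):
    """Distance at which an object of the given size subtends angle_deg degrees."""
    return (size / 2) / math.tan(math.radians(angle_deg) / 2)

# The "headcrusher": a 1-inch finger gap at arm's length (18 inches)
angle = visual_angle_deg(1, 18)
print(round(angle, 2))            # 3.18 degrees, as in the demonstration

# How far away must an 11-inch head be to subtend the same angle?
print(round(distance_for_angle(11, angle)))   # 198 inches, i.e. 16.5 feet
```

Because the angle depends only on the size-to-distance ratio, the answer also falls out directly: 11 inches is 11 times the finger gap, so the head must be 11 × 18 = 198 inches away.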

Suggested Websites

National Stereoscopic Association
This is the website for the National Stereoscopic Association, which also publishes Stereo World Magazine, featuring current 3-D products, information on photography, etc.

American Paper Optics
American Paper Optics is one of the leading manufacturers of 3-D glasses, both anaglyph and polarized. If you need extra glasses for showing a 3-D movie in class, they would be a great resource.

Studio 3D
Another great website with a wealth of information about stereoscopes. A nice feature is that you can custom order View-Master reels. You can take pictures of your campus and have them converted into a reel that you can put in any View-Master.

Julian Beever
The artist Julian Beever does spectacular drawings on street pavements that give viewers a strong impression of depth, especially when viewed at the proper angle. These pictures can be related to depth cues and shape constancy.

ePsych: An Electronic Psychology Text
Modules appropriate for Chapter 10 are “Depth Valley”, which deals with imaging optics, occlusion, the implicit assumptions made in using depth cues, and familiar size; and “Professor Shadow’s Illusions”, where you can check out the “Ball and Shadow” illusion (a nice demonstration of the effect of shadowing on depth) and the “Gallery of Illusions” section.

Magic Eye
Most students are familiar with Magic Eye images from poster stores or Sunday newspapers. The website has numerous examples, viewing techniques, and different applications.

101 Visual Phenomena & Optical Illusions
This website has numerous illusions, with good explanations and references. For Chapter 10, the appropriate illusions are an interactive Müller-Lyer (with other creative versions), an interactive T illusion, Shepard’s “Subterranean Terror”, and a brief description of the Moon illusion.

Illusion Works
This website links you to Al Seckel’s home page, which has links to interactive exhibits.



Banks Lab – University of California – Berkeley
Infant spatial vision is one of the research topics at the Banks Lab in Berkeley. Research projects also include work examining binocular correspondence (horopters), stereoscopic surface perception, and picture/depth perception.

Movie Scene – Star Wars: Attack of the Clones (2002) (0:00-2:00): Actually, it doesn’t matter which “episode” you select; the now-iconic opening of the film is essentially the same for our purposes. The opening “script” contains numerous depth cues (such as atmospheric perspective, perspective convergence, shadowing, relative size, familiar size, and texture gradient). You can use this either as a preview to the chapter (students can usually identify a number of the depth cues even before they learn about them) or as a review after you have covered depth cues.



CHAPTER 11: HEARING

Chapter Outline

I. Introduction
   A. Some Questions We Will Consider

II. The Perceptual Process for Hearing

III. Physical Aspects of Sound
   A. What Is Sound?
      1. Physical definition
      2. Perceptual definition
   B. Sound as Pressure Changes
      1. Vibrations of air molecules, air pressure changes
      2. Condensation, rarefaction – sound wave
   C. Pure Tones (Sine Wave Pattern)
      1. Sound Frequency
         a. Hertz (cycles/sec): unit of measurement for frequency
      2. Sound Amplitude and the Decibel Scale
         a. Decibels: unit of measurement for sound pressure level (SPL)
         b. Method: Using Decibels to Shrink Large Ranges of Pressures
            i. Sound pressure level (SPL)
   D. Complex Tones and Frequency Spectra
      1. Periodic tone
      2. Fundamental frequency
      3. Harmonics
      4. Frequency spectra

IV. Perceptual Aspects of Sound
   A. Thresholds and Loudness
      1. Loudness and Level
         a. Amplitude
         b. Decibels (physical measure) vs. loudness (psychological)
         c. Magnitude estimation procedure (S.S. Stevens)
      2. Thresholds Across the Frequency Range: The Audibility Curve
         a. Auditory response area
         b. Equal loudness curves
   B. Pitch
      1. Fundamental frequency
      2. Tone height: relation of pitch and tone frequency
      3. Tone chroma and octave
      4. Effect of the missing fundamental



   C. Timbre
      1. Varying frequency spectra for different instruments
      2. Attack and decay
      3. Periodic and aperiodic sounds

V. Test Yourself 11.1

VI. From Pressure Changes to Electricity
   A. The Outer Ear
      1. Pinnae
      2. Auditory Canal
         a. Protective function
         b. Resonance and resonant frequency
      3. Tympanic membrane or eardrum
   B. The Middle Ear
      1. Ossicles
         a. Malleus: the “hammer”
         b. Incus: the “anvil”
         c. Stapes: the “stirrup”
      2. Oval window
      3. How ossicles amplify sound
      4. Middle ear muscles
   C. The Inner Ear
      1. Inner Ear Structure
         a. Cochlea
         b. Cochlear partition
            i. Separates scala vestibuli and scala tympani
         c. Organ of Corti (contained in the cochlear partition)
            i. Cilia, inner and outer hair cells
            ii. Basilar and tectorial membranes
      2. Vibration Bends the Cilia
      3. Bending Causes Electrical Signals
         a. Tip links
      4. The Electrical Signals Are Synchronized With the Pressure Changes of a Pure Tone
         a. Phase locking

VII. How Frequency Is Represented in the Auditory Nerve
   A. Békésy Discovers How the Basilar Membrane Vibrates
      1. Traveling wave motion of basilar membrane
      2. From base (oval window) to apex (cochlea end)
   B. The Cochlea Functions as a Filter
      1. Tonotopic map of frequencies
      2. Method: Neural Frequency Tuning Curves
         a. Threshold as a function of frequency for neuron firing
         b. Characteristic frequency



   C. Returning to the Outer Hair Cells: The Cochlear Amplifier
      1. The Cochlear Amplifier

VIII. Test Yourself 11.2

IX. The Physiology of Pitch Perception
   A. Place and Pitch
      1. Place theory
      2. Amplitude modulation, noise, amplitude-modulated noise
   B. Temporal Information and Pitch
      1. Temporal coding
   C. Place and Pitch (Again)
      1. Resolved and unresolved harmonics
   D. Problems Remaining to Be Solved
   E. The Pathway to the Brain
      1. Auditory nerve fibers to subcortical structures
         a. Cochlear nucleus
         b. Superior olivary nucleus
         c. Inferior olivary nucleus in brainstem
         d. Inferior colliculus in midbrain
         e. Medial geniculate nucleus in thalamus
      2. Primary auditory cortex or auditory receiving area, A1
      3. Core area
      4. Belt area
      5. Parabelt area
   F. Pitch and the Brain
      1. Pitch Neurons in the Marmoset
      2. Pitch Representation in the Human Cortex
         a. Anterior auditory cortex

X. Hearing Loss
   A. Presbycusis
      1. Hair cell damage
      2. Loss of sensitivity
   B. Noise-Induced Hearing Loss
      1. Leisure noise
   C. Hidden Hearing Loss
      1. Audiogram

XI. Something to Consider: Cochlear Implants
   A. Implant consists of a microphone, sound processor, transmitter, and array of electrodes implanted along the cochlea

XII. Developmental Dimension: Infant Hearing
   A. Thresholds and the Audibility Curve
   B. Recognizing Their Mother’s Voice



XIII. Test Yourself 11.3

XIV. Think About It

XV. Key Terms

Learning Objectives

At the end of this chapter, the student should be able to:
1. Describe the physical stimulus for sound, including the concepts of sound waves, pure tones, amplitude, and frequency.
2. Analyze how physical characteristics of the sound wave are related to the perceptual dimensions of loudness and pitch.
3. Illustrate what an audibility curve is, and how these curves are different for different species.
4. Define timbre, and discuss how perception of timbre is related to perception of complex sounds.
5. Identify the structures and the function of the outer ear and middle ear.
6. Describe, in detail, how transduction occurs in the cochlea, how outer hair cells can act as a cochlear amplifier, and how phase locking relates to coding of frequency information.
7. Outline what cochlear implants are and how they work.
8. Describe the auditory capabilities of infants, for tones and voices, and how prenatal exposure influences perception after birth.

Chapter Overview/Summary

Chapter 11 opens with first-hand accounts of how important hearing is to humans. Hearing has signaling functions, but it is also vital for pleasure (from music) and social interactions (speech). The physical stimulus for hearing is discussed first. Sound waves can vary in amplitude and frequency. A sound wave that is sinusoidal is called a pure tone and is often used in auditory research. Amplitude is associated with the perceptual experience of loudness. The unit of measurement for sound pressure levels is the decibel. The transformation of sound pressure levels to decibels is captured by the formula dB = 20 × log10(p/p0), where p0 is a standard reference pressure. The logarithmic nature of the formula helps compress the large range of sound pressures. The sound of rustling leaves is approximately 20 dB; a jet at take-off is about 140 dB. Although it is important to understand pure tones, an understanding of complex tones is also necessary. Additive synthesis of pure tones results in complex tones that can be characterized by their fundamental frequency and harmonics. A discussion of sound would be incomplete if the perception of sound were not discussed. Loudness is the perceptual correlate of amplitude. The audibility curve shows the range of hearing, plotting threshold as a function of frequency, and reveals how our sensitivity changes across frequencies. The range of hearing differs from species to species. Pitch is the perceptual correlate of frequency (which is measured in Hertz). Hearing the notes on a keyboard is used to introduce the concepts of tone height,


tone chroma, and octaves. Timbre is another sound quality; it accounts for the perceived difference between two tones that have the same loudness, pitch, and duration. Complex stimuli are used to study timbre. Timbre also depends on the time course of the tone (attack and decay). The next major topic is the anatomy and physiology of the ear. The outer ear, consisting of the pinnae, auditory canal, and tympanic membrane, collects sound waves and amplifies some frequencies. The middle ear, consisting of the ossicles, oval window, and middle-ear muscles, helps amplify the sound waves and provides protection for the inner ear. The main structure of the inner ear is the fluid-filled cochlea. The vibrations of the fluid in the cochlea affect the Organ of Corti and the basilar and tectorial membranes, causing the bending of the hair cells’ cilia and resulting in transduction. Theories explaining the coding of neural firing in the cochlea include Békésy’s place theory and phase locking, which accounts for the timing of firing patterns. Tonotopic maps and evidence from frequency tuning curves associated with the auditory nerve provide support for place theory. A practical application of place theory is the development and use of cochlear implants. In addition, one major advance on Békésy’s work is the finding that the outer hair cells elongate and contract, serving as an amplifier of basilar membrane movement. Basilar membrane vibrations are then examined in relation to complex tones. Next is a discussion of the physiology of pitch perception. Pitch perception relies on more than the place of peak activity on the basilar membrane; in fact, it relies heavily on periodicity for most frequencies and on place for frequencies greater than 5,000 Hz. Physiological research has also revealed the existence of pitch neurons. Auditory pathways to the brain, including subcortical structures and the primary auditory area, are discussed.
Next, hearing loss associated with hair cell damage is discussed; presbycusis and noise-induced hearing loss are two examples. While presbycusis may not be preventable, noise-induced hearing loss often is, so tips for minimizing it are presented. Hidden hearing loss and the evaluation of hearing with an audiogram are described, followed by an explanation of how cochlear implants work. The chapter concludes with a discussion of how hearing develops in infancy. Audibility curves can be measured in 3-month-olds; those curves become similar in shape to adults’ by 6 months. Infants as young as 2 days old can recognize their mother’s voice, familiar prose, and native language (with nipple-sucking pattern used as the dependent variable). Prenatal exposure to the mother’s voice explains this effect.
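The decibel formula and the idea of a pure tone summarized above can be made concrete with a short sketch. This is a minimal Python illustration (the function names are my own); it applies dB = 20 × log10(p/p0) using the standard 20-micropascal SPL reference, and generates the samples of a sinusoidal pure tone:

```python
import math

def spl_db(p, p0=20e-6):
    """Sound pressure level in decibels: dB = 20 * log10(p / p0).
    p0 = 20 micropascals is the standard reference pressure for SPL."""
    return 20 * math.log10(p / p0)

# The log compresses the huge range of pressures:
# a tenfold increase in pressure adds only 20 dB.
print(spl_db(200e-6))    # ~20 dB (roughly rustling leaves)
print(spl_db(2000e-6))   # ~40 dB

def pure_tone(frequency_hz, duration_s, sample_rate=44100, amplitude=1.0):
    """Samples of a pure tone (sine wave), the sinusoidal stimulus
    commonly used in auditory research."""
    n = int(duration_s * sample_rate)
    return [amplitude * math.sin(2 * math.pi * frequency_hz * i / sample_rate)
            for i in range(n)]

tone = pure_tone(1000, 0.01)     # 10 ms of a 1000-Hz pure tone
print(len(tone))                 # 441 samples at a 44.1-kHz sample rate
```

Changing `frequency_hz` alters perceived pitch while `amplitude` (and hence SPL) alters perceived loudness, mirroring the physical-versus-perceptual distinction the chapter draws.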

Demonstrations, Activities, and Lecture Topics

(1) Measuring Sound Levels: Although a list of typical sounds and their decibel levels is provided in the text, you may want to purchase a sound meter (D.A.S. Distribution is one affordable dealer) and record, in class, the level of some common sounds, such as hair dryers, music from boom boxes at different “volume settings”, cow bells, or the ring of a cell phone. You could also purchase some “Thunder Sticks” or other popular noise-makers that are used in sporting arenas to show how loud just a couple can be, and relate this to hearing loss.

(2) “The Mosquito” Teen Repellent and Ringtones: An application of audibility curves for different age groups was reported by Sarah Lyall in The New York


Times (November 29, 2005) article “What’s the Buzz? Rowdy Teenagers Don’t Want to Hear It.” Howard Stapleton has invented the “Mosquito”, which emits a high-frequency sound that most people over 30 years old can’t detect, but that is detected by most people under 20 years old. He is testing this “anti-loitering” device on the premise that teens will be annoyed by the sound and won’t congregate where the Mosquito is. The application has now gone one step further, and to the teens’ advantage: companies are now marketing ringtones that teens can hear but “parents and teachers can’t.” The Mosquito Ringtones website provides free ringtones and excellent sample stimuli. For instance, I can’t hear the 14,000 Hz tone, but it drives most class members nuts if I let it go on. This is bound to create some discussion in class.

(3) Video Suggestions: A brief description of cochlear implants was presented in the chapter. Your students may be surprised to find out that there is some controversy surrounding their use. A point to consider is this: does the recommendation that someone have this procedure imply that there is something fundamentally wrong with them that needs fixing? Sound and Fury (the documentary by Josh Aronson) and its follow-up, Sound and Fury: 6 Years Later, are great films dealing with cochlear implants and the related controversy. They can be purchased online at amazon.com. A slightly older, but excellent, film that addresses the same topic is Cochlear Implants: The Deaf Community’s View (Beyond Sound Productions, 1994), narrated by Mel Carter and Sharon Carter.

(4) Simulating Disabilities: One way for students to gain a greater appreciation of their sensory systems is to simulate a perceptual disability. This is an expanded version of Goldstein’s suggestion to close your eyes as you sit at your desk, and is similar to Eileen Lusk’s simulation. My colleague Karen Wolford and I have detailed the exercise we used in the reference that follows.
By working with your Disabled Students Office, you can make a day-long activity of simulating a hearing disability (using ear plugs), or a visual disability (using light-reducing goggles). It is an inexpensive exercise, but it has a lasting effect on the participants. Wurst, S.A., & Wolford, K. (2000). Incorporating disability awareness into psychology courses: Applications in abnormal psychology and perception. In Ware, M. E., & Johnson, D. E. (Eds.), Handbook of demonstrations and activities in the teaching of psychology: Volume 3 (2nd ed). Mahwah, NJ: Erlbaum, 89-91. (5) Inside the Actor’s Studio: One way to introduce the relationship between audition and emotion is to cite the questionnaire used by James Lipton on the Bravo program “Inside the Actor’s Studio.” (Unfortunately, some students will be more familiar with the Saturday Night Live parody than the actual show!) After interviewing the actor on his/her childhood, career, and craft, Lipton uses a ten-item questionnaire to delve into the actor’s personality. Two of the questions are about audition: “What is your favorite sound or noise?” and “What is your least favorite sound or noise?” Elton John, for example, answered “a church choir” and a “fork scraping on a surface” as his favorite and least favorite sounds. Other


favorites tend to be “my spouse’s voice” or “my children’s laugh”; least favorites have been “jackhammers” and “screeching brakes.” But why ask about sounds? Why not smells or sights? Is there something more inherent about the relationship between sound and emotion? (6) Hearing Dogs: Another way that deaf people can navigate the world is by using “hearing dogs.” Most students will be aware of “seeing eye dogs,” but might be less familiar with companion dogs for deaf people. If possible, you may want to contact your local agency, and see if they would do a class presentation. If not, there is much information about this service at many websites, such as: Hearing Dogs for Deaf People; Dogs for the Deaf, Inc; and Canine Companions for Independence. (7) Preventing Hearing Loss: One of the most important topics you can cover with students is noise-related hearing loss. Several anecdotal cases can be cited, such as President Clinton, Stephen Stills, and Pete Townshend. A search of the literature will turn up numerous research studies that show the link between loud music and hearing loss. One example that has generated some recent publicity is David Opperman’s (University of Minnesota) research on the effectiveness of wearing ear plugs at rock concerts. Jeannie Chung and her colleagues at Harvard Medical School have also researched the link between loud music and hearing loss. Sports performers and fans are also exposed to dangerous noise levels. The use of “ThunderStix” in arenas and stadiums (especially dome stadiums) contributes to the noise level, and race tracks are, not surprisingly, high-noise locales. Richard Petty, for example, has hearing loss, and Luann VanCampen of NIOSH has measured SPLs of 140 dB at races. A readable article on the link between race car events and hearing, “The Sound and the Fury, and Possibly the Danger”, can be found at The New York Times. An unexpected danger to hearing also lurks in children’s toys. 
Hilton and Caicedo’s Sight & Hearing Association study tested dB levels from toys near the ear and at arm’s length (approx. 10 inches). One toy electric guitar in this study produced 114 dB; all 14 toys in the study measured over 90 dB at the speaker. (8) “Snapshots”: An interesting inter-modal experience was undertaken by The Elements Quartet chamber music group. These performers contacted contemporary composers and asked them to select a personal photograph (such as a child on a swing, or a parrot dressed up as a cowboy), and compose music that matched the “snapshot.” This project highlights the link between music and vision. The quartet performed this project on my campus, and gave a bonus performance in my class as well! You might want to consider working with your music department to get funding for a performance at your institution: It is well worth the effort! (9) Video Suggestions: The PBS program The Mind: Development has some good clips on sensory testing of newborns, including Fifer’s “sucking to mother’s voice” study. The companion book also has information on research by Fifer, 


DeCasper, and Courchesne. Another PBS program, The Secret Life of the Brain, presents sensory testing of infants and children in “Program 2: The Child’s Brain: Syllable from Sound”. Restak, R.M. (1988). The mind. New York: Bantam Books.
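The decibel figures that come up in these demonstrations (roughly 90-140 dB for toys, concerts, and races) follow directly from the logarithmic definition of sound pressure level, and it can help to work the arithmetic out before class. A minimal sketch (the function names are my own; the 20-micropascal reference is the standard one, not a value from the text):

```python
import math

P_REF = 20e-6  # standard reference pressure: 20 micropascals, roughly the threshold of hearing

def spl_db(pressure_pa):
    """Sound pressure level (dB SPL) for a sound pressure given in pascals."""
    return 20 * math.log10(pressure_pa / P_REF)

def pressure_from_db(db):
    """Inverse: the sound pressure (Pa) corresponding to a given dB SPL."""
    return P_REF * 10 ** (db / 20)

# Doubling the sound pressure adds about 6 dB, regardless of the starting level:
print(round(spl_db(2 * pressure_from_db(90)) - 90, 1))  # -> 6.0
```

A handy classroom point: the jump from 90 dB to the 114 dB toy guitar mentioned above is 24 dB, which corresponds to roughly a sixteen-fold increase in sound pressure.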

Suggested Websites Smithsonian Folkways The Smithsonian Institution maintains this website devoted to preserving sound recordings, mostly of music and historical speeches, but also instructional recordings and recordings of various nature sounds and sound effects. “The Science of Sound” CD (also available via cassette and for download) has direct relevance to this chapter, covering frequency, pitch, the subjective nature of sound (including noise vs. music) and the Doppler effect. You can also explore the site to find birdsongs, frog sounds, bodily sounds through stethoscopes, and sounds associated with science fiction movies. Purves Lab The laboratory website for Dale Purves, M.D. (Center for Cognitive Neuroscience at Duke University) provides studies on visual perception as well as some on sound and music. Demonstrations on the “See for Yourself” page include one in which harmonic tones have missing fundamentals and one in which you can play musical intervals to examine tonal preferences. Empirical evidence related to these phenomena is also provided. Cochlear Cochlear is one of the leading manufacturers of cochlear implant devices. Explore this website to find out information on how the implants work (with animations), screening for candidates, and information about implants and the deaf community. Boys Town National Research Hospital Boys Town Hospital is one of the leading hospitals/ research centers for cochlear implants. On their “Hearing Services” page you will find information about their “Cochlear Implant Center”. Procedures and candidacy are discussed for a range of ages, as well as personal accounts from people with implants and their families. The “Knowledge Center” provides access to videos, podcasts, and articles as well. The hospital also has auditory perception laboratories, so check out the links to these as well on the “Research” page under “Clinical & Behavioral Labs”. 
AUDITORY List This is the website for a listserv devoted to auditory issues; it also has numerous links and conference announcements.



Jont Allen’s Auditory Models Page Jont Allen maintains this site, which links to a variety of resources on the functioning of the auditory system. A few of the links point to Allen’s past and present research laboratory pages, which contain useful information. Neuroscience for Kids: The Senses The “Neuroscience for Kids” website, maintained by Eric Chudler, provides exercises for different sensory modalities (“Experiment” page under “The Senses”). Although the site is written at a lower level, you can adapt some of the exercises for class (such as identification of sounds and sound localization), or encourage non-traditional students to do some of the demonstrations with their children. Denoyer-Geppert The Denoyer-Geppert Company has models of the ear and films on the auditory system. Movie Scenes (1) Oscar-winner example: Children of a Lesser God (31:30-33:28): In this clip, Sarah (Marlee Matlin), a deaf girl, explains to James Leeds (William Hurt) how she feels music tactilely (through her nose?), which allows her to dance. (2) “Not-even-close-to-being-an-Oscar-winner” example: Rock’n’Roll High School (23:00-24:20): This won’t be to everyone’s taste, but in this scene of the cult classic, Principal Togar (Mary Woronov) demonstrates that loud music can lead to hearing loss. She exposes a lab mouse to increasing noise levels using the Rockometer, starting with “Muzak” and “Donny and Marie,” escalating through “Kansas,” “The Rolling Stones,” “Ted Nugent,” and “The Who” until she reaches the “Ramones,” when the mouse disappears in a puff of smoke. Togar then states that hearing loss has occurred. Animal rights people may have a problem with the clip, but it is very cartoony, and actually very similar to scenes in Shrek. Also note the total distortion of the decibel scale: The “Ramones” level has a numerical value of 300!



CHAPTER 12: Hearing II: Location and Organization

Chapter Outline I.

Introduction A. Some Questions We Will Consider B. Challenges to auditory processing 1. Sound localization 2. Precedence effect 3. Auditory stream segregation 4. Music and movement

Section 1: Location II. Auditory Localization A. Auditory space 1. Location cues a. Binaural b. Monaural 2. Three dimensions a. Azimuth b. Elevation c. Distance B. Binaural Cues for Sound Localization 1. Interaural Level Difference (ILD) a. Acoustic shadow 2. Interaural Time Difference (ITD) 3. The Cone of Confusion C. Monaural Cue for Localization 1. Spectral cue III.

The Physiology of Auditory Localization A. The Jeffress Neural Coincidence Model 1. Coincidence detectors – zero ITD 2. ITD detectors 3. ITD tuning curves B. Broad ITD Tuning Curves in Mammals 1. Broadly tuned vs. sharply tuned neurons 2. Population code vs. place code C. Cortical Mechanisms of Localization 1. Evidence That Area A1 Is Involved in Locating Sound 2. Evidence That the Posterior Belt Area Is Involved in Locating Sound 3. Evidence That the Anterior Belt Is Involved in Perceiving Sound 4. What and Where Auditory Pathways



IV.

Hearing Inside Rooms A. Direct vs. Indirect Sound B. Perceiving Two Sounds That Reach the Ears at Different Times 1. Lead vs. lag speaker 2. Precedence effect C. Architectural Acoustics 1. Reverberation time 2. Acoustics in concert halls: a. Intimacy time b. Bass ratio c. Spaciousness factor

V.

Test Yourself 12.1

Section 2: Organization VI.

The Auditory Scene: Separating Sound Sources A. Auditory scene analysis B. Location C. Onset time D. Timbre and Pitch 1. Auditory stream segregation 2. Scale illusion or melodic channeling E. Auditory Continuity F. Experience

VII.

Musical Organization: Melody A. What Is Melody? B. Phrases 1. Semitones C. Grouping 1. Auditory stream integration 2. Gap fill 3. Arch trajectory D. Tonality 1. Return to the tonic 2. Tonal hierarchy 3. Musical syntax E. Method: Event-Related Potential in Language F. Expectation 1. Regularities in the environment

VIII.

Musical Organization: Rhythm A. What Is Rhythm? 1. Music structures



2. Inter-onset interval B. The Beat 1. Basal ganglia C. Meter 1. Metrical Structure and the Mind 2. Metrical Structure and Movement a. Method: Head-Turning Preference Procedure b. Vestibular system 3. Metrical Structure and Language a. Stress patterns in different languages IX.

Something to Consider: Connections Between Hearing and Vision A. Multisensory interactions B. Hearing and Vision: Perceptions 1. Ventriloquism or visual capture 2. Two-flash illusion 3. Speechreading C. Hearing and Vision: Physiology 1. Coordinated receptive fields 2. Echolocation

X.

Test Yourself 12.2

XI.

Think About It

XII.

Key Terms

Learning Objectives At the end of the chapter, the student should be able to: 1. Identify the different coordinates of auditory space and where auditory localization is most accurate. 2. Recognize the two major binaural location cues and how effective they are for different frequencies. 3. Explain why ambiguity exists when judging location of sounds. 4. Identify the structures in the auditory pathway and the auditory areas in the cortex. 5. Discuss the physiological aspects of sound localization, including findings from different species. 6. Explain what the precedence effect is and describe how the effect can be demonstrated. 7. Differentiate between the auditory what and where pathways along with evidence supporting this distinction. 8. Apply the basic ideas of auditory scene analysis to a “real-life” situation.



9. State the principles that affect auditory grouping and discuss research that supports the effect of these principles. 10. Illustrate the concept of metrical structure and explain how it is influenced by grouping tendencies, movement, and language. 11. Describe the interactions between vision and hearing, especially when information from the two modalities is conflicting.

Chapter Overview / Summary Chapter 12 addresses how we localize sounds, how we identify sound sources, how we hear inside rooms, and the interaction between audition and vision. Auditory space is defined by azimuth (horizontal position), elevation (vertical position), and distance. Binaural location cues are those that use information from both ears; they fall into two categories: interaural time difference (ITD) and interaural level difference (ILD). Large ITDs indicate sounds coming from one side; ITDs of zero indicate that the sound came from directly ahead. ILDs occur for high-frequency sounds because the head creates an acoustic shadow; because the acoustic shadow does not occur for low-frequency sounds, ILDs are good localization cues for high-frequency but not low-frequency sounds. Judging elevation and distance is challenging because a number of different locations in space can yield the same ILD and ITD, resulting in ambiguity (the cone of confusion). The major monaural cue is called a spectral cue because the head and pinnae modify the sound’s spectrum, decreasing the intensity of some frequencies and increasing that of others. Hofman et al., who had participants wear molds that changed the shape of their pinnae, researched localization in elevation. Azimuth locations were still judged accurately, but localization for elevation was poor. Adaptation occurred in about 19 days. Interestingly, when the molds were removed, localization for elevation was immediately normal again. The physiology of the auditory system involves a progression of information from the auditory nerve through subcortical structures to the auditory cortex (A1, the primary auditory receiving area). From A1, which lies within the core area, information travels to the belt and parabelt areas. Physiological evidence for specialized localization neurons has been obtained. Jeffress proposed neural circuits that respond to specific ITDs; if these detectors existed, they would be narrowly tuned. Empirical evidence from animal studies is mixed. 
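The geometry behind the ITD cue can be made concrete with a small computation. A common textbook approximation (Woodworth's spherical-head formula; the head radius and function name below are illustrative assumptions, not values from the text) gives the ITD as a function of azimuth:

```python
import math

HEAD_RADIUS = 0.09      # meters; an assumed average head radius
SPEED_OF_SOUND = 343.0  # meters per second in air

def itd_seconds(azimuth_deg):
    """Woodworth's approximation: ITD = (r / c) * (theta + sin(theta))."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))
```

For a source straight ahead (0 degrees azimuth) the ITD is zero; at 90 degrees this model gives roughly 675 microseconds, illustrating the point that large ITDs signal sounds off to one side while an ITD of zero signals a source directly ahead.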
Mammals appear to have broadly tuned detectors whereas birds have sharply tuned detectors. Localization also occurs in A1, where neurons respond to the specific points in space where sounds originate, and in the auditory belt area, where neurons respond to even more precise locations in space. As in vision, what and where pathways have been found for auditory processing. When listening to sounds inside rooms, we experience direct sound and indirect sound (sound that bounces off the walls and floors before reaching the ears). When two sounds reach the ears at different times, we tend to fuse the two sounds into one. The precedence effect occurs when we perceive the sound as coming from the source that reaches our ears first. Architectural acoustics is also an important issue when listening to music inside rooms. Reverberation time must be accounted for when designing concert halls, such as Avery Fisher Hall and Walt Disney Concert Hall, and classrooms, where the problem of background noise is greater. To identify the sources of sounds, humans perform auditory scene analysis, in which we separate the sound sources into separate percepts. Many principles account for auditory grouping. 


Sounds from the same general spatial location are grouped together. Onset time, similarity of timbre and pitch (as demonstrated by the scale illusion), auditory continuity, and experience also affect auditory grouping. Another way to organize auditory information is to use metrical structure. We impose organizational structure on beats even when none exists. This effect can be influenced by movements that have been paired with the beats (bouncing, an effect found in infants and adults alike) and by the dominant stress patterns found in our native language (e.g., short-long in English, long-short in Japanese). The last issue addressed in the chapter is the interaction between audition and vision. Sensory systems do not operate in isolation; instead, multisensory interactions define our perceptual experience. The connection between vision and hearing is demonstrated in the ventriloquism effect (visual capture), the two-flash illusion, and speechreading. In addition to interactions that shape perception, physiological interactions occur between the sensory areas. In monkeys, receptive fields for sight and audition are linked, and blind individuals using echolocation show activation of both the auditory and visual cortices. Demonstrations, Activities, and Lecture Topics (1) The Perception of Music: A recent Scientific American article by Eckart Altenmuller is an excellent supplemental reading for undergraduates. He addresses physiological mechanisms involved in music listening, cross-modal experiences, the effect of learning, and a detailed example of what occurs when we hear the song “Happy Birthday.” Altenmuller, E. O. (2004). Music in your head. Scientific American: Mind Special Issue, 14 (1), 24-29. (2) Obsolete Sounds: Experience is cited in the text as an auditory localization cue. However, the sounds that we are exposed to are constantly changing. 
For example, the sound of a rotary phone and the sound of a hand-held telephone slammed down are probably not as familiar to college students as to their professors. Potentially, in a few years, the “dial-up” screech will also be obsolete. Many of these sounds are being preserved, however, usually for sound effects purposes, but they can also be used in research and class presentations. Valentino Production Music is a leading sound effects company. (3) Role of Vision in Auditory Localization: Historical Perspective: An early, and fascinating, apparatus to test this issue was the “pseudophone,” invented by P.T. Young in 1928 (as cited in Boring, Langfeld, and Weld’s Foundations of Psychology, 1948). The wearer of this device had an artificial pinna on the left side of the head that attached to the right ear, and vice versa. According to Boring et al., the wearer of the device went about a daily routine; not too surprisingly, “Automobiles at a busy intersection created a real hazard.” After a while, adaptation did occur. A somewhat more recent version was the “Bice Device”. Some information about both of these devices can be found in a 2001 Psychology Today article, “Auditory Illusion” by Kaja Perina. A 2007 story in


the University of Virginia Magazine, “Bice Devices”, includes a photo of the Bice device and information about Raymond Bice. (4) Role of Vision in Auditory Localization: Lab Exercise: Power, Hausfeld, and Gorta detailed a workshop on the role of vision in auditory localization. Using speakers, you can present sounds to the participant in three conditions: lights on, lights off, and eyes closed, while instructing the participant not to move his/her eyes. This workshop, therefore, also addresses the role of eye movements in locating sounds. Power, R.P., Hausfeld, S., & Gorta, A. (1981). Workshops in perception. Boston: Routledge & Kegan Paul. (5) Virtual Auditory Space Systems: You may refer more technologically oriented students to a presentation by Scarpaci et al. that details a system created for generating VAS. Scarpaci, J. W., Colburn, H. S., & White, J. A. (2005). A system for real-time virtual auditory space. Proceedings of the ICAD 2005 Conference, Dublin. (6) Perception of Musical Meter and Dyslexia: While discussing the perception of meter, students could be asked to speculate on why humans would benefit from having this type of mechanism. One straightforward talking point from the chapter is that meter perception might be important for language acquisition and speech sound segmentation. Building on this idea, you could discuss Huss et al.’s (2011) finding of a link between developmental dyslexia and meter perception. Huss, M., Verney, J.P., Fosker, T., Mead, N., & Goswami, U. (2011). Music, rhythm, rise time perception and developmental dyslexia: perception of musical meter predicts reading and phonology. Cortex, 47, 674-689. Suggested Websites Barbara G Shinn-Cunningham Home Barbara Shinn-Cunningham is the head of the Auditory Neuroscience Lab at Boston University. Some of her research that is relevant to this chapter (included on the “Papers” page) involves reverberation and spatial hearing. 
Brain Development Lab: University of Oregon The Brain Development Lab at The University of Oregon has produced interesting research, including studies on auditory rhyming using evoked potentials. Research examining the role of music training on cognition is also included on their “Publications” page.



Auditory Neuroscience Laboratory: University of Sydney This University of Sydney website has much information on virtual auditory space and other relevant auditory scene perception topics. Binaural Recording: Simon Fraser University At this website from Simon Fraser University, there is an interesting example of sound recorded using a Kunstkopf (“artificial head”). This is one part of a handbook that is loaded with information about audition; browse around! Optical Illusions & Visual Phenomena In addition to the many examples we have already seen from Michael Bach, he has a great example of Sekuler et al.’s (1997) Motion-Bounce Illusion, which, though not discussed in the text, can create interesting discussion in class. Movie Scene – Wild Wild West (1999) (8:45-9:43): In this scene, Gordon (Kevin Kline), disguised in drag, needs information from General “Bloodbath” McGrath (Ted Levine). McGrath has a unique feature: Since he lost a pinna in the Civil War, he uses a gramophone horn instead. It’s interesting that it can swivel (he drains some liquid out of his ear this way). You can discuss with students how such a “pinna” could affect sound localization: e.g., the smoothness of the “pinna” and the effect of swiveling it into different positions.



CHAPTER 13: SPEECH PERCEPTION

Chapter Outline I.

Introduction A. Some Questions We Will Consider

II.

The Speech Stimulus A. The Acoustic Signal 1. Acoustic stimulus or signal 2. Articulators 3. Vowels a. Formants 4. Sound spectrogram 5. Consonants a. Articulators and manner of articulation b. Place of articulation c. Formant transitions B. Basic Units of Speech 1. Phoneme a. Vary across languages

III.

The Variability of the Acoustic Signal A. Variability From Context 1. Coarticulation 2. Perceptual constancy B. Variability From Different Speakers 1. Pitch, accent, speech speed

IV.

Perceiving Phonemes A. Categorical Perception 1. Voice onset time 2. Phonetic boundary B. Information Provided by the Face 1. Multimodal a. McGurk effect b. Audiovisual speech perception c. Brain areas activated for speech perception include face areas for familiar voices. C. Information From Our Knowledge of Language 1. Phonemic restoration effect 2. Bottom-up and top-down processing

V.

Test Yourself 13.1



VI.

Perceiving Words and Sentences A. Perceiving Words in Sentences 1. Demonstration: Perceiving Degraded Sentences 2. Shadowing B. Perceiving Breaks Between Sequences of Words 1. Speech segmentation 2. Demonstration: Organizing Strings of Sounds 3. Transitional probabilities and statistical learning 4. Saffran et al. (1996): Infant studies demonstrate statistical learning C. Perceiving Degraded Speech 1. Noise-vocoded speech

VII.

Speech Perception and the Brain A. Aphasias 1. Broca’s area and Broca’s aphasia 2. Wernicke’s area and Wernicke’s aphasia a. Word deafness: Extreme form of Wernicke’s 3. Parietal lobe involved in syllable discrimination 4. Voice area in human STS - Belin et al. (2000) 5. Voice cells in monkey temporal lobe: Perrodin et al. (2011) 6. Dual-stream model of speech perception 7. Electrode responses to single phonemes: Mesgarani et al.

VIII.

Something to Consider: Speech Perception and Action A. Motor Theory of Speech Perception 1. Activation of motor mechanisms enables perception. 2. Audiovisual mirror neurons 3. Link between producing and perceiving speech a. TMS: D’Ausilio et al. 2009 b. fMRI: Silbert et al. 2014

IX.

Developmental Dimension: Infant Speech Perception A. The Categorical Perception of Phonemes 1. Eimas et al. (1971): 1-month-old infants have adult-like VOTs B. Learning the Sounds of a Language 1. Experience-Dependent Plasticity 2. Social gating hypothesis: Kuhl 2007, 2010.

X.

Test Yourself 13.2

XI.

Think About It

XII.

Key Terms



Learning Objectives At the end of the chapter, the student should be able to: 1. Recognize the basic concepts associated with the speech stimulus, such as phonemes, the acoustic signal, articulators, and formants. 2. Discuss the problem of variability due to context and different speakers with examples of research associated with these problems. 3. Discuss how VOTs have been used in categorical perception research. 4. Describe the McGurk effect and the implications of this phenomenon. 5. Illustrate how words are identified in sentences, and how word segmentation occurs. 6. Explain how the information from speaker characteristics affects speech perception. 7. Summarize the types of aphasia, and what these conditions tell us about the physiology of speech perception. 8. Critique the motor theory of speech perception. 9. Discuss the methods and results of two research studies that show that plasticity occurs in speech perception. 10. Explain how phonemic categorization is researched and the similarities between adult and infant speech perception.

Chapter Overview/ Summary Chapter 13 opens with a comparison of human and computer speech recognition: How is it that humans can perceive speech in a wide range of conditions, but computers have difficulty? The first topic in understanding speech perception is the speech stimulus. The acoustic signal is the pattern of air pressure changes that occurs when speech is produced using the articulators (tongue, lips, teeth, jaw, and palate). The acoustic signal contains formants, which are the frequency bands at which peaks of acoustic energy occur. Sound spectrograms are used to analyze the acoustic signal, identifying formants and formant transitions. Phonemes are the smallest segments of speech that, if changed, change the meaning of the word. For example, the phonemes /b/, /i/, and /t/ form the word “bit.” Changing the phoneme /b/ to /p/ would change the word to “pit.” The number of phonemes is language-dependent, with English having 13 “vowel sound” phonemes and 24 “consonant sound” phonemes. The second major topic in the chapter is the relationship between the acoustic signal and phonemes. Variability occurs because the phoneme’s acoustic signal changes with context, and because different speakers differ in pitch, speech speed, dialect, accent, and “sloppy” pronunciation. Categorical perception is another phenomenon associated with the speech stimulus. By using computers to manipulate “voice onset times” (VOTs), researchers have identified “phonetic boundaries,” which are the VOTs at which the perception of a sound changes from one phoneme to another. The McGurk effect shows how vision can also affect speech perception. The phonemic restoration effect also shows the effect of context on perception: We perceive complete words even when a sound stimulus replaces a phoneme in a word. Another major issue is the segmentation problem (how do we separate sounds into words when there are no breaks in the acoustic signal?). Knowledge and context aid in segmentation. 
This benefit of “top-down” processes can be demonstrated when reading degraded sentences and re-segmenting sound


patterns. The importance of transitional probabilities and statistical learning is also shown. Physiological aspects of speech perception are then discussed. Broca’s aphasia and Wernicke’s aphasia are conditions resulting from cortical damage that affect speech production and comprehension, respectively. Brain scanning studies and plasticity studies have also identified the locations of speech areas in the cortex (e.g., the STS voice area, temporal lobe voice cells). Mirror neurons may also contribute to speech perception, with some even having been found in a structure analogous to Broca’s area. The existence of mirror neurons in this capacity has been used as support for the motor theory of speech perception, although alternative explanations exist. The chapter concludes with a discussion of phonemic perception in infants. One-month-old infants can categorize phonemes in a way similar to adults. There is also evidence that infant phoneme perception “tunes” to the phonemes spoken within the language the infant is exposed to. These cross-cultural studies show that distinguishing phonemes is affected by exposure to the language.
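The transitional-probability statistic behind Saffran et al.'s statistical-learning findings can be sketched in a few lines of code (a toy illustration with made-up two-syllable "words", not the study's actual stimuli):

```python
from collections import Counter

def transitional_probabilities(syllables):
    """P(next syllable | current syllable) for every adjacent pair in the stream."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {(a, b): n / first_counts[a] for (a, b), n in pair_counts.items()}

# A continuous stream built from two hypothetical "words", bida and kupa:
stream = ["bi", "da", "ku", "pa", "bi", "da", "bi", "da",
          "ku", "pa", "ku", "pa", "bi", "da"]
tps = transitional_probabilities(stream)
print(tps[("bi", "da")])  # within-word transition -> 1.0
```

Within-word transitions (bi to da, ku to pa) have probability 1.0, while transitions across word boundaries are lower; this is the statistic that infants in the statistical-learning studies appear to exploit when segmenting continuous speech.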

Demonstrations, Activities, and Lecture Topics (1) Word segmentation and humor: One way to introduce this topic is to note how many jokes are based on word segmentation. I use two that are favorites of my parents (and it also highlights the difference between my father’s and mother’s sense of humor!). My mom’s joke is: A little boy tells his mother that he knows what God’s name is. “What is it?” asks the mother. “Andy,” replies the boy. “Why is that?” asks the mother. “Because,” said the boy, “I heard you singing ‘And He (Andy) walks with me / And He (Andy) talks with me / And He (Andy) tells me I am His own.’” (These are the lyrics to the hymn “In the Garden.”) Dad’s joke: A little boy and a man are standing at a bus stop on a cold, sleety day. The boy looks up at the man and asks “Is that an icicle on your nose?”. The man replies “No, it’s not (It’s snot).” The class will then be able to come up with plenty on their own, including people’s names (e.g., the doctor “Ben Dover,” etc.) (2) “Mad Gabs” and Segmentation: The board game “Mad Gabs” is based on the same principle as “segmenting strings of sounds”, as phrased in the text. You can use the game for more examples, or develop some on your own. You can also demonstrate top-down processes by using priming stimuli. For example, I’ll present a “Smiley face” to half the class before showing “Heaven Iced Hay.” The primed students generally tend to get “Have a nice day” before the non-primed students. Other primes and word strings I’ve used are: “No Doubt” / “Go when stiff on knee” (Gwen Stefani); “Dentyne” / “Shook Hurl His Gum” (“sugarless gum”); and “Award” / “Pull It Sir Pries” (Pulitzer Prize). What you can also try (although I’ve had limited success) is to see if prior experience in playing Mad Gabs helps. See if the students who have experience get the correct segmentation faster than “Mad Gab” novices; this would also exemplify the effect of knowledge on speech perception. 
(3) "Nobody expects the Spanish… lyrics!": Billa Reiss, in the instructor's manual for the 6th edition of Goldstein's text, suggests the book "'Scuse Me While I Kiss This Guy" by Gavin Edwards for examples of misheard lyrics. There is an archive of these mondegreens © 2017 Cengage Learning. All Rights Reserved. May not be scanned, copied or duplicated, or posted to a publicly accessible website, in whole or in part.


on the website "Kissthisguy: The Archive of Misheard Lyrics." One of my favorites to present to students is the song Loser by Beck. The chorus contains the lyric "Soy un perdedor" (Spanish for "I'm a loser"), but since all the other lyrics are in English, students tend to perceive the acoustic signal as English words, such as "Sores on me head to toe" (which would also fit a song about a "loser"). Two others that work for me: Don't Stand So Close to Me by The Police contains the line "Just like the old man in that book by Nabokov," which no one gets unless they know Lolita, and The Ramones' Commando; I've had a whole class get none of its lyrics correct. All three of these songs are easily found in video form. One word of warning about the website: some of the misheard submissions don't make a lot of sense, and some of them, while not profane, can be inappropriate for class. I once searched Pink Floyd's "The Wall" because a student asked, and we found some hate-related phrases and sexual references ("More orgasms in the classroom" instead of "No dark sarcasm in the classroom"). Speaking of sarcasm and music lyrics….

(4) Isn't it Ironic… don't you think?: Kristen Link and Roger Kreuz have conducted interesting research on indexical speaker characteristics. "Ostensible speech acts" occur when a speaker says one thing but is not being serious, and the listener knows it; irony/sarcasm, jokes, and lies are examples. Some factors they have investigated are sincerity, speech-act category, and regional dialect differences. Link, K. E., & Kreuz, R. J. (2005). The comprehension of ostensible speech acts. Journal of Language and Social Psychology, 24, 1-25.

(5) Video Suggestion: An interdisciplinary approach to speech perception can be taken with the video "Playing Shakespeare: Irony and Ambiguity." The use of speaker characteristics by actors reading Shakespeare demonstrates the relationship between speech perception and theater.
Also, you get to see Ben Kingsley when he had hair!

(6) Evolutionary Theory and Voice: Gordon Gallup and his colleagues have conducted research on attractiveness ratings of people's voices and applied the findings to evolutionary theory. Their findings are remarkable for how much information a listener can extract just from a voice. For example, voice attractiveness predicted shoulder-to-hip ratio in males and waist-to-hip ratio in females. Hughes, S. M., Harrison, M. A., & Gallup, G. G., Jr. (2002). The sound of symmetry: Voice as a marker of developmental instability. Evolution and Human Behavior, 23, 173-180. Hughes, S. M., Dispenza, F., & Gallup, G. G., Jr. (2004). Ratings of voice attractiveness predict sexual behavior and body configuration. Evolution and Human Behavior, 25, 295-304.



(7) Phonemic Restoration and Censorship: A possible application of the phonemic restoration effect that you may want to discuss involves how radio programs use beeps to mask censored words. However, if the first and last phonemes of the censored word are presented, with a beep in between, the phonemic restoration effect should occur, meaning that a listener would still perceive the profanity. The question is: Is the word actually censored if the acoustic signal is changed, but the perception isn't?

(8) Suggested Readings: Winifred Strange edited a book on cross-language research in speech perception. Given that many curricula emphasize cross-cultural issues, this book would be an excellent source and may help provide a background on second-language learning. Another suggestion for a broad range of issues in speech perception, including methodologies, historical background, physiology, and evolutionary issues, is Raphael et al.'s "Speech Science Primer." Raphael, L. J., Borden, G. J., & Harris, K. S. (2007). Speech science primer: Physiology, acoustics, and perception of speech, Fifth Edition. Baltimore: Lippincott Williams & Wilkins. Strange, W. (1995). Speech perception and linguistic experience: Issues in cross-language research. Baltimore: York Press. Several cross-language speech perception studies have been conducted; consider looking at work by Catherine Best, and, for developmental cross-language research, work by Janet Werker.

(9) Interesting and Unexpected Aspects of Language Development: A 2013 New York Times article is relevant to the chapter's discussion of auditory development. It is entitled "Babies Seem to Pick Up Language in Utero." The article discusses Christine Moon and colleagues' finding that babies only 7-75 hours old were able to discriminate between English and Swedish vowel sounds. This suggests that something more specific than prosody is picked up before birth.
Going along with the topic of how language develops, a 2012 article in Frontiers in Auditory Cognitive Neuroscience, discusses the role of music (“Music and Early Language Acquisition” by Brandt, Gebrian, & Slevc). The authors approach “spoken language as a special type of music”, an assertion that lends itself well to discussion!

Suggested Websites
There are several speech perception labs in the United States and Canada. The websites for many of these labs are listed below, each with a wealth of information, demonstrations, and auditory stimuli. Take some time to browse these sites to determine what is most interesting for you to use in your course.



Speech Perception and Production Laboratory: Queen's University
Kevin Munhall is the director of the lab at Queen's University in Kingston, Ontario. Some research topics include audiovisual integration in face-to-face communication, the McGurk effect, and X-ray analysis of speech production. The "Collaborations" page contains links to many researchers and labs whose research is relevant to several chapters in the text.

Speech Perception & Learning Laboratory
Lori Holt at Carnegie Mellon researches phonetic context effects. Some of the fascinating research she conducts investigates the role of perceptual dimensions in auditory category learning.

Haskins Laboratories
Haskins Laboratories at Yale conducts research on many topics in speech perception. One topic specific to this chapter is sine wave synthesis of speech. An overview of the research is provided, along with examples of the sentences used and spectrograms. The page on sine wave synthesis can be found under the "Research" tab, in "Features and Demos."

Peter Assmann: University of Texas at Dallas
Peter Assmann has a great site, with a wide range of research projects detailed, such as the effects of interrupted speech, competing voices, and analysis and synthesis of children's speech. Many examples of his stimuli are also available.

Northwestern University School of Communication
Some laboratories in the department conduct research related to topics in this chapter, including the Aphasia and Neurolinguistics Research Laboratory, headed by Cynthia Thompson.

UCLA Speech Processing and Auditory Perception Laboratory
This UCLA laboratory, directed by Abeer Alwan, provides an overview of projects including investigations of how vocal structures work to produce sounds and development of models of speech perception in noise. The site contains many references.
Movie Scene – My Fair Lady (1964) (1:03:52-1:06:10): Flower girl Eliza Doolittle (Audrey Hepburn) is being taught proper speech patterns by Professor Henry Higgins (Rex Harrison) in this scene. The famous “The rain in Spain…” phrase and the phoneme /h/ are practiced. A brief figure of the position of the articulators is also visible in this scene.



CHAPTER 14: THE CUTANEOUS SENSES

Chapter Outline

I. Introduction
   A. Some Questions We Will Consider
   B. Somatosensory system
      1. Cutaneous sensation
      2. Proprioception
      3. Kinesthesis

Section 1: Perception by the Skin and Hands

II. Overview of the Cutaneous System
   A. The Skin
      1. Functions: warning and protection
      2. Layers: Epidermis and dermis
   B. Mechanoreceptors
      1. In the epidermis
         a. Cutaneous receptive field
         b. Small receptive fields
            1. Merkel receptor: Slowly adapting fiber: SA1
            2. Meissner corpuscle: Rapidly adapting fiber: RA1
      2. In the dermis
         a. Deeper in the skin
         b. Larger receptive fields
            1. Ruffini cylinder: SA2 fiber
            2. Pacinian corpuscle: RA2, or PC fiber
   C. Pathways from Skin to Cortex
      1. Spinal cord pathways
         a. Medial lemniscal pathway
         b. Spinothalamic pathway
      2. Thalamus
         a. Ventrolateral nucleus (contralateral hemisphere)
   D. The Somatosensory Cortex
      1. Somatosensory Receiving Area (S1) in parietal lobe
      2. Secondary Somatosensory Receiving Area (S2)
      3. Penfield
         a. Sensory homunculus
   E. The Plasticity of Cortical Body Maps
      1. Experience-dependent plasticity

III. Perceiving Details
   A. Example: Reading Braille
   B. Method: Measuring Tactile Acuity
      1. Two-point threshold
      2. Grating acuity
   C. Receptor Mechanisms for Tactile Acuity
      1. Demonstration: Comparing Two-Point Thresholds
   D. Cortical Mechanisms for Tactile Acuity

IV. Perceiving Vibration and Texture
   A. Vibration of the Skin
      1. Pacinian corpuscle (PC)
   B. Surface Texture
      1. Duplex theory of texture perception
         a. Spatial cues
         b. Temporal cues
         c. Demonstration: Perceiving texture with a pen

V. Perceiving Objects
   A. Active vs. passive touch
   B. Demonstration: Identifying Objects
   C. Identifying Objects by Haptic Exploration
      1. Interaction of sensory, motor, and cognitive systems
      2. Exploratory procedures (EPs)
   D. The Cortical Physiology of Tactile Object Perception
      1. Cortical Neurons Are Specialized
      2. Cortical Responding Is Affected by Attention
      3. Cortical Responding Can Occur While Watching Touching

VI. Test Yourself 14.1

Section 2: Pain Perception

VII. Types of Pain
   A. Inflammatory pain
   B. Neuropathic pain
   C. Nociceptive pain and nociceptors

VIII. The Gate Control Model of Pain
   A. The direct pathway model of pain
      1. Phantom limbs
   B. Gate control model
      1. Nociceptors: transmission cells
      2. Mechanoreceptors: inhibitory "close the gate" signals
      3. Central control: signals from cortex

IX. Top-Down Processes
   A. Expectation
      1. Placebo effect
      2. Testing
         a. Baseline, no expectation, positive expectation, negative expectation
      3. Nocebo effect
   B. Attention
   C. Emotions
      1. Looking at Pictures
      2. Listening to Music

X. The Brain and Pain
   A. Brain Areas
      1. Multimodal nature of pain
         a. Sensory component of pain
         b. Affective (or emotional) component of pain
   B. Chemicals and the Brain
      1. Opioids
      2. Naloxone
      3. Endorphins

XI. Observing Pain in Others
   A. Empathy
      1. ACC – anterior cingulate cortex

XII. Something to Consider: Social Pain and Physical Pain
   A. Physical-social pain overlap hypothesis

XIII. Test Yourself 14.2

XIV. Think About It

XV. Key Terms

Learning Objectives
At the end of the chapter, the student should be able to:
1. Name the three parts of the somatosensory system, and outline the functions of the skin.
2. Describe the layers of the skin and the types of mechanoreceptors.
3. Map the pathways from the skin to the cortex, as well as within the somatosensory cortex.
4. Describe methods used to measure tactile acuity.
5. Identify the receptor mechanisms and cortical mechanisms that are responsible for tactile acuity and feeling vibrations.
6. Describe the duplex theory of texture perception and the supporting research.
7. Summarize the exploratory procedures used in haptic exploration and the physiology of tactile object perception.
8. Distinguish among the three different types of pain.



9. Debate the issues with the direct pathway model and the major principles of the gate control model of pain.
10. Outline the four cognitive factors that affect pain perception and the components of the multimodal nature of pain.
11. Explain how opioids, endorphins, and placebos affect the perception of pain.
12. Discuss research illustrating that social situations can result in ACC activation.

Chapter Overview/Summary
Chapter 14 opens with a description of a 17-year-old who lost the ability to sense through the skin and to sense the movements of his body. This leads to the distinction between the different components of the somatosensory system: the cutaneous senses, proprioception, and kinesthesis. The focus of this chapter is on the cutaneous senses. The skin is one of the largest organs of the human body, and each of its two layers contains two types of mechanoreceptors. The four types of mechanoreceptors are (1) Merkel receptors, (2) Meissner corpuscles, (3) Ruffini cylinders, and (4) Pacinian corpuscles (PC). The pathways from the skin through the spinal cord and thalamus to the cortex are described. From the thalamus, most signals go to S1, the somatosensory receiving area in the parietal lobe. Penfield's research led to the mapping of body parts to areas on S1, represented as a "homunculus." Cortical magnification is evident (e.g., the lips are small in size but are represented by a large area on S1). Plasticity occurs, as shown by animal rearing studies and by research with musicians, who show greater cortical representation for the body parts needed for their specific instrument. The cutaneous senses are used to perceive details (acuity can be measured using two-point thresholds and perception of gratings), vibrations (PC fibers are responsible for vibration perception), and textures (both spatial and temporal cues are needed). Hollins and Risner (2000) investigated the connection between spatial and temporal cues in texture perception. They proposed that moving across different textures produces vibrations that are interpreted as textural differences. When participants were adapted to a particular vibration frequency (e.g., 10 Hz), texture perception decreased; this provides support for the duplex theory of texture perception. Perceiving objects can be done with passive touch or active touch. The anecdote about Geerat Vermeij, an evolutionary biologist blind since age 4 who uses active touch to study mollusks, highlights the importance of haptic perception in perceiving objects. Haptic perception is the use of active touch to identify three-dimensional objects. Lederman and Klatzky showed that humans use specific exploratory procedures when identifying objects haptically. The physiological mechanism for haptics seems to be the firing of SA1 fibers, with more specific neural firing farther up the pathway. Pain perception is another topic area in the cutaneous senses. The three types of pain are inflammatory, neuropathic, and nociceptive (the focus of the rest of the chapter). The direct pathway model of pain is critiqued by specifying two ways that pain can occur that are problematic for the model. A better model is therefore the gate control model. The gate system is located in the dorsal horn of the spinal cord. Input received from nociceptors (via transmission cells) opens the gate, increasing pain perception. Mechanoreceptors carry information about non-painful stimulation (e.g., rubbing the skin) to the dorsal horn, working to close the gate. Cognitive information from central


control fibers also close the gate, effectively reducing pain. Four cognitive factors can affect pain level: (1) expectation, (2) shifting attention, (3) emotional distraction, and (4) hypnosis. The multimodal nature of pain distinguishes between the sensory and affective components of pain. Area S1 is important for the sensory component of pain perception, as is the anterior cingulate cortex (ACC) for the affective aspect. The role of opioids, naloxone, endorphins, and placebos in blocking pain is also discussed. Research also shows that humans empathetically feel the pain of others, as shown by increased ACC activity when watching someone else suffer.

Demonstrations, Activities, and Lecture Topics

(1) VR and Pain Management: Goldstein briefly mentions how playing a computer game affects pain level. Hunter Hoffman is a leading researcher in this field, using VR to help decrease pain in burn victims. He is also doing fascinating research on the experience of touch in virtual vs. actual physical environments. A comprehensive overview of his research can be found at his University of Washington website, and additional details (including photos, examples, and articles) can be found at the HITLab (Human Interface Technology Lab).

(2) Cutaneous Sensitivity: Goldstein also describes a quick and cheap way to demonstrate two-point cutaneous sensitivity. More precise two-point aesthesiometers are available from Power Medical Supplies (Item #16011). These are relatively low-cost, easily manipulated and readable, and can be used to demonstrate psychophysical methods. I usually test the cheek and the forehead, using the method of limits. Just be careful: the points are precise, and also pretty sharp. Other fun related equipment from Lafayette Instruments includes the pinwheel aesthesiometer (it can be painful: check with your IRB!) and an extensive cutaneous sensitivity kit (pressure, thermal stimulation, paradoxical heat) that comes with a booklet of suggested activities (Model 16010).

(3) Visual Capture and Touch: Power et al. describe an activity to show that visual capture occurs in haptics. (Previous chapters discussed how visual capture occurs in audition.) Basically, you need an open-sided box and a lens that will increase or decrease the visual size of an object. Select an object (they suggest a small block) and have participants actively touch the block through a piece of cloth while looking at it through the lens. Then have participants judge the size of the block. Most likely, they will make judgments based more on the visual cues than on haptics. Power, R. P. (1981). The dominance of touch by vision: Occurs with familiar objects. Perception, 10, 29-33. Power, R. P., Hausfeld, S., & Gorta, A. (1981). Workshops in perception. Boston: Routledge & Kegan Paul.
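If students collect method-of-limits data in the two-point demonstration, the threshold estimate itself is easy to compute: find the crossover point (midpoint between the last "two points" and first "one point" response, or vice versa) in each ascending or descending series, then average across series. A minimal Python sketch, using entirely hypothetical separations and responses for illustration:

```python
# Minimal sketch of estimating a two-point threshold with the method of limits.
# Separations (mm) and responses below are hypothetical, for illustration only.

def method_of_limits(series):
    """Each series is a list of (separation_mm, felt_two_points) trials in
    ascending or descending order. The crossover is the midpoint between
    the two trials where the response changed; the threshold estimate is
    the mean crossover across all series."""
    crossovers = []
    for trials in series:
        for (s1, r1), (s2, r2) in zip(trials, trials[1:]):
            if r1 != r2:  # response changed: record the crossover point
                crossovers.append((s1 + s2) / 2)
                break
    return sum(crossovers) / len(crossovers)

# One descending and one ascending series on the forearm (hypothetical data):
descending = [(50, True), (40, True), (30, True), (20, False), (10, False)]
ascending = [(10, False), (20, False), (30, False), (40, True), (50, True)]

threshold = method_of_limits([descending, ascending])
print(f"Estimated two-point threshold: {threshold} mm")  # prints 30.0 mm
```

Averaging ascending and descending series, as here, also lets you discuss errors of habituation and anticipation, since the two series types typically yield slightly different crossover points.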



(4) Tactile Versions of Illusions: The usual interest in visual illusions can be expanded by demonstrating and/or discussing how some visual illusions also occur haptically. One of the most researched is the Müller-Lyer illusion. An article by Millar and Al-Attar addresses important information in this topic area. Millar, S., & Al-Attar, Z. (2002). The Müller-Lyer illusion in touch and vision: Implications for multisensory processing. Perception & Psychophysics, 64, 353-365.

(5) Dowels and Object Perception: Another demonstration in this chapter that you can expand upon is the use of "tools" in haptic object perception. Instead of a pen, you can have students use 1/8-inch-diameter dowels (available at craft and hardware stores) as the tool. Blindfold the participants and have them sit as they use the dowel to try to identify objects. Observe their strategies and what information they use (vibrations through the stick, sound as they tap on the object, etc.). Try to use a wide range of stimuli: I have used a basketball hoop with net, a tripod, a blackboard eraser, a banana (that you don't plan on eating!), a stuffed animal, scissors, and a Skinner box.

(6) "FeelSpace" Project: A recent example of a sensory substitution device was developed by Peter König at the University of Osnabrück in Germany. He and his research team have created a belt that vibrates to let the wearer feel their orientation in space. For example, if the person is facing west, a vibration is felt on the left hip. Early indications are that long-term exposure to the belts improves performance on orientation tasks. See the website "FeelSpace: The Magnetic Perception Group."

(7) Pinocchio Illusion: Another interesting tactile illusion that has received much interest is the "Pinocchio illusion," in which vibration of the biceps tendon can result in participants feeling that the forearm is getting longer.
Lackner (1988) is the discoverer of the effect, but a Google search will provide more recent information. Lackner, J. R. (1988). Some proprioceptive influences on the perceptual representation of body shape and orientation. Brain, 111, 281-297.

Suggested Websites

Psychophysics of Haptic Interaction
The Microdynamic Systems Laboratory at Carnegie Mellon conducts a variety of research projects, typically integrating human and computer interactions. The Psychophysics of Haptic Interaction division aims to provide a high-fidelity haptic experience in a virtual world. In addition to providing descriptions of current projects and associated technology, the site summarizes data that reflect the effectiveness of the interactions.



Precision Microdrives: Haptic Feedback Overview
Precision Microdrives specializes in miniature motors. On their website they describe a variety of projects using these motors to provide haptic feedback, vibrations, and alerting systems. The site is full of technical information, including a blog.

Pain Management Research Institute
The Department of Anaesthesia and Pain Management website at the University of Sydney is an extensive, interdisciplinary source for information about pain management skills, pharmaceutical treatments, and other areas of research conducted by their faculty. Also of interest is information about their graduate programs in pain management.

Pediatric Pain and Palliative Care Center
Many hospitals have pain management centers. One of the leading centers specializing in children's pain is the David Center for Children's Pain and Palliative Care at Hackensack University Medical Center.

International Veterinary Academy of Pain Management
Another population that has received specialized attention for pain management is pets. The International Veterinary Academy of Pain Management works on treatments, causes, and behavioral effects of pain in dogs and other pets.

Scholarly Articles – Dr. Jim Taylor
Jim Taylor is a consultant specializing in the psychology of performance in business, sport, and parenting. Pain management is one important topic in sport psychology that he addresses. His article "Pain Education and Management in the Rehabilitation from Sports Injury," coauthored with Shel Taylor, has an excellent discussion of the different ways pain management is applied to athletes. It can be found under the "Pain Management in Injury Rehab" section.

Movie Scene – Star Trek: First Contact (1996) (20:20-21:11): In this scene, Picard (Patrick Stewart) and Data (Brent Spiner), an android, have time-traveled back to Earth and are looking at a pioneering spaceship that Picard remembers seeing in the Smithsonian Institution as a young boy.
He touches the ship, stating that actually touching it enhances the personal connection and makes it more real. Data also touches it, noting the design imperfections, and concludes that "it is no more real than without touch." Troi (Marina Sirtis) then appears, asking whether "you three" (Picard, Data, and the ship) would like to be alone. This scene highlights the importance of touch to humans, and our differences from computer-based sensory experience.

Movie Scene – Lawrence of Arabia (1962) (4:35-5:58): A classic scene from one of the greatest epic films of all time, this scene gives the viewer a sense of who Lawrence (Peter O'Toole) is. Lawrence is with other officers and allows a match to burn down to his fingers. Another officer, William Potter, tries it and finds the burning match against his fingers painful. "Of course it hurts," replies Lawrence. Potter then asks what the trick is. Lawrence replies that the "trick,


William Potter, is not caring that it hurts.” This scene provides a nice segue to psychological aspects of pain management.



CHAPTER 15: THE CHEMICAL SENSES

Chapter Outline

I. Introduction
   A. Some Questions We Will Consider
      1. Gatekeeper function of chemical senses
      2. Neurogenesis unique to chemical senses

II. Taste

III. Taste Quality
   A. Basic Taste Qualities
   B. Connections Between Taste Quality and a Substance's Effect

IV. The Neural Code for Taste Quality
   A. Structure of the Taste System
      1. Tongue
         a. Papillae: Four categories
            i. Filiform
            ii. Fungiform
            iii. Foliate
            iv. Circumvallate
         b. Taste buds in all papillae except filiform
         c. Taste cells and taste pores
      2. Processing Pathway
         a. Fibers from tongue, mouth, and throat
         b. Nucleus of the solitary tract (in brain stem)
         c. Thalamus
         d. Frontal lobe: Insula and frontal operculum cortex
   B. Population Coding
      1. Across-fiber patterns
   C. Specificity Coding
      1. Elimination of specific tastes
      2. Amiloride: blocks sodium selectively

V. Individual Differences in Tasting
   A. Video microscopy technique
   B. Tasters, non-tasters, supertasters

VI. Test Yourself 15.1

Section 1: Olfaction and Flavor

VII. The Functions of Olfaction
   A. Macrosmatic vs. microsmatic species (dogs vs. humans)
   B. Pheromones
   C. Isolated congenital anosmia (ICA)

VIII. Olfactory Abilities
   A. Detecting Odors
      1. Method: Measuring the Detection Threshold
         a. Forced-choice method
   B. Discriminating Between Odors
   C. Identifying Odors
      1. Demonstration: Naming and Odor Identification
   D. Individual Differences in Olfaction

IX. Analyzing Odorants: The Mucosa and Olfactory Bulb
   A. The Puzzle of Olfactory Quality
      1. Odor objects
      2. Two stages in odor object perception
         a. Analyzing: olfactory bulb and mucosa
         b. Synthesizing: olfactory cortex
   B. The Olfactory Mucosa
      1. Olfactory receptor neurons (ORNs) and olfactory receptors
   C. How Olfactory Receptor Neurons Respond to Odorants
      1. Method: Calcium Imaging
      2. Recognition profiles of each odorant
   D. The Search for Order in the Olfactory Bulb
      1. Glomeruli in olfactory bulb
      2. Method: Optical Imaging
      3. Method: 2-Deoxyglucose (2DG) Technique
      4. Chemotopic/odor/odotopic map

X. Representing Odors in the Cortex
   A. Olfactory areas outside the olfactory bulb
      1. Piriform cortex – primary olfactory area
      2. Orbitofrontal cortex (OFC) – secondary olfactory area
      3. Amygdala – general emotional responses
   B. How Odorants Are Represented in the Piriform Cortex (PC)
      1. Widespread activation pattern in PC
   C. How Odor Objects Are Represented
      1. "Learning" in the PC
         a. Mixture and component
      2. Dual pathway:
         a. Odor objects depend on experience: piriform cortex
         b. Innate responses to chemicals (e.g., pheromones): separate pathway

XI. Test Yourself 15.2

XII. The Perception of Flavor
   A. Demonstration: "Tasting" With and Without the Nose
   B. Taste and Olfaction Meet in the Mouth and Nose
      1. Retronasal route
      2. Nasal pharynx
      3. Oral capture
   C. Taste and Olfaction Meet in the Nervous System
      1. Bimodal neurons in the OFC
   D. Flavor Is Influenced by Cognitive Factors
   E. Flavor Is Influenced by Food Intake: Sensory-Specific Satiety
      1. Sensory-specific satiety

XIII. Something to Consider: The Proust Effect: Memories, Emotions, and Smell
   A. Relationship between odors and specific aspects of memory
      1. Proust effect
      2. Developmental Dimension: Infant Chemical Sensitivity
   B. Newborns can discriminate some smells and tastes
   C. Infants develop acceptance to tastes
   D. Effect of pregnant women's diet on infant's taste preferences

XIV. Test Yourself 15.3

XV. Think About It

XVI. Key Terms

Learning Objectives
At the end of the chapter, the student should be able to:
1. Identify the functions of taste, the basic taste qualities, and the structures and functions of the taste system.
2. Critique the evidence for specificity vs. distributed coding in taste.
3. Describe how anosmia demonstrates the importance of olfaction, and how odor detection thresholds are measured.
4. Explain why a dog's odor detection is better than that of a human.
5. Identify the structures and functions of the olfactory system.
6. Discuss how recognition profiles affect receptors.
7. Describe the 2DG technique and how the results can be used to reveal activation patterns in the olfactory bulb.
8. Provide an example of the direct link between odor perception and the features of molecules.
9. Outline how odorants are represented in the piriform cortex and the process by which odor objects come to be represented there.
10. Discuss how "tasting" is affected when olfaction is eliminated.
11. Describe the physiology of flavor and how expectations and satiety affect flavor.
12. Summarize the Proust effect and related research findings.
13. Discuss infant detection of scents and tastes, and early influences on the development of flavor preference.

Chapter Overview/ Summary The chemical senses of olfaction and taste are considered to be “gate-keepers”: identifying what the body needs for nourishment and what the body needs to keep out to prevent danger. Another unique feature of the chemical senses is neurogenesis, the constant renewal of the receptors. Taste is discussed first. Taste serves as a gatekeeper and helps provide information about the effect a substance will have on the body. Five basic tastes have been identified: sweet, bitter, sour, salty, and umami. The perception of these tastes is then explored from a physiological perceptive starting with the tongue (papillae and taste receptor sites) and going all the way to the brain (insula, frontal operculum cortex, and orbital frontal cortex). The next issue addressed deals with the representation of taste. Evidence exists for both distributed and specificity coding. One potential resolution is that basic taste qualities are the result of specificity coding; distributed coding is responsible for more subtle differences. Although taste is experienced in similar ways across individuals, there is one major difference that has been investigated in relative detail: “tasters” vs. “non-tasters”. Tasters are sensitive to bitter tastes (PROP is typically used as the stimulus) compared to non-tasters. Especially sensitive tasters are classified as supertasters. The chapter then moves on to describe the other chemical sense, olfaction. The importance of olfaction is addressed first. There are two general types of “use” for the olfactory system. Macrosmatic species (e.g., dogs) rely heavily on olfaction for communication, guiding orientation, and mating; microsmatic species (e.g., humans) also use olfaction, as evidenced by sensitivity to olfactory cues of hormonal activity and the challenges that human anosmics face. For humans, there is a wide range of odor detection thresholds for different substances. 
Dogs have greater odor sensitivity than humans because of the number of receptors they have, not because their individual neurons are more effective, efficient, or sensitive. Odor discrimination has also been measured in humans; the difference threshold between two odor concentrations is about 11%. Odor identification, in which the name of the odor must also be retrieved, is more difficult than you might expect and is easily influenced by experience and knowledge: having a label for an odor can affect both identification and perception. Along with the challenge of identifying odors, it has been difficult to devise a classification system for odors because we lack the words to describe many of these sensations (e.g., “woody violet” vs. “sweet violet”). As in other modalities, segmenting different odors can be a challenge; the formation and use of “odor objects” helps with this task. The physiology of the olfactory system is then discussed, particularly the action of the olfactory mucosa (including odorant recognition profiles) and the olfactory bulb (including the chemotopic map). Parallels drawn between the olfactory and visual systems help demonstrate the nature of recognition profiles and the chemotopic map. More complex representations of odor are developed in the piriform cortex and are likely composed of distributed associations that strengthen with repeated exposure. The last major section of the chapter investigates flavor, the combination of taste and olfaction. The importance of olfaction to “taste” can be demonstrated by trying foods without olfaction: most substances need olfaction to be judged accurately, one exception being MSG. Interestingly, humans have a bias to attribute flavor to taste because the mouth is where most of the tactile experience takes place; this is referred to as oral capture. Physiologically, signals from taste, olfaction, and other senses converge in the orbitofrontal cortex (OFC), which contains bimodal cells capable of responding to multiple senses. Apart from physiology, flavor is affected by past experience, expectations (e.g., cheap vs. expensive wine), and satiety. In addition to producing the complex experience of flavor, the chemical senses can elicit strong emotional responses; the Proust effect epitomizes this tendency. Emotion-laden memories associated with tastes and scents implicate a prominent role for the amygdala and hippocampus in olfaction and taste. The chapter concludes with a developmental look at olfaction and taste. Newborns can differentiate pleasant odors (banana) from unpleasant odors (rotten eggs). They can also distinguish among bitter, sweet, and sour tastes; salt taste develops later with experience. In addition to this capacity to discriminate scents and tastes, research indicates that flavor preferences can be influenced by the scent of amniotic fluid and by flavors in breast milk.
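For instructors who want to make the ~11% odor difference threshold concrete in lecture, the comparison can be framed as a Weber-fraction calculation. The sketch below is illustrative only: the 11% figure comes from the chapter, but the function name and the decision rule are assumptions for classroom demonstration, not part of the text.

```python
# Minimal sketch of the odor difference-threshold (Weber fraction) idea.
# WEBER_FRACTION uses the chapter's approximate 11% value; the function
# and its name are hypothetical, for illustration only.

WEBER_FRACTION = 0.11

def is_discriminable(concentration_a: float, concentration_b: float,
                     weber_fraction: float = WEBER_FRACTION) -> bool:
    """Return True if the proportional difference between two odorant
    concentrations exceeds the Weber fraction (i.e., is likely detectable)."""
    base = min(concentration_a, concentration_b)
    if base <= 0:
        raise ValueError("concentrations must be positive")
    return abs(concentration_a - concentration_b) / base > weber_fraction

# A 10% difference falls just below the ~11% threshold; 15% exceeds it.
print(is_discriminable(100.0, 110.0))  # False
print(is_discriminable(100.0, 115.0))  # True
```

Students can vary the concentrations to see that detectability depends on the proportional, not absolute, difference between the two stimuli.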

Demonstrations, Activities, and Lecture Topics

(1) Video Suggestions: “Taste and Smell,” available from Insight Media, covers the basic structures and neural processing of the chemical senses. Also, National Geographic’s Explorer series has a great program, The Science of Dogs, which addresses dog olfaction. Ordering information and a summary of the episode can be found on their website.

(2) How Did They Do It?: McClintock and Menstrual Synchrony: Although it is not discussed in the chapter, there is evidence that detection of hormonal states can lead to synchronous menstruation in women. The story of McClintock’s menstrual synchrony study is told in Miriam Horn’s Rebels in White Gloves, and is also accessible under “Menstrual Synchrony and Suppression” at the “Museum of Menstruation” website. In brief, McClintock was a 20-year-old junior at Wellesley when she attended a conference on pheromones, where the researchers (all male) concluded that pheromones did not affect humans. She intuitively believed otherwise: based on her observations of menstrual synchrony, she believed that pheromones were responsible. Within the next couple of years, she conducted her pioneering research, publishing it at the age of 23. Another inspiring and insightful story of discovery for students!

(3) Pheromones and Sexual Orientation: Another well-publicized study on pheromones in humans was conducted by Savic et al. (2005), who reported that brain activity in response to odors believed to be pheromones (a substance in men’s sweat, and a compound found in women’s urine) differs between homosexual and heterosexual men. Furthermore, the gay men’s activity was similar to heterosexual women’s responses. This article was published in The Proceedings of the National Academy of Sciences, and several popular media articles have reported the finding; a quick search will turn up a number of them.

(4) Why Are Peppers Hot?: Another substance students tend to be interested in is red hot chili peppers. Scientific American Frontiers explored this with a segment entitled “Life’s Little Questions: Why Are Peppers Hot?”. In addition to the general information, you can “Ask an Expert” about the same topic. Linda Bartoshuk is one such expert; you can see her Q&A session by searching for her specifically.

(5) Olfactory Imagery: Olfactory imagery is another topic where perception overlaps with other areas of psychology. A review by Stevenson and Case is a good supplemental reading for students: Stevenson, R. J., & Case, T. I. (2005). Olfactory imagery: A review. Psychonomic Bulletin & Review, 12, 244-264.

(6) General Interest Books: For a “minor sense,” there is quite a bit of interest in olfaction and taste. Three books that can be used for additional readings are Rachel Herz’s The Scent of Desire (2007); Avery Gilbert’s What the Nose Knows; and, for an interdisciplinary approach, Jonah Lehrer’s Proust Was a Neuroscientist. Each book has chapters that can stand alone, without assigning the whole book.

(7) Higher-Order Processing and Taste Perception Activity: This excellent activity on the factors involved in taste was posted by Deb Brihl from Valdosta State University: “One easy experiment would be to look at the different factors that influence our sense of flavor. You could have them try to determine the taste of jelly beans and have them remove smell and/or vision. Here is what I have done – if you have one of those stores in which you can buy jelly beans by flavor, great. Get some different flavors.
I always get some in which the obvious color indicates the flavor (such as blue for blueberry) and some in which the color does not indicate the most obvious flavor (red for apple - it is amazing how many get that wrong when they use vision) - just a general mix. Students eat a jelly bean and try to guess the flavor when they are able to use both smell and vision, some with just smell, some with just vision, and some with neither. They can then discuss the importance of both senses in our perception of taste.” If you are daring enough, you can also grab some of the “Harry Potter Bertie Bott’s Every Flavor Jelly Beans” and see what the effect of labeling is on taste perception. In addition to usual flavors like “lemon drop,” “banana,” and “blueberry,” Bertie has included “booger,” “ear wax,” “sardine,” and “vomit.” I must admit, I have a box, but I’ve never tried one, which is itself instructive: just labeling something “booger” yields an affective response, even without tasting it!

© 2017 Cengage Learning. All Rights Reserved. May not be scanned, copied or duplicated, or posted to a publicly accessible website, in whole or in part.


(8) Higher-Order Processing and Taste Perception Humor: One other way of introducing this topic is with the following (somewhat risqué) joke: A college professor was doing a study testing the senses of first-graders using a bowl of LifeSavers. He gave all the children the same kind of LifeSaver, one at a time, and asked them to identify it by color and flavor. The children began to say: “Red… cherry”; “Yellow… lemon”; “Green… lime”; “Orange… orange.” Finally, the professor gave all the children honey LifeSavers. After eating them for a few moments, none of the children could identify the taste. “Well,” said the professor, “I’ll give you a clue. It’s what your mother may sometimes call your father.” One girl looked up in horror, spit hers out, and yelled, “Everybody, spit them out... they’re a**holes!!!!”

(9) Development of Taste and Obesity: An important application of the issues in this chapter is how the development of a taste for fruits and vegetables can promote healthy eating later in life. Catherine Forestell and Julie Mennella are prominent researchers in this area. One recent representative article: Forestell, C. A., & Mennella, J. A. (2007). Early determinants of fruit and vegetable acceptance. Pediatrics, 120(6), 1247-1254.

Website Suggestions

Monell Chemical Senses Center: The Monell Chemical Senses Center has just about anything you might want to know related to this chapter!

Ripe Sense: A recent technological advance is the “ripesense” sensor, which can alert shoppers if the fruit in the store is ripe. The relationship between human and non-human sensors can be discussed in light of this development.

Tim Hanni – Swami of Umami: This blog entry provides a good introduction to individual differences in “tastes” in wine. There is some basic mention of concepts discussed in the chapter, which could be used as discussion points.

BBC Human Senses: This BBC website has nice activities and information, including questionnaires on “Are You a SuperTaster?” and “Sniffing the Decades” (“nostalgic” odors). “Sniffing the Decades” does not cover a range of decades relevant to current traditional-age college students, but I have had my students use it as an example and report five odors that they think typify their decade.

Aromatherapy – The Balance & Harmony of Body and Mind: Certainly a field that needs a healthy dose of skepticism is aromatherapy. This website can be used to introduce students to the claims of aromatherapy supporters.

Monell Center – Nutrition and Appetite: Julie Mennella and Gary Beauchamp are two leading researchers on infant taste preferences. Links to their extensive research can be found here.

Movie Scene – 9 ½ Weeks (1986) (DVD scene 13): “Elizabeth’s Feast” is the now-classic (and occasionally parodied) scene in which a mystery man (Mickey Rourke, yes, that Mickey Rourke!) starts a relationship with the more reserved Elizabeth (Kim Basinger). In this scene, Rourke asks Basinger to close her eyes and then gives her a variety of taste and olfactory stimuli (hot pepper, milk, cough syrup, etc.). The scene illustrates the relationship between (the lack of) vision and taste, the relationship between taste and romance, and the different types of sensory experience involved in taste (texture, shape, and taste qualities).
