
CH. 3 SENSATION and PERCEPTION

The vast majority of information we acquire throughout life begins as environmental energy. Our ability to detect environmental energy, however, is very limited relative to the immense amount and tremendous diversity of energy available.

The distinction between energy that is detectable versus energy that is not (both in type and intensity) is the defining feature of a stimulus.

The interaction between an organism and the external world is not static, but a dynamic (hence, ever-changing) process. In order to function purposefully in an ever-changing environment, it is necessary to detect the energies around us.

Sensation vs. Perception

Sensation is best defined as the detection of some physical stimulus in the environment by one of your sensory organs. A stimulus is simply the scientific term for anything in the environment that acts on you and can influence your behavior or mental processes.

 

Sensations refer to your body’s ability to interact with specific types of energy from the environment like light, sounds, tastes, etc. Once this information reaches the brain, meaning is attached to it, which is the defining feature of a perception.

 

Perception is your brain making sense of the physical stimulus from the world by organizing the stimulus into a representation of something useful.

 

Sensation and perception are both important for what you might consider a normal, everyday experience. Perception is not generally possible without sensation, because the sensory machinery that gathers physical stimuli from the environment is a necessary first step.

Stimulus:

A quantifiable pattern of physical energy, which is able to interact with an organism and produce a change in the condition of the organism. A stimulus is a type of environmental energy, like light, that we are capable of detecting and responding to.

Sensation:

The detection of physical stimuli in the environment such as light waves, sound waves, pressure, or chemical molecules.

Perception:

The detailed process of interpreting and making sense of a combination of sensations.

Hallucinations are perceptions of experiences without corresponding external stimuli, together with a compelling feeling that these are real. They are among the most significant and debilitating symptoms of schizophrenia.

There are many different types of hallucinations; the most common include:

 

Auditory hallucinations: when someone hears something that is not there, such as a voice or TV.

 

Visual hallucinations: when someone sees something that is not real, such as a person or creature.

 

Olfactory hallucinations: when someone smells something that is not there, such as smoke or coffee.

 

Gustatory hallucinations: when someone tastes something they did not eat, like metal or spoiled milk.

 

Tactile hallucinations: when someone feels like something or someone touched them, like being grabbed or pulled.

 

Somatic hallucinations: when someone feels something within their body, such as that of bugs crawling under the skin or like an object trapped in their abdomen.

Sensation vs. Perception

Term Definition and Description
Sensation Sensations refer to certain, immediate, and directly qualitative experiences or attributes such as hard, warm, sweet, red, or bright, produced by simple isolated physical stimuli. Specialized receptor cells form the basis for what most people understand as sensations, and it is these sensory experiences that ultimately give rise to our perceptions. Sensations always travel in an ascending fashion, starting from a particular sensory receptor and terminating in the brain. Sensations are produced by stimuli.
Stimulus A stimulus is defined as a quantifiable pattern of physical energy, which is able to interact with an organism and produce a change in the condition of the organism. That is, a stimulus is a type of environmental energy, like light, that we are capable of detecting and responding to.
Potential Stimuli Potential stimuli are physical energies that have yet to be detected but are in fact detectable, like a distant sound. If a pattern of physical energy from the environment cannot cause a change in the organism, it is not considered a stimulus for that organism. For example, certain animals can detect UV light, magnetic fields, or very high-pitched sounds that humans cannot, so such patterns of energy are not stimuli for humans.
Perceptions Perceptions refer to the psychological processes whereby meaning, past experience or memory, and judgments are used to evaluate the significance of particular stimuli. For example, how is it that we are able to, in a crowded and loud room, attend to our own name being spoken but not to other names? Obviously, our names have greater significance and history than other names. In addition, perceptions are associated with the organization and integration of sensory attributes. For example, flavor is a perception and is the culmination of taste, smell, and texture.


 

Sensation vs. Perception: The Process

Getting from sensation to perception is also a fascinating and important process. The physical energy (stimulus) must be modified into a form the brain can use via transduction.

Transduction:

The process of converting a physical stimulus into a meaningful and useful neural signal capable of being interpreted by the brain.

This process takes place in the following order: stimulation, reception, transduction, transmission, and perception.

 

All sensory receptors have the ability to take information from the world around us and convert it into neural codes that generate neural activity our brain is able to interpret.

Our sensory systems are particularly skillful in detecting contrasts or changes in the environment and are particularly poor at detecting constant stimulus energy.

Source: Brian Kelley

Sensation vs. Perception: The Process

Modality Five major sensory modalities have been recognized since antiquity: vision, hearing, taste, touch, and smell; however, there are actually more. Furthermore, each modality has sub-modalities (e.g., for taste: sweet, sour, bitter, or salty).
Intensity Intensity or the amount of a sensation depends on the strength of the stimulus. At the receptor level, stimulus intensity is influenced by two factors. The first relates to the total number of receptors activated (spatial coding), while the second relates to the output generated by a single receptor or a group of receptors (temporal coding). The lowest stimulus intensity a subject can detect is defined as the Sensory Threshold. Interestingly, sensory thresholds are not stable across time nor are they similar across different people.
Duration The duration of a sensation is shaped by the relationship between stimulus intensity and perceived intensity over time. Essentially, if a stimulus persists for a sustained period of time, its perceived intensity decreases over time. This phenomenon is known as adaptation. Sensory systems are best at detecting change, not constant stimulation.
Location There are two important measurements of a person’s ability to detect spatial aspects of a sensory experience: (1) the ability to locate the source/site of stimulation, and (2) the ability to distinguish between two closely spaced stimuli.

When stimulus energy is transduced by the sensory receptor into neural energy, specific features of the stimulus, such as intensity and duration, are represented in the resultant pattern of action potentials.
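As a rough illustration of how stimulus intensity might be represented in a pattern of action potentials, the toy model below encodes intensity as a firing rate that grows logarithmically and then saturates. The specific numbers and the logarithmic form are illustrative modeling assumptions, not values from this chapter.

import math

def firing_rate(intensity, threshold=1.0, gain=20.0, max_rate=200.0):
    """Toy rate-coding model: spikes per second as a saturating,
    logarithmic function of stimulus intensity (arbitrary units)."""
    if intensity <= threshold:
        return 0.0                      # below the sensory threshold: no response
    rate = gain * math.log(intensity / threshold)
    return min(rate, max_rate)          # the firing rate cannot grow without bound

for stimulus in (0.5, 2, 10, 100, 10000):
    print(stimulus, "->", round(firing_rate(stimulus), 1), "spikes/s")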

Despite their diversity, all sensory systems and subsystems extract the same basic information from stimuli, including modality, intensity, duration, and location.

Most of the sensory systems include the following features:

Environmental energy

Detection of environmental energy

Activation of sensory receptors

Transduction of energy

Neural encoding

Mapped organization

Neural pathways

Neural relays

Sensory subsystems

Central or multisensory integration

Sensation vs. Perception: The Process

The common pathway of the sensory systems: from receptor to awareness

As the body transforms environmental energy into perceptions, the amount and complexity of the information changes. At each step, the nervous system systematically reduces the amount of information while increasing its complexity.

Source: Brian Kelley

 

Sensory receptors are sensitive to a specific form of physical energy (e.g., light, sound, pressure, or movement). Despite their apparent diversity, sensory receptors can be broadly classified into three categories:

 

Exteroceptors

Proprioceptors

Interoceptors

Another way to categorize sensory receptors is based on the type of energy they transduce or are activated by, and this breakdown includes the following:

Sensation vs. Perception: The Process

The common pathway of the sensory systems: from receptor to awareness

Exteroceptors:

Receptors that respond to environmental energy or stimuli originating outside one's body, such as light, sound, touch, and chemical agents.

Proprioceptors:

Sensory receptors that are activated by muscular movement or passive displacement of body parts.

Interoceptors:

Receptors that respond to materials inhaled, ingested, or passed, and to changes in chemical surroundings, mechanical pressure, or shearing force.

Photoreceptors Receptors that are sensitive to radiant electro-magnetic energy (light).
Mechanoreceptors Receptors that sense deformations and motion of solids, liquids and gases. Mechanical forces are those that tend to deform or accelerate objects possessing mass.
Chemoreceptors A class of receptors that detect water-soluble and lipid-soluble chemicals.
Thermoreceptors Receptors that are sensitive to changes in temperature.
Nociceptors A type of receptor that responds to painful stimuli or stimuli that are capable of causing tissue damage.

Sensation vs. Perception: Method

There are two basic methods of studying the sensation and perception of individuals, both of which involve measuring the limits of sensation, otherwise known as thresholds. These methods are the absolute threshold and the difference threshold.

A person's absolute threshold is measured by starting with a very weak stimulus and gradually increasing its strength until the person correctly detects the stimulus's presence 50% of the time.

A difference threshold is estimated by comparing the intensity of two stimuli, and gradually increasing the difference between their intensities until a difference can be detected by the person.

A fascinating aspect of the difference threshold is that it is not fixed, but relative. Weber’s law suggests the difference threshold between two stimuli is relative to the size of the original stimulus.

Weber’s Law:

A principle in sensation that suggests that the size of the difference threshold is relative to the strength of the original stimulus.
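As a concrete illustration of Weber's law, the sketch below computes the just-noticeable difference as a constant fraction of the starting intensity. The Weber fraction of about 2% for lifted weight is an assumed example value, not a figure given in this chapter.

def just_noticeable_difference(intensity, weber_fraction):
    """Weber's law: the smallest detectable change (JND) is a constant
    fraction of the original stimulus intensity."""
    return weber_fraction * intensity

# Illustrative Weber fraction of about 2% for lifted weight (an assumption
# for demonstration only).
k = 0.02
for grams in (100, 1000, 5000):
    print(f"Baseline {grams} g -> JND of about {just_noticeable_difference(grams, k):.0f} g")

Notice that the heavier the baseline weight, the larger the change needed before a difference is noticed, which is exactly what the relative (rather than fixed) difference threshold describes.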

Difference Threshold:

A method used to study the sensitivity of sensation; the smallest difference between two stimuli that can be correctly detected 50% of the time; this is also called the just-noticeable difference.

Absolute Threshold:

A method used to study the limits of sensation; the smallest amount of a physical stimulus that can be correctly detected 50% of the time.
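As a minimal sketch of how the 50% criterion can be applied in practice, the code below takes hypothetical detection data (the proportion of "yes" responses at each intensity, which are made-up numbers for illustration) and interpolates the intensity at which detection first reaches 50%.

def absolute_threshold(intensities, detection_rates, criterion=0.5):
    """Linearly interpolate the stimulus intensity detected on
    `criterion` (default 50%) of trials."""
    points = list(zip(intensities, detection_rates))
    for (i1, p1), (i2, p2) in zip(points, points[1:]):
        if p1 < criterion <= p2:
            # linear interpolation between the two bracketing points
            return i1 + (criterion - p1) * (i2 - i1) / (p2 - p1)
    return None  # criterion never reached in the tested range

# Hypothetical data: stimulus intensity (arbitrary units) vs. proportion detected
intensities = [1, 2, 3, 4, 5, 6]
detected = [0.05, 0.15, 0.35, 0.60, 0.85, 0.95]
print(absolute_threshold(intensities, detected))  # about 3.6 in these made-up units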

Sensation vs. Perception: Method & Adaptation

The sensitivity of sensory receptors can be affected by the duration of a stimulus.

 

Our sensory receptors become less sensitive when exposed to a constant stimulus for a certain amount of time.

Sensory adaptation is a decline in a sensation's sensitivity resulting from the presence of a constant stimulus.
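One simple way to picture sensory adaptation is as a gradual decline in responsiveness while a stimulus is held constant. The exponential form and the time constant used below are illustrative modeling assumptions, not findings reported in this chapter.

import math

def adapted_response(initial_response, elapsed_seconds, time_constant=5.0):
    """Toy adaptation model: the response decays exponentially toward zero
    while a constant stimulus remains present."""
    return initial_response * math.exp(-elapsed_seconds / time_constant)

for t in (0, 1, 5, 15, 30):
    print(f"t = {t:2d} s -> response {adapted_response(100.0, t):6.1f}")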

There are also a number of non-sensory factors that can affect the observer’s performance in a signal/sensory detection task:

Motivation

Attention

Experience

Fatigue

Expectation

The Senses: Vision

While anecdotal, the power of light and color has become an integral component of our everyday language. Clearly, we connect how we feel with light, such as feeling “bright” or “dark” and even more specifically we associate and explain how we feel with color.

Color Summary

Blue: sad, depressed, melancholy

Red: angry, aroused

Green: jealous, sick, a beginner

Black: morbid, powerful

White: scared, angelic

Yellow: sick or cautious

Grey: mature

Feminine: pinks

Masculine: blues

© artellia/Shutterstock.com

 

Consider the issue around screen time. Estimates suggest more than 14% of optometry visits are related to eye or vision problems resulting from computer, tablet, or smartphone use.

 

One of the most significant causes of Computer Vision Syndrome is decreased ocular lubrication, also known as tears. Decreased ocular lubrication is caused primarily by decreased blink rate. Blinking is very important because each blink works to coat the eye with tears. Several studies have shown that people typically blink about 12 times per minute; however, blink rate is typically reduced by half for computer users.

The Senses: Vision

Ocular Lubrication:

The eye is able to produce its own surface fluid, which keeps the eye moist; natural tears are the mechanism for this process.

The Senses: Vision

Office settings with laptop computers also increase exposure of the cornea while reducing tear coverage.

 

Inappropriate computer monitor distance and angle are additional causes of eye strain and Computer Vision Syndrome. Focusing on close objects, compared to distant objects, requires more ocular muscles and those muscles have to work harder.

 

The lens is a biconvex crystalline structure that is actually quite stiff and difficult to stretch, so it shouldn’t be surprising to learn that eye fatigue occurs quickly after constant near viewing.

 

Not only do the ciliary muscles have to work hard, but so do the ocular muscles. Close viewing requires the eyes to converge (i.e., move toward one another), much like what you see when someone "crosses" their eyes, though not quite as dramatically.

Cornea:

The transparent surface at the front of the eye.

Lens:

A biconvex crystalline structure that helps focus the visual image onto the retina in the back of the eye.

Ciliary Muscles:

The muscles within the eye that stretch or compress the lens for the purpose of focusing the visual image.

Converge:

The ability of the two eyes to rotate inward toward the nose; this is often referred to as being cross-eyed.

There are significant differences between computer reading and paper reading.

 

Even the highest-quality computer screens have lower pixelation compared with paper.

 

Pixelation is a physical measure of resolution (the sensory component; what we see), which is described in psychological terms as acuity (the perceptual component; how we see it) or described in everyday language as detail.

 

Virtually no glare occurs from paper, while it can be considerable with computer monitors.

Comparison of Computer Monitors vs. Printed Material
Variable Computer Print
Resolution Generally low Generally high
Contrast Typically low Typically high
Text Color/Blackness Poor Rich
Glare Moderate to high Very low
Flicker Moderate to high None
Ergonomics Hard to manipulate Very easy to manipulate


 

The Senses: Vision

10 Simple Steps to Protecting Yourself from Computer Vision Syndrome:

 

Take a break every 20 min for 20 seconds and focus 20 feet away.

 

Blink! Blink! Blink!

 

Use artificial tears if necessary

 

Visit eye doctor regularly

 

Adjust monitor distance and height

 

Adjust monitor brightness and contrast

 

Minimize Glare

 

Control excessive overhead light

 

Clean monitor frequently

 

For longer projects, read from paper if possible

Light, a form of electromagnetic radiation, is the fundamental energy source for our visual system.

 

Simply stated, the human eye is capable of collecting, detecting, transducing, and encoding electromagnetic radiation. The eye is the single most complex sensory organ.

Vision: Light Waves

The wavelength of a light wave—the distance from peak to peak—is the physical property of the light wave stimulus that we would perceive as color.

While all light follows the same basic laws, light does not affect all objects equally. However, there is so much more to light than what we see. Our visual system is only capable of transducing a very small fraction of the entire range of electromagnetic radiation.

While we use different names to describe light, there is no fundamental difference between light at one end of the spectrum versus light at the other end aside from wavelength dependent properties such as frequency and energy.
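As a hedged illustration of the relationship between wavelength and perceived color, the sketch below converts a wavelength to its frequency (frequency = speed of light / wavelength) and maps it onto approximate visible-spectrum bands. The band boundaries are common textbook approximations assumed for illustration, not values specified in this chapter.

SPEED_OF_LIGHT = 3.0e8  # meters per second (approximate)

# Approximate visible-spectrum bands in nanometers (commonly cited values).
COLOR_BANDS = [
    (380, 450, "violet"),
    (450, 495, "blue"),
    (495, 570, "green"),
    (570, 590, "yellow"),
    (590, 620, "orange"),
    (620, 750, "red"),
]

def describe_light(wavelength_nm):
    """Return the frequency (Hz) and approximate perceived color
    for a wavelength given in nanometers."""
    frequency = SPEED_OF_LIGHT / (wavelength_nm * 1e-9)
    for low, high, name in COLOR_BANDS:
        if low <= wavelength_nm < high:
            return frequency, name
    return frequency, "outside the visible range for humans"

print(describe_light(680))   # long wavelength, lower frequency -> red
print(describe_light(450))   # short wavelength, higher frequency -> blue
print(describe_light(1000))  # infrared: detectable by some animals, not by humans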

© brgfx/Shutterstock.com

© Designua/Shutterstock.com

Vision: From the World to the Eyes

Photoreceptor:

A type of sensory receptor specifically for vision, which is located on the retina at the back of the eye.

Retina:

A light-sensitive membrane at the back of the eye that contains the sensory receptors for vision.

Parasympathetic (e.g., “the rest and digest” system) activation allows for pupil constriction, whereas sympathetic (e.g., the “fight or flight” system) activation allows for pupil dilation.

 

Light waves travel through the opening in the front of the eye (the pupil) and are processed on a structure at the back of the eye, the retina. The sensory receptor for vision is called a photoreceptor and many of these rest on the retina.

Light waves bounce off objects in the physical world and enter our mental world through small holes at the front of the eyes. The colorful area of the eye that is documented on your driver’s license is called the iris, and it is the fibrous muscular structure that contains these small holes.

 

These holes, the pupils, are the black circles that expand and contract depending on whether you are inside or outside in the sun. Keep in mind, pupils are not structures at all, but simply openings in the iris.

 

Pupil diameter is also influenced by stress and arousal (like lying) and is controlled by the autonomic nervous system.

© Miro Kovacevic/Shutterstock.com

 

Vision: Rods and Cones

There are two basic types of photoreceptors in the human eye: rods and cones.

 

Rods are rod shaped and better at processing dim light, which may be experienced at dusk or at night.

 

Cones are cone shaped and are specialized to process colorful images in very high detail. Cones also require substantial light in order to properly work.

Rods:

A specific group of photoreceptors that are specialized to process dim light and are useful for night vision and peripheral vision.

Cones are mostly located in the fovea, an area on the retina associated with the center of vision. Visual acuity is highest in this region. When we look at an object, the center of that object is located in the fovea on our retina.

 

The number of cones significantly decreases as we move from the fovea toward the edges of the retina. These edges of the retina contain far more rods than cones.

Cones:

A specific group of photoreceptors that are specialized to process color and are useful for daylight vision and high visual acuity.

© Sakurra/Shutterstock.com

Vision: From the Eye to the Brain

Optic Nerve:

A large bundle of axons that leaves the back of the eye and carries visual information to the visual cortex of the brain.

Blind Spot:

A gap in the retina due to the exit of the optic nerve where no photoreceptors are located; this causes a blind spot in the visual field during sensation.

Notice in the figure that there is a hole in the retina where the optic nerve must exit the inside of the eye.

 

In this location there are zero rods or cones.

 

Therefore, the point at which the optic nerve leaves the eye—the blind spot—is quite literally not sensitive to light waves because there are no photoreceptors on that part of the retina.

Instead, neighboring photoreceptors help fill in the empty areas in perception.

 

After leaving the eye, the optic nerves meet in the brain at a point called the optic chiasm. Here, the optic nerve from each eye splits into two smaller bundles of axons, one carrying information from the left visual field and one from the right, and these are sent to different regions of the brain for higher-order perception.

Optic Chiasm:

The point in the brain at which the optic nerves from each eye meet and partly cross over to the other side of the brain.

© Left Handed Photography/Shutterstock.com

Senses: Audition (Hearing)

Sound Waves

The physical stimulus for audition comes in the form of a wave: a sound wave.

 

Sounds are produced by vibrations (sound can only take place in a medium–gas, solid, or liquid), and these vibrations radiate outward from the source, with alternating peaks and valleys of pressure. Otherwise stated, sound is a result of cycles of compressions and rarefactions.

Wavelength:

The linear distance between two successive compressions or peaks in a wave.

The frequency of the wave, or the number of peaks that occur during a defined unit of time, determines the pitch. Wavelength and frequency are inversely related.

 

The amplitude of the wave is the maximum change in air pressure. Amplitude is correlated with the psychological attribute loudness. Loudness is related to intensity, and intensity is related to pressure.

 

Sound waves are analyzed by the number of cycles that occur per second, measured in hertz (Hz). Humans can usually hear sound waves that cycle between 20 and 20,000 hertz.
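Because wavelength and frequency are inversely related, the audible range can also be expressed as a range of wavelengths. The sketch below assumes a speed of sound of roughly 343 m/s in air, a standard approximation rather than a figure from this chapter.

SPEED_OF_SOUND = 343.0  # meters per second in air at about 20 degrees C (approximate)

def wavelength_m(frequency_hz):
    """Wavelength (meters) of a sound wave: speed divided by frequency."""
    return SPEED_OF_SOUND / frequency_hz

for hz in (20, 440, 20000):
    print(f"{hz:>6} Hz -> wavelength of about {wavelength_m(hz):.3f} m")
# 20 Hz corresponds to roughly a 17 m wave; 20,000 Hz to under 2 cm.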

Amplitude:

The amount of vibration or pressure in a sound wave, often referred to as loudness.

© Fouad A. Saad/Shutterstock.com

Audition (Hearing): Sound Waves

The top image shows sounds of equal loudness but different pitches, while the bottom shows the same pitch at different loudness levels.

Source: Brian Kelley

Audition (Hearing): Sound Waves

A decibel (dB) is the scientific unit of measurement used to describe the loudness of a particular sound. The decibel scale is set to a minimum of 0 dB, which is roughly the quietest sound audible to the human ear.

Decibels Event Danger
0 Lowest audible sound
30 Whispering in a library
65 Normal conversation
85 City traffic inside a car Danger with prolonged exposure
105 Lawnmower Danger after 2 hours of exposure
125 Balloon popping Sounds become painful
140 Loudest sporting event recorded
142 Jet engine at 100 feet Short-term exposure may cause permanent hearing loss
160 Shotgun blast Instant perforation of eardrum
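Because the decibel scale is logarithmic, each 10 dB step in the table above corresponds to a tenfold increase in sound intensity. The short sketch below illustrates the standard conversion, using the conventional reference intensity of 10^-12 W/m^2 from acoustics (an assumption for illustration, not a figure given in this chapter).

import math

REFERENCE_INTENSITY = 1e-12  # W/m^2, conventional threshold of hearing

def decibels(intensity_w_per_m2):
    """Sound level in dB relative to the threshold of hearing."""
    return 10.0 * math.log10(intensity_w_per_m2 / REFERENCE_INTENSITY)

def intensity_ratio(db_a, db_b):
    """How many times more intense a dB-A sound is than a dB-B sound."""
    return 10.0 ** ((db_a - db_b) / 10.0)

print(decibels(1e-12))           # 0 dB: the lowest audible sound
print(decibels(1e-6))            # 60 dB: close to normal conversation
print(intensity_ratio(105, 85))  # a lawnmower is ~100x the intensity of city traffic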

The image demonstrates the energy requirements for increasing sound intensities.

© Studio BKK/Shutterstock.com

 

Audition (Hearing): From the World to the Ear

What we normally just refer to as "ears," the pinnae, collect sound waves from the environment.

After collecting these sound waves, the outer ear funnels them down through the ear canal right up to the eardrum.

As sound waves come into contact with the eardrum, it vibrates at the same frequency as the sound waves in the ear canal.

Sound waves are then transferred from the eardrum to three tiny bones that constitute the middle ear: the malleus (hammer), incus (anvil), and stapes (stirrup).

 

Together, the function of these tiny bones is to amplify the sound waves coming into contact with the eardrum, and to send the amplified sound waves to the inner ear for processing.

 

The middle ear sends sound waves to the inner ear through a tiny structure similar to the eardrum. This structure is called the oval window and is the “front gate” to the cochlea.

Pinna:

The outer funnel-shaped structure of the ear; normally, this is what people refer to as their ear.

Eardrum:

The thin membrane at the end of the ear canal that vibrates at the frequency of the sound waves striking it.

© Tartila/Shutterstock.com

Audition (Hearing): From the World to the Ear

The cochlea is the location where transduction of sound waves finally begins to occur.

 

As amplified sound waves travel through the fluid in the cochlea, a structure called the basilar membrane begins to ripple as well. Attached to the basilar membrane are the sensory receptors, hair cells, for sound.

 

As the basilar membrane bends due to sound waves, so too do the hair cells. As these hair cells bend with the sound waves, the physical energy is transduced into neural impulses sent from the auditory nerve to the appropriate parts of the brain for higher-order processing.

 

The stimuli (sound waves) are finally converted to information the brain can use through the process of transduction in the cochlea.

Sitting above the inner ear is the vestibular system which is responsible for equilibrium or balance. This sense of spatial orientation is essential for the coordination of motor responses, eye movements, and posture.

Hair Cells:

Thin, hair-like structures that are the sensory receptors for audition; these are located on the basilar membrane inside the cochlea.

Cochlea:

The spiral structure in the inner ear that contains both fluid and the basilar membrane; the latter houses sensory receptors for audition.

Senses: Olfaction (Smelling)

For olfaction, the stimulus comes in the form of chemical molecules in the air, which are released by the substance we are smelling.

 

These chemical molecules enter the nostrils and stimulate the olfactory receptors that are located at the top of the nasal cavity. Once these olfactory receptors are stimulated, the neural signals are sent through the porous part of the skull at the top of the nasal cavity and on to the olfactory bulb.

 

The olfactory bulb, which resides inside the skull, sends messages to other parts of the brain for higher-order processing and perception of the specific odor.

 

The olfactory receptor cells are actual neurons. They undergo continuous neurogenesis (replacement) roughly every 25 to 30 days; olfactory receptor neurons are among the very few neurons known to be regenerated throughout life.

© medicalstocks/Shutterstock.com

© Blamb/Shutterstock.com

Senses: Gustation (Taste)

The stimuli for taste are various chemicals contained in food we consume. Saliva in the mouth helps break down food and releases these chemicals, which are then free to be processed by the sensory receptors for gustation. The sensory receptors for gustation are located on your taste buds.

Taste Buds:

The sensory receptors for gustation that are located deep within porous structures on the tongue; these receptors respond to the five basic types of taste.

The human tongue contains many thousands of bumps and grooves. Inside these grooves are taste buds, and each taste bud contains several gustation receptors.

 

The gustation receptors are slightly specialized for certain types of taste; each receptor is most sensitive to one particular taste and less sensitive to the other types of taste.

There are five basic types of taste: sweet, salty, sour, bitter, and umami.

One interesting aspect of our sensation for taste is that it can also illustrate the occasional mismatch between the present day and the environments our senses were designed to navigate.

© Peter Hermes Furian/Shutterstock.com

Senses: Touch and Pain

Most of our sense of touch is located in our skin. The skin is the largest organ of the body and typically covers around 3,000 square inches. Also, receptors for this sensory system are the only non-localized sensory receptors.

These are the major functions of the skin:

1. Maintenance of body temperature: In response to an increase in ambient temperature or strenuous exercise, the production of perspiration by sweat glands helps lower core body temperature and skin temperature (see next chapter).

2. Protection: The skin covers the body and provides a formidable physical barrier that protects underlying tissues from physical abrasion, bacteria, dehydration, and ultra-violet radiation. An individual can actually survive for some time without skin as long as body temperature and infections are contained.

3. Excretion: The skin is able to expel dangerous waste material through perspiration (i.e., excretion of salts, organic materials, and some drugs). By the way, perspiration doesn’t necessarily smell bad; it is the bacteria that quickly reproduce on your skin that emit the unpleasant odor.

4. Synthesis of vitamin D: Upon exposure of the skin to ultra-violet light, the skin is able to produce vitamin D (chemical name is 1,25 dihydroxycalciferol), which acts as a hormone.

5. Immunity: Certain cells of the epidermis play a role in increasing the immune response.

6. Blood reservoir: The skin and underlying vascular supply provide a substantial supply of blood that can quickly be shifted to muscles in times of increased activity.

7. Detection of Stimuli: The skin has a number of receptors that provide information about touch, pain, temperature, and deep pressure. This aspect of the skin is the focus of this section.

Senses: Touch and Pain

There are two types of sensory experiences that make up somesthesis, which is what we collectively refer to as our bodily sensations: kinesthetic sensitivity and cutaneous sensitivity.

Kinesthetic Sensitivity:

Kinesthetic sense refers to knowledge about spatial position and movement information occurring from mechanical stimulation of mobile joints, muscles, and tendons.

Cutaneous Sensitivity:

Cutaneous senses (skin senses) refer to touch, pressure, temperature, and pain (nociception).

Pacinian corpuscles, which are located below the surface of the skin, are one type of sensory receptor for touch. These sensory receptors respond to pressure applied to the surface of the skin.

© logika600/Shutterstock.com

© Designua/Shutterstock.com

Senses: Touch and Pain

We now know pain is an extremely variable and difficult system to both explore and make definitive statements about. Pain is a complex perceptual phenomenon influenced by a number of physiological, cognitive, and emotional factors.

 

Our awareness of pain is said to be due to specialized receptors called nociceptors. Pain information is mediated by several classes of these specialized receptors:

Thermal or mechanical nociceptors are associated with sensations of sharp, stinging pain and tend to be well localized.

Polymodal nociceptors are activated by a variety of high-intensity mechanical, chemical, and very hot or very cold stimuli.

The diagram demonstrates that the brain has dedicated greater cortical space to areas of the body that have greater density of sensory receptors.

Receptors are denser in areas that require better touch discrimination, such as the fingertips, and are less dense in areas where tactile information is less pertinent to overall function.

The touch area of the brain (i.e., primary somatosensory cortex) is somatotopically organized. Each area of skin has a receptive field (the area from which a stimulus can activate a sensory receptor).

© Vasilisa Tsoy/Shutterstock.com

Perception: The Whole is Greater than the Sum of its Parts

Perception allows us to be selective and even occasionally ignore unimportant information in an effort to efficiently gather meaningful and useful information.

The process of visual perception can be simplified into three basic tasks:

 

1. Detection

2. Discrimination

3. Identification

 

The three tasks grow more complex and require more information as one moves from simply detecting to identifying. Identification requires previous learning and may require the processing of much information.

The Gestalt psychologists argued that the brain creates three-dimensional images by organizing sensations into stable patterns, or perceptual constancies.

 

Simply stated, the brain makes certain assumptions about what is to be seen in the world, and these expectations seem to be derived in part from experience and also from innate neuronal wiring.

 

The degree to which visual perception is transformational and therefore creative has only recently been appreciated within the scientific world. The view that perception is not simply a reduction of complex forms but a holistic, creative process was first introduced by a school of thought termed Gestalt psychology.

© Ye Liew/Shutterstock.com

 

Perception: Top-Down Processing vs. Bottom-Up Processing

Top-down processing is an information-gathering process starting from an individual’s knowledge, expectations, and prior experiences.

 

Bottom-up processing is an information-gathering process starting from each individual stimulus.

The Ebbinghaus illusion can help demonstrate top-down and bottom-up processing.

 

To your naked eye, which of the two inner circles looks bigger?

 

If the inner circle on your right seems bigger to you, you are relying on top-down processing. The two inner circles, which are exactly the same size, are surrounded by different context information. The inner circle on the right is surrounded by much smaller circles and thus appears much larger than the inner circle on the left, which is surrounded by much larger circles. If you only use bottom-up processing, you will not experience this optical illusion.

If we rely only on bottom-up processing, we will not be able to see the whole picture, since we are focused only on each individual stimulus.

 

Most of us use top-down processing as well.

Importance of Attention in the Perceptual Process

In a way, object perception is a goal-driven process. There is a plethora of information around us and it is neither practical nor meaningful to process all of the sensory input. That is why we need a selective filter known as attention.

 

Attention, or concentrated mental effort, is crucial in the beginning of perception.

Attention:

A concentrated mental effort that functions as a filter to ignore unimportant events and focus on important events.

The unpredictability of an event or diverted attention will result in failure of accurate scene detection (as if we are blind to that event) for a short time. This phenomenon is known as inattentional blindness.

 

Unlike the sensory process, the perceptual process is shaped by the context surrounding an individual stimulus as well as by your personal expectations and experience.

Inattentional Blindness:

Diverted attention resulting in failure of accurate scene detection as if we are blind to that event.

Perceptual Procedure at the Brain Level: Visual Cortex

The information processed through the rods and cones is eventually projected to the back of the brain, known as the occipital lobe, which is where visual information is first processed in the cortex.

 

The visual cortex can be subdivided into five different areas depending on their primary functions.

Occipital Lobe:

The part of our brain responsible for processing the visual information.

© okili77/Shutterstock.com

Perceptual Procedure at the Brain Level: Visual Cortex

The primary visual cortex also processes retinal images in a very specialized way through M and P pathways.

 

The magnocellular pathway (M pathway) receives information from M ganglion cells about peripheral vision and therefore low spatial resolution images from the retina.

 

The parvocellular pathway (P pathway) receives information from P ganglion cells about central vision and therefore high spatial resolution images. These separate pathways serve as the anatomical basis for more localized visual information processing even at an earlier stage.

Magnocellular Pathway:

A visual pathway for peripheral vision and low spatial resolution images from the retina.

Parvocellular Pathway:

A visual pathway for central vision and high spatial resolution images.

Retinal images from the left visual field are transmitted to the right hemisphere and images from the right visual field are transmitted to the left hemisphere, which is known as the contralaterality of visual processing.

© Alila Medical Media/Shutterstock.com

 

Perceptual Procedure at the Brain Level: Visual Cortex

There are two main streams of processing that start from the occipital lobe:

 

The pathway going into the temporal lobe is known as the “what” pathway and responds to and integrates information about the size, color, and/or the identity of the object.

 

The other pathway, which goes into the parietal lobe, is called the “where” pathway.

 

Visual cortex areas are localized based on their primary responsibilities in the earlier stage of processing.

© Matthew Cole/Shutterstock.com

Perceptual Aspects of Vision: Visual Illusion

Perception of visual information involves your conscious awareness. Nonetheless, we are not aware of the neuronal connections continuously being made in our brain during the process.

 

This is a highly efficient and organized process that helps us assign meaning to the scenes around us and allows us to make appropriate judgments based on the sources of the information.

 

Perception attempts to find useful ways to make the most of our surroundings, sometimes even at the cost of misrepresentation of the visual stimulus. This is why we experience optical illusions.

In the above illusion, though the sensory process involves objective processing of the visual information, our brain makes an assumption about the context and changes the overall perception of the shades or the colors.

In other words, the information you receive from your eyes about left and right gray blocks is exactly the same. However, the context information around the stimuli will make your brain “think” and come to a conclusion that these two blocks do not have the same luminance.

Gestalt psychologists proposed that people tend to follow a simple rule to organize objects.

 

Our brain will find the most efficient way to interpret the visual input and may trick us into perceiving a phantom stimulus that is not actually there.

© diskoVisnja/Shutterstock.com

Perceptual Aspects of Vision: Depth Perception

We use diverse depth cues around us to access the depth information.

 

We make assumptions that an object blocking something is closer to us than the object being blocked and extract depth information from those cues. This concept is known as occlusion.

Occlusion:

A phenomenon in which an object closer to a viewer appears to block another object that is farther away from the viewer.

Though we can receive plenty of depth information by using one eye (monocular cues), we get more sophisticated information by using both of our eyes (binocular cues).

 

Because your left and right eyes cover slightly different visual fields, the images from the left and right eyes are slightly different (binocular disparity) and provide the information for depth perception.
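As a loose illustration of how binocular disparity can carry depth information, the sketch below uses the standard pinhole-camera stereo approximation (depth = baseline x focal length / disparity). The eye-separation and "focal length" values are illustrative assumptions, not physiological measurements from this chapter.

def depth_from_disparity(disparity_m, baseline_m=0.065, focal_length_m=0.017):
    """Estimate the distance to an object from the shift (disparity) between
    the left-eye and right-eye images, using the pinhole-stereo formula."""
    if disparity_m <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return baseline_m * focal_length_m / disparity_m

# A larger disparity between the two retinal images implies a closer object.
for disparity in (0.004, 0.001, 0.0002):
    print(f"disparity {disparity * 1000:.1f} mm -> depth of about "
          f"{depth_from_disparity(disparity):.2f} m")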

© hipproductions/Shutterstock.com

Auditory Cortex

The transduction process in the auditory sensory receptors (e.g., hair cells) codes and transmits the basic sound inputs to the brain.

 

The auditory cortex then assembles these neural signals into meaningful sounds.

The auditory cortex has areas specializing in speech and language input, such as voices. Speech sounds are given extensive attention in our auditory cortex, particularly in left-hemisphere areas such as Broca's and Wernicke's areas.

Perception of sound information involves higher-order acoustic information processing. Once the basic auditory input (sound wave) is registered by the hair cells, it is transmitted to the primary auditory cortex, which is located in the temporal lobe.

© Alila Medical Media/Shutterstock.com

Auditory Cortex: Perceptual Aspects of Audition

Early psychophysicists argued that there is a difference between the objective intensity of physical stimulus and people’s subjective experience of this stimulus.

 

Because our experience of a stimulus is a logarithmic function of its physical intensity, a greater change in sound intensity is needed at the higher end of the intensity range for us to notice a difference (to produce a JND).

 

This notion is captured in the equal loudness contours, which show the function of loudness and frequencies.

Equal Loudness Contours:

Lines measuring the function of loudness and frequencies of sound waves.

The auditory system is able to distinguish the location/direction—analogous to depth in vision—of a sound source as well as the relative distance of sound-emitting stimuli. This is accomplished by the use of monaural and binaural cues.

 

If two sounds are presented simultaneously, the louder one is perceived to be closer. Also, we can determine that an ambulance is approaching based on the intensity and the pitch of the siren (particularly if the observer is stationary). The change in pitch emitted by an object moving horizontally in space, in relation to a stationary observer, is termed the Doppler shift.
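As an illustrative aside, the classic Doppler formula for a source moving toward or away from a stationary observer is sketched below. The siren frequency, vehicle speed, and speed of sound are assumed example values, not figures from this chapter.

SPEED_OF_SOUND = 343.0  # m/s in air (approximate)

def observed_frequency(source_hz, source_speed_m_s, approaching=True):
    """Doppler shift for a moving source and a stationary observer:
    f_observed = f_source * v_sound / (v_sound -/+ v_source)."""
    if approaching:
        denominator = SPEED_OF_SOUND - source_speed_m_s
    else:
        denominator = SPEED_OF_SOUND + source_speed_m_s
    return source_hz * SPEED_OF_SOUND / denominator

siren = 700.0  # Hz, assumed siren pitch
speed = 25.0   # m/s, roughly 90 km/h
print(round(observed_frequency(siren, speed, approaching=True)))   # ~755 Hz: pitch rises
print(round(observed_frequency(siren, speed, approaching=False)))  # ~652 Hz: pitch falls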

Auditory Cortex: Perceptual Aspects of Audition

Monaural Cues:

Auditory depth perception that occurs with just one ear.

Binaural Cues:

Auditory depth perception that occurs with the use of both ears.

Doppler Shift:

The change in pitch emitted by an object moving horizontally in space, in relation to a stationary observer.

© Vecton/Shutterstock.com

The auditory system utilizes the physical/quantitative differences in stimulation that occur between the two ears.

 

One of the important methods used to locate auditory stimuli is by means of interaural time differences.

This means that if the left ear receives a sound first, the right auditory cortex becomes more active, while activity in the left auditory cortex is simultaneously reduced.

 

A binaural process by which the brain can determine the location of a sound in space is the interaural intensity difference. A sound not only strikes the nearer ear first but also delivers a slightly more intense sound to that ear.

 

Another source of information the brain uses to localize sound is the phase difference between the sounds reaching the two ears. Since these sounds are diffracted (bent) around the head, they reach each ear at slightly different phases.

 

The dual or two-process theory of sound localization suggests that we localize low-frequency sounds by using time or phase differences, or both, at the two ears; and that we localize high frequency sounds by using the intensity differences caused by the sound shadow produced by the head and differences in their distance from the sound source.
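To give a rough sense of the scale of interaural time differences, the sketch below uses a simplified geometric model (ITD is approximately ear separation x sin(angle) / speed of sound). The head-width and speed-of-sound values are standard approximations assumed for illustration, not measurements from this chapter.

import math

SPEED_OF_SOUND = 343.0   # m/s in air (approximate)
EAR_SEPARATION = 0.22    # m, rough distance between the two ears (assumed)

def interaural_time_difference(angle_degrees):
    """Approximate ITD in seconds for a sound source at the given angle
    from straight ahead (0 deg = front, 90 deg = directly to one side)."""
    return EAR_SEPARATION * math.sin(math.radians(angle_degrees)) / SPEED_OF_SOUND

for angle in (0, 30, 60, 90):
    print(f"{angle:2d} deg -> ITD of about "
          f"{interaural_time_difference(angle) * 1e6:5.0f} microseconds")
# A source straight ahead produces no time difference; a source directly to
# the side produces a difference of roughly 600-650 microseconds.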

Auditory Cortex: Perceptual Aspects of Audition

Interaural Time Differences:

The slight difference in the time at which a sound arrives at one ear compared to the other.

Interaural Intensity Difference:

The slight difference in sound volume as it reaches one ear compared to the other.

Phase Difference:

The slight difference in the point of its cycle (phase) a sound wave has reached when it arrives at one ear compared to the other.

Dual or Two-process Theory:

The idea that we localize low-frequency sounds by using time or phase differences, or both, and high-frequency sounds by using intensity differences.

Sound Shadow:

The difference in sound intensity due to head blocking/deflecting some of the sound waves.

Created by Bailee Robinson
