Psychology: Using Sensory for UX

There are many fields of psychology, and each offers insights relevant to user experience. This article focuses on the human senses and how we can use them to create better user experiences. As UX practitioners, we need to understand how individuals perceive, understand, feel, and act in a situation. Five areas of psychology are especially relevant: Perception, Motivation, Cognition, Emotion, and Behavior.

Perception is the process of becoming aware of something through the senses. We see, hear, and feel the products we use, so sensation and perception are very closely connected.

Our senses are:

  • Vision
  • Hearing
  • Touch
  • Proprioception

Designers should focus on these four. Vision, because we look at interfaces. Hearing, because we listen to audio feedback and spoken text. Touch, because we feel vibrations and the texture of surfaces when we interact with interfaces and devices. And proprioception, because we intrinsically know the movement and relative position of our own body parts. Yes, there are other senses, but scent and taste are not yet common ways of interacting with products.

Sensory Filtering:

A tremendous amount of sensory information around us gets filtered out. Of all of the information in the world around us, we pay attention to a very small select amount. As designers, we need to help people focus their attention on what's important, and we need to make it easier for them to understand the meaning of that information.

  1. Most of the sensory information around a design is ignored
  2. Users pay attention to only a small subset of it
  3. We should design to focus attention and make meaningful information easy to process.

Vision

Computer monitors create colors by mixing different ratios of red, green, and blue light, and we represent colors in code as RGB values.
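The RGB model described above can be sketched in a few lines of Python. The `rgb_to_hex` helper below is illustrative, not from any particular library; it packs three 8-bit channel intensities into the familiar CSS-style hex notation.

```python
# Each color channel is an intensity from 0 to 255 (8 bits per channel).

def rgb_to_hex(r: int, g: int, b: int) -> str:
    """Pack red, green, and blue intensities into a CSS-style hex string."""
    for channel in (r, g, b):
        if not 0 <= channel <= 255:
            raise ValueError("each channel must be in the range 0-255")
    return f"#{r:02x}{g:02x}{b:02x}"

print(rgb_to_hex(255, 0, 0))      # pure red: only the red channel is lit
print(rgb_to_hex(255, 255, 255))  # white: all three channels at full intensity
```

Mixing different ratios of these three channels is all a monitor does to produce the millions of colors we perceive.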

The human eye is sensitive to color through a similar system of color receptors, described by the trichromatic theory. Light-sensitive cells in our eyes detect red, green, and blue light, and our brain combines the different ratios of incoming light into all the colors we see in the world around us. It's estimated that humans can perceive up to 10 million different colors.

Color-deficient vision: about 8% of males and less than 1% of females see color differently.

Not everyone perceives color the same way. Roughly 8% of men and under 1% of women have some form of color-deficient vision and experience difficulty distinguishing certain colors.

  1. The most common form of color-deficient vision involves the inability to accurately distinguish between red and green.
  2. A less common form of color-deficient vision involves the inability to distinguish between yellow and blue.
  3. And a very rare form involves the inability to perceive any color at all.

Color Cues:

When designing for color-deficient vision, we need to make certain we are not relying solely on color cues, and that important colors can be differentiated.

e.g.: do not use pure red and green fill colors to convey information in a chart or a diagram. Rather than pure red, add a little yellow; rather than pure green, add a little blue.

Even minor color adjustments can make a perceptual difference for people with color-deficient vision.
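The adjustment described above can be sketched in Python. The offsets below are arbitrary example values, not a standard: nudging pure red toward yellow means raising its green channel, and nudging pure green toward blue means raising its blue channel.

```python
# Illustrative sketch: shift pure red toward yellow and pure green toward
# blue so the two hues stay distinguishable under red-green color deficiency.

PURE_RED = (255, 0, 0)
PURE_GREEN = (0, 128, 0)

def toward_yellow(rgb, amount=60):
    # Yellow = red + green, so raising the green channel shifts red toward yellow.
    r, g, b = rgb
    return (r, min(255, g + amount), b)

def toward_blue(rgb, amount=60):
    # Raising the blue channel shifts green toward a teal / blue-green.
    r, g, b = rgb
    return (r, g, min(255, b + amount))

safer_red = toward_yellow(PURE_RED)      # (255, 60, 0): an orange-red
safer_green = toward_blue(PURE_GREEN)    # (0, 128, 60): a blue-green
```

In practice you would verify the adjusted palette with a color-blindness simulator rather than trust fixed offsets.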

Also, use meaningful text labels, distinct shapes such as icons, or even patterns and textures. Multiple signals make it easier to perceive the information and understand its meaning.

We also need to be careful about using certain color combinations, because they may lead to chromostereopsis, a visual illusion of depth caused by specific adjacent colors, usually red and blue, or red and green. When these colors are side by side, the edges or boundaries between them can appear to vibrate or oscillate between foreground and background. Chromostereopsis makes it difficult to discern a clear edge, and it can also lead to visual fatigue and eye strain.

Hearing

How we Hear

Sound is transmitted as vibrations through the air that are detected by special structures in the middle and inner ear. Humans can hear sounds in the frequency range from 20 to 20,000 hertz, or vibrations per second. The lower the frequency, the lower the tone, and our ears are especially sensitive to the frequency range of human speech.
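The frequency-as-vibrations idea can be made concrete with a short sketch: digital audio is just a list of samples of a vibrating waveform. The sample rate and the 440 Hz test tone below are typical example values, well inside the 20 to 20,000 Hz range humans can hear.

```python
import math

SAMPLE_RATE = 44_100  # samples per second (CD quality)

def sine_tone(frequency_hz: float, duration_s: float = 1.0):
    """Generate samples of a pure tone: a sine wave vibrating
    frequency_hz times per second, with amplitude in [-1, 1]."""
    n = int(SAMPLE_RATE * duration_s)
    return [math.sin(2 * math.pi * frequency_hz * t / SAMPLE_RATE)
            for t in range(n)]

samples = sine_tone(440.0)  # one second of a 440 Hz tone (concert A)
```

A lower `frequency_hz` produces a wave that vibrates fewer times per second, which we hear as a lower tone.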

Sound design is becoming increasingly common and more important in digital products, because sound is an important signal about interaction, and is often a form of feedback.

Sound alone is not enough

We need to use multiple cues, such as a visual message and a vibration, so that people with hearing loss can also be alerted. As we get older, we start to lose sensitivity to high-frequency sounds, and many people live with partial hearing loss or deafness.

Additionally, noisy environments make it difficult to hear clearly and discern sounds. Video captions are important for accessibility, but they also provide an alternate way for people to access content when environments are noisy or when people simply want to remain quiet. 

Touch

Touch is a mechanical sense. It is triggered by movement or pressure on sensory cells in the skin, and it involves several different types of tactile information: temperature, pressure, vibration, and texture.

Our sense of touch provides information whenever we come into contact with an object. Sometimes it is more active, such as when we pick up our keys and manipulate them to get one key positioned to unlock a door. Other times it is more passive, such as when we sit down and momentarily feel the cushion or the seat of the chair. Haptic perception is the active exploration of objects through touch.

We use touch and haptic information to explore, identify, and make sense of the world around us. Some of the exploratory actions we perform to help us identify objects include

Motion: lateral scanning or contour following, tracing our fingers over a surface to sense texture, shape, and size.

Pressure: pressing or squeezing to sense consistency and structure.

Enclosure: wrapping our hands around an object to sense shape, size, and weight.

Proprioception

Proprioception is our internal sense of relative position and movement that we use to keep track of our own body. This is how we're able to close our eyes and quickly, and hopefully accurately, touch the tip of our nose with a finger. Our devices also have sensors that detect location, motion, rotation, and acceleration and deceleration. Proprioception is important as a design consideration because we are creating products that leverage spatial gestures.

We can shake, squeeze, rotate, and point our devices to interact with them. We use our proprioceptive senses to learn and accurately perform these spatial gestures.
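One of these spatial gestures, the shake, can be sketched from the sensor data the paragraph mentions. This is a hypothetical heuristic, not a platform API: it flags a shake when the acceleration magnitude deviates strongly from gravity several times in a window. The threshold and jolt count are illustrative, untuned values.

```python
import math

GRAVITY = 9.8      # resting acceleration magnitude, m/s^2
THRESHOLD = 15.0   # deviation from gravity treated as a "jolt" (example value)
MIN_JOLTS = 3      # jolts required in a window to call it a shake (example value)

def is_shake(readings):
    """readings: a window of (x, y, z) accelerometer samples in m/s^2."""
    jolts = 0
    for x, y, z in readings:
        magnitude = math.sqrt(x * x + y * y + z * z)
        if abs(magnitude - GRAVITY) > THRESHOLD:
            jolts += 1
    return jolts >= MIN_JOLTS

still = [(0.0, 0.0, 9.8)] * 10                     # device resting on a table
shaking = [(0.0, 0.0, 9.8), (30.0, 0.0, 9.8)] * 4  # alternating hard jolts
```

Real gesture recognizers are more sophisticated, but the principle is the same: the device's motion sensors stand in for the proprioceptive sense our bodies use.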
