Bringing Research to LIFE

HUMANS SEE, HUMANS DO
Researcher investigates how sensory information—and sensory overload—impacts the movement of our bodies

By Katie Chalmers-Brooks


Assistant Prof. Cheryl Glazebrook

As a mother of two boys—the youngest only 10 months—assistant professor Cheryl Glazebrook can’t help but wear her researcher cap when observing their interactions with toys.

Most people probably see a baby lying on a play mat, kicking his legs to make a stuffed bumblebee hanging overhead sing. But Glazebrook, recently returned from maternity leave, sees early evidence of the interplay between auditory cues and body movement.

As our world becomes busier with sounds and lights, there’s a greater need to know how technology affects us. “We have all these systems where noises come from different places. What does that do to our movements?” asks Glazebrook. “We don’t fully understand our nervous system and our technology is changing so quickly that often those changes are made and we haven’t caught up in our understanding of how our brain processes that visual and auditory information.”

A ballet dancer from a young age who spent years perfecting her own movement, Glazebrook now heads the Perceptual Motor Behaviour Lab. She and her team investigate how our nervous system uses the information we get from sight, hearing and touch to perform physical tasks. Using a 3-D motion analysis system, she focuses mostly on arm movements. Study participants wear an electronic marker on their finger, connected to a machine that records information that will later be analyzed to determine factors like trajectory, acceleration and velocity. Participants are given instructions—for example, move a wooden block or point at a target on the screen—while they are exposed to different lights or sounds.
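To give a sense of what "analyzed to determine factors like trajectory, acceleration and velocity" involves, here is a minimal sketch of deriving velocity and acceleration from sampled 3-D marker positions using finite differences. The 200 Hz sampling rate and the sample trajectory are assumptions for illustration; this is not the lab's actual software.

```python
# Illustrative sketch: estimating velocity and acceleration from a stream of
# 3-D marker positions, using central finite differences.

def finite_difference(samples, dt):
    """Central-difference derivative of a sequence of 3-D points."""
    return [
        tuple((b[i] - a[i]) / (2 * dt) for i in range(3))
        for a, b in zip(samples[:-2], samples[2:])
    ]

dt = 1 / 200  # assumed 200 Hz capture rate (5 ms between frames)

# Hypothetical marker positions (metres): a finger accelerating along x
positions = [(0.005 * t * t, 0.0, 0.0) for t in range(10)]

velocity = finite_difference(positions, dt)       # m/s, per frame
acceleration = finite_difference(velocity, dt)    # m/s^2, per frame
```

Real motion-capture pipelines also filter the raw positions before differentiating, since finite differences amplify measurement noise, but the basic idea is the same.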

One of Glazebrook’s projects, funded by the Natural Sciences and Engineering Research Council of Canada, was designed with the drivers of tractors and the pilots of helicopters and planes in mind. Collaborating with an engineering student, she aims to give the industry insight into how best to design controls so that these individuals respond correctly during high-stress situations. In emergencies, they need to react quickly by hitting a control or the brake, and mistakes can be costly. “Fifty or 100 milliseconds could be the difference between life and death or critical injury,” Glazebrook notes. “So these are really important questions.”
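Why do 50 to 100 milliseconds matter so much? A bit of back-of-the-envelope arithmetic (my illustration, not from the article, with assumed vehicle speeds) shows how far a vehicle travels during that sliver of time:

```python
# How far a vehicle moves during a given reaction-time difference.
# Speeds are assumed for illustration.

def distance_m(speed_kmh, reaction_s):
    """Metres travelled at speed_kmh during reaction_s seconds."""
    return speed_kmh / 3.6 * reaction_s

tractor = distance_m(40, 0.1)    # 40 km/h, 100 ms extra delay -> about 1.1 m
highway = distance_m(100, 0.1)   # 100 km/h, 100 ms extra delay -> about 2.8 m
```

At highway speed, a tenth of a second is nearly three metres of travel, which can easily be the margin between stopping short of an obstacle and hitting it.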

Hopefully, input into the design of these controls will result in less confusion when drivers and pilots reach for a knob or switch. “It’s making sure that the design of the tools is done in a way that people can respond quickly and accurately without getting confused, so they hit the right dial at the right time,” she adds.

Exercises in her lab mimic real-life scenarios. This could mean shining a brief, distracting light that participants see out of the corner of their eye, perhaps accompanied by an equally distracting sound, while they complete assigned tasks.

The light and sound can come from the same or a different location, and can occur before, during or after the movement is performed. Glazebrook throws another variable into the mix, black-out goggles, to figure out how our senses of sight and hearing work independently of each other.

New knowledge coming out of her lab also aims to help people who live with a pins-and-needles sensation in their arm, typically as a result of carpal tunnel syndrome, diabetes or stroke. A trained physical therapist with a PhD in motor control, Glazebrook wants to know how a disability like this changes how the body reacts during everyday situations that require a quick and precise response, like tending to a spill while cooking.

These individuals can improve their movements by performing exercises that have them practice related tasks. Glazebrook’s findings will help inform therapists how to design these exercises. The goal?

“To help them perform their movements with enough ease and speed that the movement becomes functional for them so they can use it in their everyday life,” she says.

Though her investigations are still at an early stage, Glazebrook predicts that removing vision from the equation when trying to relearn a movement is the way to go. “When we close our eyes, that normal information we get from our limb doesn’t have a chance to re-develop those normal sensory-motor connections because our visual sense is dominant.”

Her research also explores sensory motor reactions in autistic children, which could help better define some of the subgroups of the neurodevelopmental disorder and develop new or earlier interventions.

Glazebrook was recently awarded a Canada Foundation for Innovation grant for a second 3-D motion analysis system, along with an eye tracker to further investigate coordination between the eyes and hands.

For more information on this or other research at the University of Manitoba, contact Janine.Harasymchuk@ad.umanitoba.ca or 204-474-7300.