Humans not only relish the sweetness, savouriness and saltiness of foods; they are also influenced by the environment in which they eat.
Cornell University food scientists used virtual reality to show how people’s perception of real food can be altered by their surroundings, according to research published in the Journal of Food Science.
“When we eat, we perceive not just the taste and aroma of foods, we get sensory input from our surroundings – our eyes, ears, even our memories about surroundings,” says Robin Dando, author of the study.
About 50 panellists who wore virtual reality headsets as they ate were given three identical samples of blue cheese. Through custom-recorded 360-degree videos, the study participants were virtually placed in a standard sensory booth, on a pleasant park bench and in the Cornell cow barn.
The panellists, unaware that the cheese samples were identical, rated the pungency of the blue cheese significantly higher in the cow-barn setting than in the sensory booth or on the virtual park bench.
To control for the pungency results, panellists also rated the saltiness of the three samples – and researchers found there was no statistical difference among them.
“The purpose of this project was to develop an easy-to-implement and affordable method for adapting virtual reality technology for use in food sensory evaluation,” said Dando.
“Our environs are a critical part of the eating experience,” he said. “We consume foods in surroundings that can spill over into our perceptions of the food. This kind of testing offers advantages of convenience and flexibility, compared to building physical environments.”
“This research validates that virtual reality can be used, as it provides an immersive environment for testing,” said Dando. “Visually, virtual reality imparts qualities of the environment itself to the food being consumed – making this kind of testing cost-efficient.”