“A study in which healthy women were given probiotic yogurt for four weeks showed an improvement in emotional responses, as measured by brain scans”

An AI model can quantify rheology, volatility curves, friction coefficients, and ingredient interactions, scanning in minutes formulation spaces that once took teams months of iterative experimentation to explore. Within sensory science, digitalization and machine learning are now established topics of discussion among academic and industry researchers alike, with recent symposium papers considering how AI-assisted analytics and digital sensory methods are reshaping the integration of descriptive panels, consumer feedback, and instrumental measures across complex product systems (Lawlor et al., 2025).
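
To make this concrete, here is a minimal, purely illustrative sketch of that workflow: a surrogate model is trained on a set of measured formulations and then scores a dense grid of untested candidates in seconds. The variables, data, and model choice are hypothetical assumptions for demonstration, not taken from any cited study.

```python
# Hypothetical sketch: a surrogate model screening a formulation space.
# Variables, data, and model choice are illustrative assumptions.
import numpy as np
from itertools import product
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Pretend we have 200 measured formulations: surfactant %, polymer %, emollient %
X_measured = rng.uniform(0, 10, size=(200, 3))
# Toy target: a friction coefficient loosely driven by two ingredients plus noise
y_friction = 0.5 * X_measured[:, 0] - 0.2 * X_measured[:, 1] + rng.normal(0, 0.1, 200)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_measured, y_friction)

# Score a dense candidate grid that would be impractical to test at the bench
grid = np.array(list(product(np.linspace(0, 10, 30), repeat=3)))
predicted = model.predict(grid)
best = grid[np.argsort(predicted)[:5]]
print("Top candidates (lowest predicted friction):")
print(best)
```

The point is not the particular model; any regressor that maps composition to a measurable property lets the search move from weeks of bench iterations to a ranked shortlist worth testing.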


Today, platforms like Aigora are being adopted to help sensory and consumer science teams bring together vast, heterogeneous datasets, merging panel scores, consumer-reported responses, and instrumental signals to reveal patterns and accelerate R&D decisions that once required manual, siloed analysis. Solutions such as Revieve’s AI skincare and makeup advisors analyze skin features and preferences, offering personalized product recommendations and virtual try-ons that shape purchase decisions.


Across home, personal, and beauty care, AI tools are increasingly used to cluster consumer segments, predict likely preference drivers, and augment traditional sensory analyses with predictive modeling. AI can forecast performance trends, suggest promising formulation directions, and help teams reduce cycle time between concept and prototype.
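
As an illustration of the segmentation step, the sketch below clusters simulated consumer liking ratings into preference groups. The data, the number of segments, and the interpretation are assumptions made only for demonstration.

```python
# Hypothetical sketch: clustering consumers into preference segments
# from hedonic ratings of several prototypes. All data are simulated.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

# 300 consumers x 6 prototypes, 9-point hedonic ratings (simulated)
ratings = np.clip(rng.normal(6, 1.5, size=(300, 6)).round(), 1, 9)

scaled = StandardScaler().fit_transform(ratings)
segments = KMeans(n_clusters=3, n_init=10, random_state=1).fit_predict(scaled)

# Mean liking per prototype within each segment hints at preference drivers
for k in range(3):
    members = ratings[segments == k]
    print(f"Segment {k}: n={len(members)}, mean liking={members.mean(axis=0).round(1)}")
```

In practice the segments would then be profiled against demographics, usage context, and descriptive sensory data before anyone treats them as preference drivers.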

Can Algorithms Understand Experience?

But can AI tell us whether a moisturizer feels “caring” on sensitive skin?


Or whether a laundry detergent smells “like home” after a weekend of kid-sport gear?


Can an algorithm really grasp delight, contextual meaning, or emotional inference embedded in multisensory experience?


Last year, in my AChemS talk, I drew on philosopher Thomas Nagel’s classic question (originally from his 1974 essay): “What is it like to be a bat?” Nagel argued that no amount of objective data about a bat’s sonar system could tell us what it is like to experience the world as that bat. In the same way, we can model every instrumental correlate of sensory impact (friction, volatility, frequency spectra, top notes), but that does not mean we have accessed the experience itself.


The same philosophical puzzle appears when we consider AI and consumer experience.


AI can model signals: viscosity curves, volatility patterns, lather density, friction coefficients, and visual properties. These measurements are incredibly valuable for understanding product structure and predicting performance.

Studies show that instrumental measurements of rheology, texture, and tribology can correlate with tactile sensory perception and application performance of cosmetic products (Tafuro et al., 2026; Ali et al., 2022).
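
One common way to probe such relationships, sketched here with simulated data and invented variable names rather than the measurements reported in those studies, is to regress panel scores on instrumental readings, for example with partial least squares:

```python
# Hypothetical sketch: relating instrumental measures to a panel attribute.
# Variable names and data are illustrative, not from the cited studies.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

# 60 creams x 4 instrumental measures: viscosity, friction coefficient,
# spreadability work, volatile weight loss (all simulated)
X_instrumental = rng.normal(size=(60, 4))
# Panel "slipperiness" score, loosely tied to friction and viscosity (simulated)
y_panel = 5.0 - 1.2 * X_instrumental[:, 1] + 0.6 * X_instrumental[:, 0] + rng.normal(0, 0.5, 60)

pls = PLSRegression(n_components=2)
r2_scores = cross_val_score(pls, X_instrumental, y_panel, cv=5, scoring="r2")
print("Cross-validated R² of panel score from instrumental data:", r2_scores.round(2))
```

A decent cross-validated R² tells us the instruments track the panel attribute; it says nothing yet about how consumers interpret that attribute, which is exactly the gap discussed next.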


But consumers do not experience signals. They experience products.


That tension, between measurement and meaning, is where the frontier of sensory AI meets the lived consumer reality.

Consumer Products Are Multisensory Systems

Home and personal care products are inherently multisensory. A shampoo, moisturizer, or detergent is not experienced as a single variable but as a sequence of sensory interactions unfolding over time.


Consumers notice the visual appearance of a product, the sound of packaging opening, the texture spreading across the skin or fabric, the fragrance evolving during use, and the sensory feel that remains afterward. Research in sensory and consumer science consistently shows that these attributes contribute jointly to product perception, with texture, fragrance, and visual cues interacting to shape judgments of product performance and quality (Chen & Rosenthal, 2015).


Importantly, these cues do not operate independently. They interact dynamically to shape overall impressions such as clean, gentle, nourishing, or luxurious. Crossmodal research has demonstrated that sensory inputs from different modalities can influence one another—for example, visual appearance can alter perceived texture or fragrance intensity, while sound and packaging cues can shape expectations about product efficacy or quality (Spence, 2011).


Psychologically, this reflects processes such as multisensory integration and embodied cognition. The brain does not passively receive isolated sensory inputs. Instead, it integrates signals across modalities and interprets them through memory, expectation, and contextual cues to construct meaning (Barrett, 2017).


In other words, consumers do not simply detect sensory attributes; they interpret them.

Experience Is Interpreted, Not Measured

This creates a challenge for data-driven modeling. Many consumer experiences emerge not from single attributes but from the interpretation of multiple sensory cues within a specific context.


Take the idea of “gentleness,” a common claim in personal care. Gentleness rarely corresponds to a single measurable parameter. Instead, it emerges from a constellation of cues—texture, fragrance character, residue, after-feel, and even packaging signals. Research on expectation effects in product perception has shown that consumers integrate sensory signals with contextual information such as branding, packaging, and prior beliefs when forming product judgments (Deliza & MacFie, 1996).


Similarly, the familiar idea of laundry that smells “like home” reflects emotional associations, cultural learning, and autobiographical memories attached to scent. Work in olfactory psychology has demonstrated that smells are particularly powerful triggers of personal memory and emotional meaning, which can strongly shape how a fragrance is interpreted in everyday contexts (Herz, 2004).

In both cases, meaning emerges from interpretation. AI can identify correlations among the attributes that contribute to these interpretations. But the interpretation itself is a human cognitive process.

Modeling Signals vs. Understanding Meaning

This distinction highlights both the strength and the limitation of AI in sensory science. Machine learning excels at detecting patterns within large datasets. It can reveal relationships between ingredient combinations, instrumental readings, and sensory or consumer outcomes.


But human experiences often emerge from nonlinear interactions between sensory inputs, expectations, and emotional context. A thick cream may be interpreted as nourishing by one consumer and heavy by another. A strong lather may signal cleanliness to some users but harshness to others.


These differences are not simply variability in the data; they are reflections of how people construct meaning from sensory signals.

The Opportunity for R&D

None of this diminishes the value of AI in sensory science or product development. AI can dramatically accelerate insight generation, reveal hidden relationships within complex datasets, and help teams navigate formulation spaces that would otherwise be impossible to explore.


The real opportunity lies in integrating AI with the behavioral and sensory sciences that explain human interpretation. AI can help map the architecture of product experience: connecting instrumental signals, sensory attributes, and consumer responses across large datasets. Behavioral science can help explain how those signals are interpreted within the human mind.


For R&D teams in home and personal care, the most powerful future will likely combine these perspectives. Algorithms can help us see the patterns. But understanding what those patterns mean to consumers remains a distinctly human challenge.

References and notes

  • Ali, A., Skedung, L., Burleigh, S., Lavant, E., Ringstad, L., Anderson, C. D., Wahlgren, M., & Engblom, J. (2022). Relationship between sensorial and physical characteristics of topical creams: A comparative study on effects of excipients. International Journal of Pharmaceutics, 613, 121370.
  • Barrett, L. F. (2017). How emotions are made: The secret life of the brain. Pan Macmillan.
  • Chen, J., & Rosenthal, A. (Eds.). (2015). Modifying food texture: Novel ingredients and processing techniques. Woodhead Publishing.
  • Deliza, R., & MacFie, H. J. (1996). The generation of sensory expectation by external cues and its effect on sensory perception and hedonic ratings: A review. Journal of Sensory Studies, 11(2), 103-128.
  • Herz, R. S. (2004). A naturalistic analysis of autobiographical memories triggered by olfactory visual and auditory stimuli. Chemical Senses, 29(3), 217-224.
  • Lawlor, J. B., Bavay, C., van Hout, D., McEwan, J. A., Dreyfuss, L., Labbe, D., Groeneschild, C., Marcelino, A. S., Rason, J., Worch, T., & Piqueras-Fiszman, B. (2025). Opinion note: Digitalization in sensory and consumer science–Summary perspectives from presentations at the 15th Pangborn sensory science symposium. Food Quality and Preference, 124, 105372.
  • Nagel, T. (1974). What is it like to be a bat? The Philosophical Review, 83(4), 435-450. Reprinted in Theories of Mind: An Introductory Reader (2006), p. 186.
  • Spence, C. (2011). Crossmodal correspondences: A tutorial review. Attention, Perception, & Psychophysics, 73(4), 971-995.
  • Tafuro, G., Costantini, A., & Semenzato, A. (2026). Rheology, Texture Analysis and Tribology for Sensory Prediction and Sustainable Cosmetic Design. Cosmetics, 13(1), 25.