our methodology

Built on psychophysics,

Driven by thermodynamics.

We decode taste by applying first-principles methods in our system design.

Our key differentiator is first-party, high-quality sensory and behavioural data, modelled through principles grounded in real consumers' decision journeys. Unlike traditional language-based insights, this approach eliminates contextual noise and improves decision relevance.

01

Sourcing

We designed and developed our own consumer-facing data pipeline, built around 16 taste types and 8 foodie personalities. Together with our third-party qualitative research partners, we collect structured first-party sensory data at scale to train our AI Consumers. By combining gamified incentives, community engagement, and IP-driven avatars, we enable users to discover and own their foodie profile while contributing high-quality, self-reported data. Every report is delivered back to the user, and all data is collected with explicit consent and handled in a privacy-first environment to ensure both compliance and trust.

Our taxonomy

Sensory profiling is structured around four key dichotomies grounded in oral-processing mechanics and receptor pathways, forming a matrix of 16 distinct taste types for consumer segmentation.

Foodie personality is structured around three key dichotomies grounded in food-choice intent and decision models, forming a matrix of 8 distinct foodie types for consumer segmentation.

Behavioural values are structured around a 40-minute qualitative interview grounded in food memory, emotional association, and contextual decision triggers, capturing the lived experience layer that quantitative profiling cannot reach, forming an individual value map as the third dimension of consumer segmentation.
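The dichotomy-based segmentation above is combinatorial: four binary axes yield 2⁴ = 16 taste types, and three yield 2³ = 8 foodie types. A minimal sketch, with hypothetical axis labels (the actual dichotomy names are proprietary and not stated here):

```python
from itertools import product

# Hypothetical pole labels -- placeholders, not the real dichotomy names.
SENSORY_AXES = [("A", "a"), ("B", "b"), ("C", "c"), ("D", "d")]  # 4 dichotomies
FOODIE_AXES = [("X", "x"), ("Y", "y"), ("Z", "z")]               # 3 dichotomies

def type_matrix(axes):
    """Enumerate every combination of one pole per dichotomy."""
    return ["".join(combo) for combo in product(*axes)]

taste_types = type_matrix(SENSORY_AXES)   # 2**4 = 16 taste types
foodie_types = type_matrix(FOODIE_AXES)   # 2**3 = 8 foodie personalities
```

Each consumer then maps to exactly one cell in each matrix, which is what makes the segments mutually exclusive and collectively exhaustive.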

All consumer data is fully anonymised at the point of collection. No individual names, identifiers, or personal details are ever stored, shared, or traceable within the system.

Our taxonomy

The profiling measures both hedonic liking and motivational wanting, enabling a more precise mapping of true taste preference beyond surface-level enjoyment

A redesigned flavour taxonomy rooted in perception, consisting of 13 core flavour groups and 52 nuanced sub-clusters, built bottom-up from how consumers actually describe and experience taste

A “compare to people around me” framing is used to calibrate regional benchmarks, allowing users to understand their sensory identity in a localised context and enhancing cross-cohort data resolution.

02

Modeling

Grounded in psychophysics and aligned with food-industry ISO and ROI metrics, we developed proprietary Sensory Power Indices to quantify how individuals and populations perceive taste, using both a JAR peak-value approach and emotional-state mapping. Before predicting acceptance, we measure intensity and preference curves for key sensory and behavioural attributes; the process is now fully numerical, comparable, and ready for reporting.
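To make the JAR peak-value idea concrete: on a "just about right" scale, the preferred intensity of an attribute is the level where the share of JAR responses peaks. A minimal sketch with invented panel data (the real indices and scales are proprietary):

```python
from collections import Counter

# JAR scale: 1 = far too weak, 3 = just about right, 5 = far too strong.
# Hypothetical panel responses per sweetness level (illustrative only).
responses = {
    6.0:  [2, 2, 3, 3, 2, 1, 3],
    8.0:  [3, 3, 3, 4, 3, 2, 3],
    10.0: [4, 3, 5, 4, 4, 3, 5],
}

def jar_share(scores):
    """Fraction of panellists rating the attribute 'just about right' (3)."""
    return Counter(scores)[3] / len(scores)

# Peak-value approach: the intensity level maximising the JAR share.
peak_level = max(responses, key=lambda lvl: jar_share(responses[lvl]))  # 8.0 here
```

The same shape of computation extends to preference curves: sweep the attribute, score each level, and read off the peak rather than the mean.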

Our design principles

Each region or cohort is profiled with more than 1,000 consumer profiles to ensure sufficient statistical depth and fine-grained resolution

A clear algorithmic matrix separates bio-mechanical thresholds (e.g. receptor-level sensitivity) from cognitive perception data (e.g. preference, memory, cultural association)

Unique biophysical parameters are transformed into mathematical Sensory Power Indices, allowing cross-cohort sensory perception to be compared within the same analytical space

Model outputs are supported and validated through focus groups in controlled environments


03

Insights

Insights distill high-quality sensory data and external context into decision-ready guidance. At a principle level we map cohort and regional sensory and behavioural identity, quantify acceptance, isolate drivers and barriers, measure momentum, and translate evidence into clear actions for localisation, reformulation, claims, and portfolio moves. Reporting is customisable, privacy-first, comparable across markets, and built to support GTM and brand teams with confidence.

Example insights

Assess market-entry risk for a current formulation in Bangkok versus Tokyo with acceptance forecasts and localisation ranges

Select the winning flavour route for a sparkling citrus RTD with JAR and intensity targets by cohort

Test claim and naming options for a reduced-sugar yogurt among UK families with predicted lift

Benchmark a chili sauce against category leaders in Mexico City to set spiciness thresholds and messaging guidance

04

Intelligence

AI Consumers in the TasteNET simulator are trained on proprietary first-party sensory and behavioural data, grounded in real consumer psychology and live market context.

Only by simulating at the individual level (bottom-up), preserving each consumer's full-dimensional taste preferences and state-dependent responses, can brands move beyond averaged metrics and reach evidence-backed decisions on what to launch, how to position it, and who will actually buy it.


Our AI capabilities

Interprets user intent with domain-specific awareness, combining product category, regional context, and sensory variables to surface relevant responses

Synthesizes structured (first-party data) and unstructured (secondary reports, public datasets) sources to generate solid, evidence-backed recommendations

Operates on high-dimensional, pre-modelled sensory baselines to return results at the resolution of region, cohort, or individual, enabling detailed comparative queries

Accepts proprietary uploads (e.g. product briefs, formulation notes) and processes them securely, with no cross-client data sharing or model retraining exposure, ensuring full confidentiality


How do we know it works?

TasteNET is validated against 20 industry-standard benchmarks across Exposure, Purchase, Experience, and Advocacy, each grounded in peer-reviewed methodology.

Dual-axis benchmarking

Every simulation is scored on completeness (the breadth of decision factors captured) and accuracy (the fidelity of outputs against ground-truth consumer behaviour). No synthetic data. No averaged personas.
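The dual-axis idea reduces to two ratios. A minimal sketch, assuming the benchmark compares captured decision factors against a required set and simulated choices against observed ones (the function names and data are illustrative, not the published benchmark definitions):

```python
def completeness(captured: set, required: set) -> float:
    """Breadth: share of required decision factors the simulation captured."""
    return len(captured & required) / len(required)

def accuracy(predicted: list, observed: list) -> float:
    """Fidelity: share of simulated choices matching ground-truth behaviour."""
    hits = sum(p == o for p, o in zip(predicted, observed))
    return hits / len(observed)

score = (
    completeness({"price", "taste"}, {"price", "taste", "pack"}),  # 2/3
    accuracy([1, 0, 1, 1], [1, 0, 0, 1]),                          # 0.75
)
```

A simulation can score high on one axis and low on the other, which is why the two are reported separately rather than averaged.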

Free from self-report bias

AI Consumers do not modify responses to appear more health-conscious or less price-sensitive. For barrier identification and purchase hesitation, this is a structural advantage over traditional survey methods.


From insights to intelligence

Measurable taste. Predictive acceptance. Launch with confidence.


9 Battery Rd,
Singapore
049910

This website and its contents are provided for informational purposes related to scientific, technological, and commercial applications of sensory intelligence. All materials, including product concepts, visual assets, and trademarks such as TasteNET™, are the intellectual property of Digitaste.

Any unauthorised use is prohibited. Digitaste is committed to protecting user privacy and managing first-party sensory data with transparency and care. For more information, please review our Privacy Policy and Terms of Use.

© 2025 Digitaste Pte. Ltd. All rights reserved.
