Features
NextGen Sensory Intelligence
Our work brings focus to the future of science and technology.


Our History
We’ve grown through partnership, experimentation, and a belief in long-term thinking.
Scion founded in Vancouver
First advisory partnership established
Expanded into systems design consulting
Built our first biotech platform
Cross-disciplinary team fully assembled
AI-driven tools launched for research
Global clients across four continents
01
Primary Sensory Rules (Identity)
The Primary Sensory Baseline is the reference spine of TasteNET. It consolidates more than 1,000 first-party sensory fingerprints per cohort, together with live market signals (e.g. eCommerce, menus, social media), into one stable “sensory rule” for each cohort across aroma, taste, texture and emotion. This gives teams a trusted sensory reference point at the start of every project, so concept screening, recipe tweaks and localisation all align with how that cohort actually experiences flavour, instead of relying on ad-hoc panels or guesswork.
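To make the idea concrete, here is a minimal sketch of how many per-cohort fingerprints could collapse into one stable rule per sensory axis. The class, axis names and median aggregation are assumptions for illustration, not TasteNET's actual model.

```python
# Hypothetical sketch: collapsing first-party fingerprints into a cohort baseline.
# SensoryFingerprint, AXES and the median aggregation are illustrative assumptions.
from dataclasses import dataclass
from statistics import median

AXES = ["aroma", "taste", "texture", "emotion"]  # assumed sensory axes

@dataclass
class SensoryFingerprint:
    scores: dict[str, float]  # normalised 0..1 score per axis

def cohort_baseline(fingerprints: list[SensoryFingerprint]) -> dict[str, float]:
    # A per-axis median is one robust way to derive a stable "sensory rule";
    # how TasteNET actually blends fingerprints with market signals is not public.
    return {axis: median(fp.scores[axis] for fp in fingerprints) for axis in AXES}

cohort = [
    SensoryFingerprint({"aroma": 0.62, "taste": 0.71, "texture": 0.40, "emotion": 0.55}),
    SensoryFingerprint({"aroma": 0.58, "taste": 0.69, "texture": 0.44, "emotion": 0.60}),
    SensoryFingerprint({"aroma": 0.65, "taste": 0.75, "texture": 0.38, "emotion": 0.52}),
]
print(cohort_baseline(cohort))  # {'aroma': 0.62, 'taste': 0.71, 'texture': 0.4, 'emotion': 0.55}
```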
Use cases
Many brands know their target consumer on paper but have never lived in their sensory context. Primary Sensory Baselines turn that paper profile into a concrete sensory identity map, so teams see the starting point before pushing a product more sour, sweet, spicy or rich.
When reusing a “Star” formulation across markets, baselines make cross-market fit measurable. They show where cohorts’ taste DNA overlaps or diverges, so teams know when a global hero can travel and when a local variant is needed.
When a launch underperforms, baselines help interpret panel and claim results. Teams can tell whether consumers truly dislike the product, or whether the issue lies in expectation, positioning or usage context, and then adjust the right lever first.
For portfolio planning, baselines expose the stable preferences beneath short-term trends. This lets roadmaps balance chasing what is hot now with building ranges that stay aligned to each cohort’s core sensory identity over time.
02
Dynamic Sensory Trends
Dynamic Sensory Trends track how a cohort’s taste and emotion move over time on top of its stable sensory baseline. These trend signals are captured by combining first-party sensory fingerprints with external market data such as sales performance, new product launches, menus, reviews and social media conversations. All movement is normalised against the Primary Sensory Baseline, so teams can separate real, cohort-specific shifts from short-lived noise and see which sensory directions are genuinely gaining or losing traction.
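A minimal sketch of the normalisation idea follows, assuming a simple z-score of recent movement against the baseline; the 4-week window and threshold are illustrative choices, not the product's actual method.

```python
# Hypothetical sketch: separating genuine cohort shifts from short-lived noise
# by normalising weekly readings against the stable baseline. The z-score,
# 4-week window and threshold are illustrative assumptions only.
from statistics import mean, stdev

def trend_signal(weekly_scores: list[float], baseline: float, z_threshold: float = 1.5) -> str:
    deviations = [s - baseline for s in weekly_scores]
    noise = stdev(deviations) or 1e-9   # historical volatility of this sensory cue
    z = mean(deviations[-4:]) / noise   # recent movement, measured in noise units
    if z > z_threshold:
        return "gaining traction"
    if z < -z_threshold:
        return "losing traction"
    return "within normal noise"

# A citrus-herbal cue drifting upward against a 0.50 baseline:
print(trend_signal([0.50, 0.49, 0.51, 0.50, 0.58, 0.62, 0.66, 0.70], baseline=0.50))
# -> gaining traction
```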
Use cases
Teams planning seasonal or limited-time offers can see which flavour spaces are accelerating within a cohort (e.g. citrus–herbal refreshment or low-sugar comfort), then design variants that ride this momentum instead of guessing from generic trend reports.
When an existing product slows down, Dynamic Sensory Trends help diagnose whether consumers are tiring of its flavour, its emotion fit, or its on-pack promises by comparing current signals against both baseline and launch-period momentum.
Innovation and marketing teams can monitor how health, sustainability or indulgence narratives intersect with specific flavour and texture cues, and then tune product claims, pack design and comms to match what the cohort is actually seeking this quarter.
For customer and retailer sell-in, brands can use these trends to show that a new concept is not just on-trend in abstract, but aligns with a measurable upswing in concrete sensory directions for the exact target cohort and channel.
03
Sensory Benchmarks
Sensory Benchmarks in TasteNET turn abstract flavour differences into clear visual comparisons. For each cohort, product and sensory attribute, the system calculates normalised scores and plots them in dedicated charts: aroma wheels for detailed aroma families, single-axis benchmarks for traits such as acidity or sweetness, and 3D “comfort bubbles” for expected texture ranges. These visuals provide a shared reference so R&D, insights and marketing can quickly see how far a product sits from a cohort baseline and how it compares across markets or concepts.
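As an illustration of what a normalised single-axis benchmark could look like, here is a sketch that rescales a raw trait measurement onto a cohort's comfort range. The function, ranges and values are assumptions, not TasteNET's actual scoring.

```python
# Hypothetical sketch: a normalised single-axis benchmark for a trait such as
# acidity. The cohort range and linear rescaling are illustrative assumptions.
def benchmark(product_value: float, cohort_min: float, cohort_max: float) -> float:
    # 0.0 and 1.0 mark the edges of the cohort's comfortable range;
    # values outside 0..1 flag a product sitting outside that range.
    return (product_value - cohort_min) / (cohort_max - cohort_min)

# An acidity reading of 4.1 against a cohort comfortable between 3.0 and 5.0:
print(f"acidity benchmark: {benchmark(4.1, 3.0, 5.0):.2f}")  # acidity benchmark: 0.55
```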
04
Sensory Knowledge Graph

Geography, culture and occasion shape how people experience flavour. The Sensory Knowledge Graph connects cohorts, aromas, tastes, textures and emotions as nodes, with edges such as liking, exposure, comfort, expressed and co-occurs, built from millions of real-time signals. The interface lets teams filter by node type, zoom into a specific region or flavour cue, and inspect each connection to see how strong it is and in which contexts it appears, such as time of day, channel, occasion or preparation method. Instead of static tables, the graph becomes an interactive map of “who likes what, when and why”, turning hidden sensory structure into patterns that can be explored in a few clicks.
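A minimal sketch of the data shape this describes, assuming typed, context-annotated edges; the edge types come from the text above, while the node names and structure are illustrative assumptions.

```python
# Hypothetical sketch: typed, context-annotated edges of the kind described
# above. The edge types come from the text (liking, exposure, comfort,
# expressed, co-occurs); node names and the data shape are assumptions.
from dataclasses import dataclass, field

@dataclass
class Edge:
    source: str                     # e.g. a cohort node
    target: str                     # e.g. an aroma, taste, texture or emotion node
    kind: str                       # liking | exposure | comfort | expressed | co-occurs
    strength: float                 # normalised 0..1
    contexts: dict[str, str] = field(default_factory=dict)

graph = [
    Edge("cohort:urban-25-34", "aroma:yuzu", "liking", 0.82,
         {"occasion": "after-work", "channel": "convenience"}),
    Edge("aroma:yuzu", "emotion:refreshed", "co-occurs", 0.74,
         {"time_of_day": "evening"}),
]

# Filtering by context, as the interface description suggests:
evening = [e for e in graph if e.contexts.get("time_of_day") == "evening"]
print(f"{len(evening)} evening edge(s), strongest: {max(evening, key=lambda e: e.strength).target}")
```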
Use cases
A regional insights team can quickly understand why a flavour works in one market but stalls in another by comparing the strength and context of edges between each cohort node and the same aroma or texture node.
An innovation team can start from a target occasion, such as “summer evening low-alcohol refreshment”, and traverse connected emotion, aroma and texture nodes to discover candidate flavour territories that are both on-trend and coherent with local sensory identity.
A brand or creative team can click into a specific flavour cue, like “minty & cooling leafy herbs”, and see which emotions, times of day and channels it most strongly evokes, using those patterns to brief claims, visuals and activation moments.
A portfolio or strategy team can scan the graph for strong co-occurrence and exposure edges where current product coverage is weak, surfacing white-space opportunities where consumers already show sensory interest but the brand has no offer yet.
05
AI Agent-Led Queries
TasteNET runs a single AI agent across the Sensory Map and Sensory Knowledge Graph, so every question draws on the same consumer cohorts, flavour and texture indices, emotional data and market signals. In one conversation, teams can move from understanding consumers and their behaviour, to reading the market landscape, shaping GTM stories, thinking through price/pack and channel roles, and reviewing post-launch demand. Users ask in plain language, and the agent returns grounded, structured answers that link back to the relevant charts, cohorts and metrics inside TasteNET. The agent:
Interprets user intent with domain-specific awareness, combining product category, regional context and sensory variables to surface relevant responses.
Synthesizes structured sources (first-party taste data, benchmarks) and unstructured sources (secondary reports, public datasets) to generate solid, evidence-supported recommendations.
Operates on high-dimensional, pre-modelled sensory baselines to return results at the resolution of region, cohort or individual, enabling detailed comparative queries.
Accepts proprietary uploads (e.g. product briefs, formulation notes) and processes them securely, with no cross-client data sharing or model-retraining exposure, ensuring full confidentiality.
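For illustration, here is one possible shape for a grounded, structured answer that links back to charts, cohorts and metrics. The schema, field names and values are assumptions, not TasteNET's real response format.

```python
# Hypothetical sketch: the shape of a grounded, structured answer that links
# back to charts, cohorts and metrics. This schema is an illustrative
# assumption, not TasteNET's real response format.
from dataclasses import dataclass

@dataclass
class GroundedAnswer:
    question: str
    summary: str
    cohorts: list[str]    # cohorts the answer is grounded in
    evidence: list[str]   # pointers to charts and metrics inside the platform

answer = GroundedAnswer(
    question="Which cohort best fits a citrus-herbal, low-sugar concept?",
    summary="Urban 25-34 shows the strongest baseline fit and rising momentum.",
    cohorts=["cohort:urban-25-34"],
    evidence=["chart:aroma-wheel/citrus", "metric:trend/citrus-herbal"],
)
print(answer.summary)
```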
Use cases
When a team is shaping a new concept, they can ask the agent who the best-fit cohort is, what that group’s sensory baseline looks like, and which flavour stories feel natural for them. The response links consumer insight, occasions and comfort zones, so early decisions are based on real behaviour rather than generic personas.
During market sizing and portfolio reviews, the agent can summarise how a category is evolving across regions, which competitors or flavour routes are gaining share, and how sensory preferences differ by channel. This turns the market landscape into a sensory-aware map instead of a static slide.
As go-to-market plans take shape, the agent can propose positioning angles, emotional hooks and packaging cues that line up with both the cohort’s taste DNA and the brand’s role, while also outlining sensible price/pack ladders and channel roles based on existing benchmarks in the data. Teams get a coherent story that connects flavour, value and where the product should show up.
After launch, the agent can read back sales, repeat, sentiment and panel results against the original baseline and dynamic trends, highlighting where performance is off and why. It can point out which cohorts are under-converting, whether the issue is sweetness, texture, price, or message, and suggest the next experiments to run instead of leaving teams to guess.
06
Product-Based Intelligence
Product-Based Intelligence lets teams bring their own products directly into TasteNET and receive a full, modelled read-out for each SKU. By providing a product overview, users get predicted consumer acceptance %, a narrative summary, advantages and watch-outs, plus aroma, taste, texture and emotion matching against the selected cohort. All uploads are processed in a session-isolated environment and are not retained, shared or used for model training, so concepts and launched SKUs can be stress-tested against market reality without compromising confidentiality.
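A minimal sketch of what such a per-SKU read-out could look like as a data structure, combining predicted acceptance with per-axis matching; the field names and values are illustrative assumptions, not the product's real schema.

```python
# Hypothetical sketch: the shape of a per-SKU read-out as described above,
# combining predicted acceptance with per-axis matching against a cohort.
# Field names and values are illustrative assumptions, not the real schema.
from dataclasses import dataclass

@dataclass
class SkuReadout:
    sku: str
    cohort: str
    acceptance_pct: float          # predicted consumer acceptance, in %
    axis_match: dict[str, float]   # aroma/taste/texture/emotion fit vs cohort
    advantages: list[str]
    watch_outs: list[str]

readout = SkuReadout(
    sku="sparkling-yuzu-001",
    cohort="cohort:urban-25-34",
    acceptance_pct=68.5,
    axis_match={"aroma": 0.81, "taste": 0.77, "texture": 0.64, "emotion": 0.72},
    advantages=["aroma fit sits well above the cohort baseline"],
    watch_outs=["texture sits near the edge of the cohort's comfort range"],
)
print(f"{readout.sku}: {readout.acceptance_pct:.1f}% predicted acceptance")
```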
Use cases
Teams upload early concepts with only basic sensory intent and get an acceptance score and fit to priority cohorts, so they can shortlist which ideas deserve full prototype and panel investment.
R&D adjusts sweetness, aroma focus or texture, reruns the analysis, and sees how the acceptance %, advantages and watch-outs shift, helping them land on a formula and benefit story that are aligned.
A global product is analysed against multiple regional cohorts in turn, revealing where it over- or under-delivers on aroma, taste or texture, and guiding precise tweaks instead of guessing at “local flavour”.
When a launched SKU is not meeting targets, teams upload the current product profile and compare its sensory match and narrative with the target cohort, using the AI summary and pros/cons to separate product issues from positioning or channel problems.
Science is evolving: the world model is learning to taste. We are building the taste and smell layer of AI, linking real human sensory experience to machine understanding and helping to create a more nutritious, more delicious future of food.
Our Impact
Shaping the full sensory frontier of intelligence
We help product and insights teams with sensory-intelligent tools that de-risk launches, unlock new growth spaces, and build products consumers truly love.
100K+
Digitalized taste profiles
18M+
Sensory signals
119
Cohorts covered







