🧩

Query Decomposition

Decompose complex natural-language queries like 'cheap waterproof running shoes for wide feet in blue' into structured intents (price, feature, fit, activity, color). Powers AI shopping assistants that need to understand multi-attribute requests before searching.
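A minimal sketch of what the decomposed output might look like for the example query above. The field names and values are illustrative, not the actual LFM schema:

```python
# Hypothetical structured-intent output; field names are illustrative,
# not the actual LFM response schema.
query = "cheap waterproof running shoes for wide feet in blue"

intents = {
    "price": "cheap",        # budget constraint
    "feature": "waterproof", # product feature
    "activity": "running",   # use case
    "fit": "wide feet",      # sizing constraint
    "color": "blue",         # color preference
}
```

Each intent can then be mapped to a separate search filter instead of matching the raw string against product titles.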

Conversational commerce: Users speak in complex, multi-constraint sentences. Cloud LLMs decompose at 200-500ms; LFM at 45ms.
Agent enablement: Structured decomposition before the agent searches is faster and more accurate than full-string search.
Real-time: The user is in a chat, waiting. A 200ms cloud LLM delay feels broken; 45ms LFM is invisible.

The Problem

AI shopping agents receive queries like 'cheap waterproof running shoes for wide feet in blue.' Passing the full string to search produces poor results, and cloud LLMs take 200-500ms to decompose it.

How LFM Compares

Passing a complex query as-is produces poor results. Cloud decomposition works, but at 200-500ms. LFM parses multi-constraint queries into structured intents in 45ms.

What LFM Unlocks

Structured decomposition at 45ms: complex query → price + feature + fit + activity + color intents.
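The agent flow this enables can be sketched as: decompose first, then search with structured filters rather than the raw string. The `decompose` stub below stands in for the model call, and all names, keyword mappings, and the price threshold are illustrative assumptions, not the LFM API:

```python
# Hypothetical agent flow. `decompose` is a stand-in for an LFM call
# (~45 ms in production); the keyword table and filter shape are
# illustrative, not a real API.

def decompose(query: str) -> dict:
    # Stub decomposer: map known tokens to (intent field, value) pairs.
    keywords = {
        "cheap": ("price", "low"),
        "waterproof": ("feature", "waterproof"),
        "running": ("activity", "running"),
        "wide": ("fit", "wide"),
        "blue": ("color", "blue"),
    }
    intents = {}
    for token in query.lower().split():
        if token in keywords:
            field, value = keywords[token]
            intents[field] = value
    return intents

def build_search_filters(intents: dict) -> dict:
    # Translate intents into the filter shape a product-search API
    # might expect, instead of one full-string match.
    filters = {}
    if intents.get("price") == "low":
        filters["max_price"] = 50  # illustrative budget threshold
    for field in ("feature", "activity", "fit", "color"):
        if field in intents:
            filters[field] = intents[field]
    return filters

query = "cheap waterproof running shoes for wide feet in blue"
filters = build_search_filters(decompose(query))
```

Because each constraint becomes its own filter, the search backend can apply price, fit, and color independently rather than hoping they all appear in one product description.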

🤖
Agent Query Decomposition
Break down complex shopping queries into separate intents for AI shopping agents. This is what makes conversational commerce viable.

Enter Complex Shopping Query

This demo is fine-tuned on sample data. Results improve with your data.