Hi Webflow Community,
At Fener Interactive Studio, we’ve been integrating more LLM-based automations into our client builds recently. While Webflow’s native Logic and API are fantastic, we constantly run into the “token tax” issue when sending large CMS datasets to models like GPT-4 or Claude.
As we know, standard JSON is great for compatibility but quite “verbose” for LLMs: every object in an array repeats its field names, and the braces, quotes, and colons all add syntax overhead.
I’ve been diving deep into TOON (Token Oriented Object Notation) lately. For those who haven’t seen it, it’s a format designed specifically to minimize token usage (claiming ~30-50% reduction compared to JSON) while remaining machine-readable for AI agents.
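To make the idea concrete, here’s a minimal sketch of the conversion for the simplest case TOON targets: a flat array of uniform objects, serialized as one header line plus one comma-separated row per record. This is my own toy implementation based on my reading of the format (the example data is made up), not a production library, and it assumes primitive values with no escaping needed:

```javascript
// Toy sketch: convert a flat array of uniform objects into TOON's
// tabular-array form (header line + one row per record).
// Assumes every object has the same keys and all values are primitives.
function jsonToToon(name, items) {
  const keys = Object.keys(items[0]);
  const header = `${name}[${items.length}]{${keys.join(",")}}:`;
  const rows = items.map(
    (item) => "  " + keys.map((k) => String(item[k])).join(",")
  );
  return [header, ...rows].join("\n");
}

const users = [
  { id: 1, name: "Alice", role: "admin" },
  { id: 2, name: "Bob", role: "editor" },
];

console.log(jsonToToon("users", users));
// users[2]{id,name,role}:
//   1,Alice,admin
//   2,Bob,editor
```

Compared to `JSON.stringify(users)`, the field names `id`, `name`, and `role` appear once instead of once per record, which is where most of the token savings come from.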
I wanted to open a discussion on two points:
- Middleware Experience: Has anyone here successfully implemented a “JSON-to-TOON” conversion layer (via Make, n8n, or custom code) before sending Webflow CMS data to an AI endpoint? If so, did you notice a significant improvement in latency or cost?
- Native Support: I know it’s early days, but is there any discussion within the Devs or Community about supporting lighter, token-friendly serialization formats within Webflow Logic in the future?
We are currently testing this via custom middleware, but I’d love to hear if anyone else is exploring this path to optimize their AI-powered Webflow apps.
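For anyone who wants to experiment with such a middleware step (e.g. a code node in Make or n8n), here is a rough sketch that flattens a Webflow CMS API v2 items response into a TOON-style block before it goes into the LLM prompt. The collection fields and response data below are invented for illustration, and you should check the exact payload shape of your own collection:

```javascript
// Sketch of a middleware step: take a Webflow CMS API v2 "List Items"
// response ({ items: [{ id, fieldData: {...} }, ...] }) and emit a
// TOON-style tabular block for a selected set of fields.
// The field names and data here are illustrative, not a real collection.
function cmsItemsToToon(response, fields) {
  const header = `items[${response.items.length}]{${fields.join(",")}}:`;
  const rows = response.items.map(
    (item) => "  " + fields.map((f) => String(item.fieldData[f] ?? "")).join(",")
  );
  return [header, ...rows].join("\n");
}

// Example with a made-up blog collection:
const response = {
  items: [
    { id: "a1", fieldData: { name: "Post One", slug: "post-one" } },
    { id: "b2", fieldData: { name: "Post Two", slug: "post-two" } },
  ],
};

console.log(cmsItemsToToon(response, ["name", "slug"]));
// items[2]{name,slug}:
//   Post One,post-one
//   Post Two,post-two
```

A real implementation would also need to handle escaping (commas or newlines inside field values) and non-uniform records, which this sketch deliberately ignores.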
Best,
Berkay Çınar
Fener Interactive Studio