Cherry Studio AI Core (3): From aiCore to UI — The End-to-End Rendering Chain
5 Jan 2026
4 min read
This is part 3 of the series. Instead of going deeper into tools again, I show the complete chain from aiCore entry to UI rendering, and explain the design decisions that made the chain reliable.
1. Entry point: ModernAiProvider (and why it is the real boundary)
The chain starts at src/renderer/src/aiCore/index_new.ts. This file is the boundary between UI intent and model execution. I kept it thin on the surface, but it actually does three important things before a request hits the model:
- Provider normalization: resolve and adapt provider config once (`providerToAiSdkConfig`, `prepareSpecialProviderConfig`).
- Middleware + plugin assembly: build middlewares (AI SDK level) and plugins (app level) in one place.
- Execution path selection: use modern SDK for text, fallback to legacy for image endpoints that need advanced features (AI SDK v5 does not support image editing yet, so we keep legacy for that path).
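The path selection in the last bullet can be sketched as a single fork. This is a minimal illustration of the decision, not the actual implementation; the function and type names are assumptions:

```typescript
// Sketch of the modern-vs-legacy fork described above.
// `RequestKind` and `selectPath` are illustrative names, not the real API.
type RequestKind = 'text' | 'image-generate' | 'image-edit'

function selectPath(kind: RequestKind): 'modern' | 'legacy' {
  // AI SDK v5 does not yet support image editing, so that path stays on legacy.
  return kind === 'image-edit' ? 'legacy' : 'modern'
}
```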
The modern path looks like this:
```ts
const plugins = await buildPlugins(config)
const executor = createExecutor(this.config!.providerId, this.config!.options, plugins)
const streamResult = await executor.streamText({
  ...params,
  model,
  experimental_context: { onChunk: config.onChunk }
})
```

File: src/renderer/src/aiCore/index_new.ts
This is why I treat ModernAiProvider as the real boundary: it is the one place that knows how to build the pipeline, while everything downstream just consumes standardized events.
2. Streaming → Chunk adapter (why we still need our own chunk layer)
AI SDK stream events are useful, but they are not shaped for UI rendering. Our UI needs finer granularity and consistent semantics across providers. That is why I convert SDK events into Cherry Studio chunks.
Chunk design goals:
- Smaller, UI-first granularity (text delta, reasoning delta, tool pending, tool complete, image complete, etc.).
- Provider-agnostic contract so the UI never needs provider branching.
- Room for app-specific events (e.g., block lifecycle, metrics, reasoning states).
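The goals above point at a discriminated union that the UI can switch on exhaustively. Here is a hedged sketch of what such a contract looks like; the chunk type names and fields are illustrative assumptions, not the actual Cherry Studio definitions:

```typescript
// Hypothetical UI-first chunk contract (names are illustrative).
type Chunk =
  | { type: 'text.delta'; text: string }
  | { type: 'thinking.delta'; text: string }
  | { type: 'tool.pending'; toolId: string; name: string }
  | { type: 'tool.complete'; toolId: string; result: unknown }
  | { type: 'image.complete'; url: string }
  | { type: 'error'; error: Error }

// The UI switches on `chunk.type` only -- no provider branching anywhere.
function describe(chunk: Chunk): string {
  switch (chunk.type) {
    case 'text.delta':
      return `append text: ${chunk.text}`
    case 'thinking.delta':
      return `append reasoning: ${chunk.text}`
    case 'tool.pending':
      return `show pending tool ${chunk.name}`
    case 'tool.complete':
      return `finish tool ${chunk.toolId}`
    case 'image.complete':
      return `render image ${chunk.url}`
    case 'error':
      return `show error: ${chunk.error.message}`
  }
}
```

Because the union is closed, the compiler enforces that every chunk type has a rendering rule, which is exactly the "predictable block mutation" guarantee described later.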
AiSdkToChunkAdapter is the bridge. It takes SDK stream events and emits our chunk types. That is the point where provider-specific differences stop and the UI contract begins.
File: src/renderer/src/aiCore/chunk/AiSdkToChunkAdapter.ts
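The adapter's job can be sketched as a small translation layer: SDK stream parts go in, app-level chunks come out through one `onChunk` callback. The event shapes below are simplified assumptions for illustration, not the real AI SDK or Cherry Studio types:

```typescript
// Minimal sketch of the adapter idea: translate AI SDK stream parts into
// app-level chunks and push them to a single onChunk callback.
// All type and field names here are illustrative assumptions.
type SdkStreamPart =
  | { type: 'text-delta'; textDelta: string }
  | { type: 'reasoning'; textDelta: string }
  | { type: 'tool-call'; toolCallId: string; toolName: string }

type AppChunk =
  | { type: 'text.delta'; text: string }
  | { type: 'thinking.delta'; text: string }
  | { type: 'tool.pending'; toolId: string; name: string }

class SdkToChunkAdapter {
  constructor(private onChunk: (chunk: AppChunk) => void) {}

  // Provider-specific shapes stop here; only the app contract leaves.
  handle(part: SdkStreamPart): void {
    switch (part.type) {
      case 'text-delta':
        this.onChunk({ type: 'text.delta', text: part.textDelta })
        break
      case 'reasoning':
        this.onChunk({ type: 'thinking.delta', text: part.textDelta })
        break
      case 'tool-call':
        this.onChunk({ type: 'tool.pending', toolId: part.toolCallId, name: part.toolName })
        break
    }
  }
}
```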
3. Chunk → UI blocks (callback design philosophy)
Chunks are consumed by callbacks and turned into message blocks. I do not treat callbacks as “random handlers”; they are the contract between streaming and UI. The design principle is:
- Every chunk type maps to a predictable block mutation.
- Callbacks are specialized so each block type is owned by one callback (tool, text, thinking, error, citation, etc.).
Tool callbacks handle tool blocks and approval cleanup, while base callbacks handle completion, errors, usage, metrics, and notifications. This split keeps the UI logic composable and reduces cross-coupling between block types.
Callbacks involved: toolCallbacks, baseCallbacks
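The ownership split above can be sketched as a dispatcher that routes each chunk to exactly one specialized callback. This is a simplified illustration under assumed names, not the actual callback wiring:

```typescript
// Illustrative sketch: each chunk type is owned by one callback, and a
// dispatcher routes by type. Names are assumptions for illustration.
type UiChunk = { type: string; [key: string]: unknown }
type ChunkCallback = (chunk: UiChunk) => void

function createDispatcher(callbacks: Record<string, ChunkCallback>) {
  return (chunk: UiChunk) => {
    const handler = callbacks[chunk.type]
    if (handler) handler(chunk) // unowned chunk types are ignored, not guessed at
  }
}

// Each entry owns one block type, mirroring the tool/base callback split.
const blocks: string[] = []
const dispatch = createDispatcher({
  'text.delta': (c) => blocks.push(`text:${String(c.text)}`),
  'tool.complete': (c) => blocks.push(`tool:${String(c.toolId)}`)
})
dispatch({ type: 'text.delta', text: 'hello' })
```

Because every block type has exactly one owner, adding a new block type means adding one callback, with no changes to the others.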
4. Trace integration (AI SDK trace + adapter + existing trace format)
We deprecated the old trace integration path and moved to AI SDK trace + an adapter so we can keep our existing SpanEntity format. The idea is to let the SDK own trace context and spans, then convert those spans into the format our UI and storage already understand.
There are two pieces:
- `_completionsForTrace` wraps the modern pipeline and calls `addSpan`/`endSpan` around the SDK call.
- `AiSdkSpanAdapter` converts AI SDK spans into our SpanEntity format, including token usage, inputs/outputs, and operation type.
File: src/renderer/src/aiCore/index_new.ts
File: src/renderer/src/aiCore/trace/AiSdkSpanAdapter.ts
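The conversion step can be sketched as a pure mapping function from an SDK-style span to a SpanEntity. The field and attribute names below (including the `ai.usage.*` keys) are assumptions for illustration, not the actual schemas:

```typescript
// Hedged sketch of converting an AI SDK-style span into an app SpanEntity.
// Field and attribute names on both sides are illustrative assumptions.
interface SdkSpan {
  name: string
  startTime: number // ms since epoch
  endTime: number
  attributes: Record<string, unknown>
}

interface SpanEntity {
  operation: string
  durationMs: number
  usage?: { promptTokens: number; completionTokens: number }
}

function toSpanEntity(span: SdkSpan): SpanEntity {
  const prompt = span.attributes['ai.usage.promptTokens']
  const completion = span.attributes['ai.usage.completionTokens']
  return {
    operation: span.name,
    durationMs: span.endTime - span.startTime,
    // Only surface usage when both token counts are present and numeric.
    usage:
      typeof prompt === 'number' && typeof completion === 'number'
        ? { promptTokens: prompt, completionTokens: completion }
        : undefined
  }
}
```

Keeping this as a pure function means the SDK owns trace context and timing, while the app only ever sees its own SpanEntity shape.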
5. Why this chain is the architecture
The real architecture is the chain itself:
```mermaid
flowchart TD
  A[ModernAiProvider entry] --> B[buildPlugins + createExecutor]
  B --> C[executor.streamText]
  C --> D[AiSdkToChunkAdapter]
  D --> E[Chunk callbacks]
  E --> F[BlockManager + UI blocks]
```
This chain is where I invested most of the design effort. It gives me three guarantees:
- The UI always receives a small, stable set of chunk types.
- Provider quirks stop at the adapter boundary.
- Tool results, reasoning, and images are rendered consistently.
Takeaways
- The entry point is deliberately thin; the runtime pipeline carries the complexity.
- Chunk normalization is the bridge that keeps UI simple.
- Callbacks are the UI contract, not just event handlers.
- The “aiCore → UI” chain is the true architecture in practice.