
feat(ai): add streaming support to Conversation#1433

Open
hwbrzzl wants to merge 5 commits into master from bowen/#917

Conversation

Contributor

@hwbrzzl hwbrzzl commented Mar 31, 2026

Summary

  • Conversation.Stream() sends input to the AI provider and returns a StreamableResponse for consuming tokens incrementally
  • StreamableResponse.HTTPResponse() pipes events to an HTTP response as Server-Sent Events by default, with optional status code and render overrides
  • Option is refactored from func(map[string]any) to func(*Options) for compile-time safety; a separate StreamOption pair configures HTTP rendering

Closes goravel/goravel#917

Why

The Conversation interface previously only supported blocking Prompt() calls that returned the full response at once. Adding Stream() lets applications forward tokens to the client as the model generates them, which is essential for chat-like experiences where users expect output to appear in real time.

StreamableResponse provides three methods: Each for iterating raw events, Then for registering a post-stream callback (e.g. to persist the conversation history after the stream finishes), and HTTPResponse for wiring the stream into a Goravel HTTP handler with SSE headers applied automatically.

func (r *ChatController) Stream(ctx http.Context) http.Response {
	conversation, err := facades.AI().Agent(agent) // agent: name of a configured agent
	if err != nil {
		return ctx.Response().Json(http.StatusInternalServerError, http.Json{"error": err.Error()})
	}

	stream, err := conversation.Stream(ctx.Request().Input("message"))
	if err != nil {
		return ctx.Response().Json(http.StatusInternalServerError, http.Json{"error": err.Error()})
	}

	// Zero-config: defaults to HTTP 200 with SSE headers (Content-Type: text/event-stream)
	return stream.HTTPResponse(ctx)
}

@hwbrzzl hwbrzzl requested a review from a team as a code owner March 31, 2026 13:44
Copilot AI review requested due to automatic review settings March 31, 2026 13:44
@hwbrzzl hwbrzzl marked this pull request as draft March 31, 2026 13:45
@codecov

codecov bot commented Mar 31, 2026

Codecov Report

❌ Patch coverage is 90.76923% with 24 lines in your changes missing coverage. Please review.
✅ Project coverage is 68.95%. Comparing base (4b3811a) to head (a6d22c2).
⚠️ Report is 5 commits behind head on master.

Files with missing lines Patch % Lines
ai/streamable_response.go 88.66% 11 Missing and 6 partials ⚠️
ai/openai/provider.go 91.04% 3 Missing and 3 partials ⚠️
ai/conversation.go 96.66% 1 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master    #1433      +/-   ##
==========================================
+ Coverage   68.76%   68.95%   +0.19%     
==========================================
  Files         361      362       +1     
  Lines       27776    28069     +293     
==========================================
+ Hits        19099    19356     +257     
- Misses       7831     7855      +24     
- Partials      846      858      +12     

☔ View full report in Codecov by Sentry.

Contributor

Copilot AI left a comment


Pull request overview

This pull request adds first-class streaming support to the AI conversation/provider contracts and implements a concrete streaming response type, enabling providers (notably OpenAI) to emit incremental events and optionally render them as an HTTP streaming response.

Changes:

  • Extend contracts/ai with streaming types (StreamEvent, StreamableResponse, StreamOption) and add Stream methods to Conversation and Provider.
  • Implement ai/streamable_response.go (event buffering + Each/Then + SSE-style HTTPResponse) and wire Conversation.Stream() to update history on completion.
  • Refactor conversation option handling from map[string]any to a typed *contractsai.Options struct, updating helpers, mocks, and tests.

Reviewed changes

Copilot reviewed 17 out of 17 changed files in this pull request and generated 8 comments.

Show a summary per file
File Description
mocks/ai/StreamOption.go Adds generated mock for the new StreamOption type.
mocks/ai/StreamableResponse.go Adds generated mock for the new StreamableResponse contract.
mocks/ai/RenderFunc.go Adds generated mock for RenderFunc used by streaming HTTP rendering.
mocks/ai/Provider.go Updates provider mock to include the new Stream method.
mocks/ai/Option.go Updates option mock to match the new typed *ai.Options signature.
mocks/ai/Conversation.go Updates conversation mock to include the new Stream method.
contracts/ai/stream.go Introduces streaming contracts (events/options/response interface).
contracts/ai/provider.go Extends Provider contract with Stream(ctx, prompt) method.
contracts/ai/option.go Replaces map-based options with typed Options and updates Option signature.
contracts/ai/ai.go Extends Conversation contract with Stream(input) method.
ai/streamable_response.go Implements StreamableResponse including Each, Then, and HTTPResponse.
ai/provider_test.go Updates test provider to satisfy the new Provider interface.
ai/option.go Updates option helpers to populate typed options; adds stream option helpers.
ai/option_test.go Updates option tests for the typed options refactor.
ai/openai/provider.go Adds OpenAI streaming implementation using NewStreaming and emits stream events.
ai/conversation.go Adds Conversation.Stream() and appends final messages via Then().
ai/application.go Switches conversation option application to typed *contractsai.Options.


var (
	_ contractsai.StreamableResponse = (*streamableResponse)(nil)

	errStreamRunnerRequired = stderrors.New("ai stream runner is required")
Contributor Author


Move this error to errors/list.go

@hwbrzzl hwbrzzl marked this pull request as ready for review April 11, 2026 05:50
Copilot AI review requested due to automatic review settings April 11, 2026 05:50
Contributor

Copilot AI left a comment


Pull request overview

Copilot reviewed 23 out of 23 changed files in this pull request and generated 1 comment.

Comment on lines +251 to +259
r.mu.Lock()
r.response = response
r.err = err
r.thenErr = nil
if err == nil && response != nil {
	callbacks = append(callbacks, r.thenCallbacks...)
}
r.thenCallbacks = nil
r.mu.Unlock()

Copilot AI Apr 11, 2026


complete() snapshots thenCallbacks and clears r.thenCallbacks while r.finished is still false (it’s only set later at the end). If Then() is called concurrently after the snapshot/clear (lines 251-259) but before finished becomes true (line 274), it will append to r.thenCallbacks and that callback will never be executed (and may also be overwritten/ignored when err/thenErr are finalized). Consider marking the stream as finished (or introducing a completing/done state) under the lock before releasing it, and/or looping to drain any callbacks appended while callbacks are executing so Then() is never lost during completion.



Development

Successfully merging this pull request may close these issues.

[Feature] AI SDK Phase 3: Advanced Interactions (Tools & Structured Output)
