Add Gradio demo with streaming chat and image upload#10

Open
korbonits wants to merge 2 commits into Tencent-Hunyuan:master from korbonits:gradio-demo

Conversation

@korbonits

Summary

  • app.py: interactive multimodal chat demo using gr.ChatInterface with gr.MultimodalTextbox (Gradio ≥ 4.44). Streaming output via TextIteratorStreamer in a daemon thread. Sidebar controls for temperature, max new tokens, and thinking mode. Lazy model loading on first request (thread-safe lock). CLI flags for --share, --host, --port; MODEL_PATH env var for local weights.
  • README.md: new Gradio Quick Start section with install and run instructions; ticks the [ ] Online Gradio Demo roadmap item.
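The lazy-loading pattern described above (load once, on the first request, under a lock) can be sketched roughly as follows. This is a minimal stdlib sketch, not the PR's actual code: `get_model` and `_load_model` are illustrative names, and a sentinel object stands in for the real weights.

```python
import threading

_model = None
_model_lock = threading.Lock()

def _load_model():
    # In app.py this would call AutoModelForCausalLM.from_pretrained(...);
    # here a plain object stands in for the loaded weights.
    return object()

def get_model():
    """Return the model, loading it on first call (thread-safe)."""
    global _model
    if _model is None:            # fast path: skip the lock once loaded
        with _model_lock:
            if _model is None:    # double-checked under the lock
                _model = _load_model()
    return _model
```

The double check inside the lock ensures that two concurrent first requests cannot both trigger a load.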

Features

| Feature | Detail |
| --- | --- |
| Image upload | Single image per turn via `gr.MultimodalTextbox` |
| Streaming | Tokens streamed via `TextIteratorStreamer`; no waiting for the full response |
| Thinking mode | Checkbox wires directly to `enable_thinking` in `apply_chat_template` |
| Temperature | Slider 0–2; 0 → greedy/deterministic |
| Text-only | Works without an image |
| Local weights | `MODEL_PATH=/path/to/dir python app.py` |
| Public share | `python app.py --share` for a temporary Gradio tunnel URL |
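One plausible way the temperature slider could map onto `model.generate` keyword arguments, with 0 treated as greedy decoding as the table describes. This is a hedged sketch: the function name and the exact threshold handling are assumptions, not the PR's code.

```python
def generation_kwargs(temperature: float, max_new_tokens: int) -> dict:
    """Map UI controls to generate() kwargs; temperature 0 -> greedy."""
    if temperature <= 0:
        # Greedy/deterministic decoding: no sampling at all.
        return {"do_sample": False, "max_new_tokens": max_new_tokens}
    return {
        "do_sample": True,
        "temperature": temperature,
        "max_new_tokens": max_new_tokens,
    }
```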

Usage

pip install "gradio>=4.44"
python app.py            # http://127.0.0.1:7860
python app.py --share    # public link
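The CLI surface described above (`--share`, `--host`, `--port`, plus the `MODEL_PATH` env var) could be wired up roughly like this. The defaults shown are assumptions inferred from the usage lines, and `<hub-model-id>` is a placeholder, not the real model ID in `app.py`.

```python
import argparse
import os

def parse_args(argv=None):
    """Parse the demo's launch flags (sketch; defaults are assumptions)."""
    parser = argparse.ArgumentParser(description="Gradio demo launcher")
    parser.add_argument("--share", action="store_true",
                        help="create a temporary public Gradio tunnel URL")
    parser.add_argument("--host", default="127.0.0.1")
    parser.add_argument("--port", type=int, default=7860)
    return parser.parse_args(argv)

# MODEL_PATH overrides the hub download with a local weights directory.
model_path = os.environ.get("MODEL_PATH", "<hub-model-id>")
```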

Test plan

  • python app.py starts without error and loads the model on first message
  • Image upload + text prompt returns a streamed response
  • Text-only prompt (no image) works correctly
  • Thinking mode checkbox changes model output style
  • Temperature=0 produces deterministic output across two identical prompts
  • python app.py --share prints a public Gradio URL
  • MODEL_PATH=/local/path python app.py loads local weights

🤖 Generated with Claude Code

@korbonits
Author

[Screenshot: 2026-04-13 at 11:40:25 PM]

@korbonits
Author

[Screenshot: 2026-04-13 at 11:41:26 PM]

korbonits and others added 2 commits April 13, 2026 23:54
- app.py: interactive multimodal chat interface (Gradio >= 4.44)
  - gr.ChatInterface with MultimodalTextbox for image + text input
  - streaming output via TextIteratorStreamer in a background thread
  - sidebar controls: temperature, max new tokens, thinking mode toggle
  - lazy model loading on first request (thread-safe)
  - --share / --host / --port CLI flags; MODEL_PATH env var for local weights
  - example prompts for spatial reasoning and robot planning queries
- README.md: adds Gradio Quick Start section; ticks roadmap item

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
Gradio 6 introduced four breaking changes from 4.x:
- gr.Blocks(theme=...) removed; theme now passed to launch()
- gr.Chatbot(type="messages") removed (messages format is now default)
- gr.Chatbot(show_copy_button=True) replaced by buttons=["copy"]
- gr.ChatInterface(type="messages") kwarg removed

app.py now detects the installed major version at import time
(_GRADIO_MAJOR) and branches accordingly, keeping compatibility
with both Gradio 4.x and 6.x.

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
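The version branching described in the commit message might look something like the sketch below. The helper name is hypothetical; only the kwargs themselves come from the changes listed above, and the real code derives the major version at import time from `gradio.__version__`.

```python
# In app.py something like this would run at import time:
#   _GRADIO_MAJOR = int(gradio.__version__.split(".")[0])

def chatbot_kwargs(gradio_major: int) -> dict:
    """Return gr.Chatbot kwargs compatible with the installed Gradio."""
    if gradio_major >= 6:
        # 6.x: messages format is the default; copy button moved to buttons=[...]
        return {"buttons": ["copy"]}
    # 4.x: messages format and copy button are explicit kwargs
    return {"type": "messages", "show_copy_button": True}
```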
