Write Python quickstart examples showing how to use LLMKit.
Primary: Python SDK (NEW)
We now have a Python SDK: pip install llmkit
Examples to write using the SDK:
- Basic chat completion (from llmkit import LLMKit)
- Streaming response (client.chat_stream() with CostStream)
- Session tracking (client.session("my-agent"))
- Cost callback (on_cost= parameter)
- Standalone cost estimation (estimate_cost(response) - works without proxy)
- Async client (AsyncLLMKit)
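A minimal sketch of what the first example (basic chat completion with a cost callback) might look like. The SDK surface here is assumed from the names in the list above (LLMKit, on_cost=); the client.chat() method name, its parameters, and the shape of the cost value are hypothetical, not taken from a published package.

```python
# Hypothetical sketch of examples/basic_chat.py -- the LLMKit API shown
# here (LLMKit, client.chat, on_cost=) is assumed, not verified against
# a released package.
import os

from llmkit import LLMKit  # assumed import path

# api_key= is optional if LLMKIT_API_KEY is set in the environment.
client = LLMKit(api_key=os.environ["LLMKIT_API_KEY"])

# on_cost= is the cost-callback parameter named above; the callback is
# assumed to receive per-request cost information.
response = client.chat(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
    on_cost=lambda cost: print(f"request cost: {cost}"),
)
print(response)
```

Each of the six examples above would follow this shape in its own standalone file.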
Secondary: base_url approach (zero-migration)
For devs who don't want to change their existing code:
7. OpenAI client with base_url pointed at proxy
8. With session tracking header (x-llmkit-session-id)
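The two base_url examples above could be combined into one sketch: the official OpenAI client pointed at the proxy, with the session header sent on every request. base_url=, api_key=, and default_headers= are real OpenAI Python SDK parameters; the proxy URL and header name come from this document.

```python
# Zero-migration approach: keep the official OpenAI client, only change
# base_url to point at the LLMKit proxy.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://llmkit-proxy.smigolsmigol.workers.dev/v1",
    api_key=os.environ["LLMKIT_API_KEY"],
    # Optional: tag every request with a session for cost tracking.
    default_headers={"x-llmkit-session-id": "my-agent"},
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```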
Framework integrations
- LangChain with LLMKit SDK
- LlamaIndex with LLMKit SDK
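For the LangChain example, one plausible sketch routes through the proxy's OpenAI-compatible endpoint, since langchain_openai's ChatOpenAI accepts base_url= and default_headers=. Whether a dedicated LLMKit SDK integration for LangChain exists is an assumption; this fallback works with the base_url approach either way.

```python
# Sketch: LangChain pointed at the LLMKit proxy via its OpenAI-compatible
# endpoint. The session header value here is illustrative.
import os

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-4o-mini",
    base_url="https://llmkit-proxy.smigolsmigol.workers.dev/v1",
    api_key=os.environ["LLMKIT_API_KEY"],
    default_headers={"x-llmkit-session-id": "langchain-demo"},
)
print(llm.invoke("Hello!").content)
```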
Each example should be a standalone .py file in the examples/ directory. Include a requirements.txt.
Proxy URL: https://llmkit-proxy.smigolsmigol.workers.dev/v1
Auth: LLMKIT_API_KEY env var or api_key= parameter
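The auth precedence described above (explicit api_key= wins, env var is the fallback) could be handled with a small helper shared by the examples. This is a sketch; the helper name is not part of any documented API.

```python
# Hypothetical shared helper for the examples: resolve the LLMKit API key
# from an explicit argument or the LLMKIT_API_KEY environment variable.
import os


def resolve_api_key(api_key=None):
    # An explicit api_key= argument takes precedence over the env var.
    key = api_key or os.environ.get("LLMKIT_API_KEY")
    if key is None:
        raise RuntimeError("Set LLMKIT_API_KEY or pass api_key=")
    return key


os.environ["LLMKIT_API_KEY"] = "sk-demo"  # demo value for illustration
print(resolve_api_key())               # falls back to the env var
print(resolve_api_key("sk-override"))  # explicit argument wins
```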