Merged
16 changes: 5 additions & 11 deletions .cursor/rules/python/RULE.md
@@ -8,16 +8,11 @@
- `tox run -e unit_tests -- -c pytest-quiet.ini` - Run all unit tests in quiet mode
- `tox run -e unit_tests -- -k test_name` - Run specific test in verbose mode
- `tox run -e integration_tests` - Run integration tests
- `tox run -e type_check_unit_tests` - Type check unit tests
- `make typecheck-python` or `uv run mypy` - Type check all Python code
- `tox list` - List all available tox environments

By default, `pytest` is configured to run in verbose mode. When running a large number of tests at once, ensure that you run in quiet mode to avoid flooding the context window.

### Development Setup

- `tox run -e add_symlinks` - Add symlinks for sub-packages (required after setup)
- `tox run -e remove_symlinks` - Remove symlinks before type checking

### Database

- `tox -e alembic -- upgrade head` - Run migrations
@@ -39,11 +34,10 @@ By default, `pytest` is configured to run in verbose mode. When running a large

## Workflow

1. Always run `tox run -e add_symlinks` after initial setup
2. Use tox for all testing and linting operations
3. Run `tox run -e clean_jupyter_notebooks` after editing notebooks
4. Type check with: `tox run -e ruff,remove_symlinks,type_check,add_symlinks`
5. Check linter errors with ReadLints tool after making changes
1. Use tox for all testing and linting operations
2. Run `tox run -e clean_jupyter_notebooks` after editing notebooks
3. Type check with: `make typecheck-python` or `uv run mypy`
4. Check linter errors with ReadLints tool after making changes

## Project Structure

96 changes: 10 additions & 86 deletions .github/workflows/python-CI.yml
@@ -337,7 +337,7 @@ jobs:
- run: git diff --exit-code

type-check:
name: Type Check
name: Type Check Python
Type-check job missing phoenix_client trigger condition (Medium Severity)

The deleted type-check-integration-tests job had the condition needs.changes.outputs.phoenix == 'true' || needs.changes.outputs.phoenix_client == 'true', meaning it ran type checking when only packages/phoenix-client/ files changed. The consolidated type-check job only triggers on needs.changes.outputs.phoenix == 'true', which doesn't cover packages/phoenix-client/** changes. This silently skips type checking of integration tests (which import from phoenix-client) when only client code changes.
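A minimal sketch of the fix this comment suggests, restoring the deleted job's trigger condition on the consolidated job (job keys taken from the diff; this is the reviewer's proposal, not a change in this PR):

```yaml
type-check:
  name: Type Check Python
  runs-on: ${{ matrix.os }}
  needs: changes
  # Re-add the phoenix_client output so type checking still runs
  # when only packages/phoenix-client/ files change.
  if: ${{ needs.changes.outputs.phoenix == 'true' || needs.changes.outputs.phoenix_client == 'true' }}
```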

runs-on: ${{ matrix.os }}
needs: changes
if: ${{ needs.changes.outputs.phoenix == 'true' }}
@@ -351,10 +351,12 @@
uses: actions/checkout@v4
with:
sparse-checkout: |
Makefile
scripts/ci/
requirements/
src/phoenix/
packages/phoenix-client/
src/
packages/
tests/
- name: Set up Python ${{ matrix.py }}
uses: actions/setup-python@v5
with:
@@ -366,53 +368,15 @@
enable-cache: true
cache-dependency-glob: |
pyproject.toml
requirements/ci.txt
requirements/type-check.txt
uv.lock
github-token: ${{ secrets.GITHUB_TOKEN }}
- name: Check types
run: uvx tox run -e type_check
- name: Sync dependencies
run: uv sync --frozen
- name: Type check all Python code
run: uv run mypy
- name: Ensure GraphQL mutations have permission classes
run: uvx tox run -e ensure_graphql_mutations_have_permission_classes

type-check-unit-tests:
name: Type Check Unit Tests
runs-on: ${{ matrix.os }}
needs: changes
if: ${{ needs.changes.outputs.phoenix == 'true' }}
strategy:
fail-fast: false
matrix:
py: ["3.10", "3.13"]
os: [ubuntu-latest]
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
sparse-checkout: |
requirements/
src/phoenix/
tests/unit/
tests/conftest.py
tests/mypy.ini
tests/__generated__/
tests/__init__.py
- name: Set up Python ${{ matrix.py }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.py }}
- name: Set up `uv`
uses: astral-sh/setup-uv@v7
with:
version: "0.9.18"
enable-cache: true
cache-dependency-glob: |
pyproject.toml
requirements/ci.txt
requirements/unit-tests.txt
github-token: ${{ secrets.GITHUB_TOKEN }}
- name: Check types on unit tests
run: uvx tox run -e type_check_unit_tests

unit-tests:
name: Unit Tests (${{ matrix.db }}, ${{ matrix.py }})
runs-on: oss-4-core-runner
@@ -460,44 +424,6 @@ jobs:
timeout-minutes: 60
run: uvx tox run -e unit_tests -- -ra --reruns 5 --db postgresql -n 4 --dist loadscope

type-check-integration-tests:
name: Type Check Integration Tests
runs-on: ${{ matrix.os }}
needs: changes
if: ${{ needs.changes.outputs.phoenix == 'true' || needs.changes.outputs.phoenix_client == 'true' }}
strategy:
fail-fast: false
matrix:
py: ["3.10", "3.13"]
os: [ubuntu-latest]
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
sparse-checkout: |
requirements/
src/phoenix/
packages/phoenix-client/
tests/integration/
tests/__generated__/
tests/__init__.py
- name: Set up Python ${{ matrix.py }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.py }}
- name: Set up `uv`
uses: astral-sh/setup-uv@v7
with:
version: "0.9.18"
enable-cache: true
cache-dependency-glob: |
pyproject.toml
requirements/ci.txt
requirements/integration-tests.txt
github-token: ${{ secrets.GITHUB_TOKEN }}
- name: Check types on integration tests
run: uvx tox run -e type_check_integration_tests

integration-tests:
name: Integration Tests
runs-on: ${{ matrix.os }}
@@ -682,9 +608,7 @@ jobs:
- compile-prompts
- check-lockfile
- type-check
- type-check-unit-tests
- unit-tests
- type-check-integration-tests
- integration-tests
- test-migrations
- phoenix-client-canary-tests-sdk
13 changes: 10 additions & 3 deletions AGENTS.md
@@ -45,13 +45,20 @@ uv run pytest tests/unit/test_failed_unit_tests.py::test_failed_test # Runs a p
uv run pytest tests/integration -n auto # Runs integration tests in parallel
```

Type checking can be done via a `make` command or by invoking `uv run mypy` directly:

```bash
make typecheck-python # Type check (checks src/ and tests/)
uv run mypy # Type check (checks src/ and tests/)
uv run mypy src # Type check the main source code
uv run mypy tests/unit # Type check unit tests
uv run mypy tests/integration # Type check integration tests
```

Other commands can be managed through tox:

```bash
tox run -e ruff # Format and lint
tox run -e ruff,remove_symlinks,type_check,add_symlinks # Type check (remove/add symlinks)
tox run -e type_check_unit_tests # Type check unit tests
tox run -e type_check_integration_tests # Type check integration tests
tox run -e phoenix_client # Test sub-package
tox list # List all environments
```
2 changes: 1 addition & 1 deletion CONTRIBUTING.md
@@ -28,7 +28,7 @@ Before submitting a pull request, please make sure the following is done. We exp
- Ensure test suite passes (`tox run -e unit_tests` and `npm run test` for app changes)
- Make sure your code is formatted with `tox run -e ruff` and `pnpm --dir app run fmt` for app changes.
- Make sure your code lints with `npm run lint` for app changes.
- Run type checking with `tox run -e type_check` and `npm run typecheck` for app changes.
- Run type checking with `make typecheck-python` and `npm run typecheck` for app changes.

### Pull Request (PR) Descriptions

20 changes: 9 additions & 11 deletions DEVELOPMENT.md
@@ -31,17 +31,9 @@ The following command installs the main `arize-phoenix` package and all sub-pack
uv sync --python 3.10
```

Some parts of Phoenix, such as `phoenix.evals`, `phoenix.otel`, and `phoenix.client`, are developed as local packages located under the `packages/` directory. These modules are excluded from the standard build process and are not installed automatically.
The sub-packages (`phoenix.evals`, `phoenix.otel`, and `phoenix.client`) located under the `packages/` directory are automatically installed in editable mode via the `uv` workspace configuration.

To make these modules available when working from source, run:

```bash
tox run -e add_symlinks
```

This command will create symbolic links inside src/phoenix/ pointing to the relevant submodules.

**Second**, install the web build dependencies.
**Next**, install the web build dependencies.

We recommend installing [nodejs via nvm](https://github.com/nvm-sh/nvm) and then
installing `pnpm` globally to manage the web frontend dependencies.
@@ -111,7 +103,13 @@ To run unit tests faster using parallel execution:
tox r -e unit_tests -- -n auto
```

Check the output of `tox list` to find commands for type-checks, linters, formatters, etc.
To run type checking:

```bash
make typecheck-python
```

Check the output of `tox list` to find commands for linters, formatters, and other tools.

## Installing Pre-Commit Hooks

13 changes: 4 additions & 9 deletions Makefile
@@ -29,7 +29,7 @@ NC := \033[0m # No Color
#=============================================================================

.PHONY: help check-tools \
setup install-python install-node setup-symlinks \
setup install-python install-node \
graphql schema-graphql relay-build \
openapi schema-openapi codegen-python-client codegen-ts-client \
dev dev-backend dev-frontend \
@@ -56,7 +56,6 @@ help: ## Show this help message
@echo -e " check-tools - Verify required tools are installed"
@echo -e " install-python - Install Python dependencies"
@echo -e " install-node - Install Node.js dependencies"
@echo -e " setup-symlinks - Create Python package symlinks"
@echo -e ""
@echo -e "$(GREEN)Development:$(NC)"
@echo -e " $(YELLOW)dev$(NC) - Full dev environment (backend + frontend)"
@@ -131,12 +130,7 @@ install-node: ## Install Node.js dependencies
@cd $(JS_DIR) && $(PNPM) install --silent
@echo -e "$(GREEN)✓ Done$(NC)"

setup-symlinks: ## Create Python package symlinks
@echo -e "$(CYAN)Creating Python package symlinks...$(NC)"
@$(TOX) run -q -e add_symlinks
@echo -e "$(GREEN)✓ Done$(NC)"

setup: check-tools install-python install-node setup-symlinks ## Complete development environment setup
setup: check-tools install-python install-node ## Complete development environment setup
@echo -e ""
@echo -e "$(GREEN)✓ Phoenix development environment setup complete!$(NC)"
@echo -e ""
@@ -219,7 +213,8 @@ test: test-python test-frontend test-ts ## Run all tests (Python + frontend + Ty

typecheck-python: ## Type check Python code
@echo -e "$(CYAN)Type checking Python...$(NC)"
@$(TOX) run -q -e remove_symlinks,type_check,add_symlinks
@$(UV) run mypy
@echo -e "$(GREEN)✓ Type check complete$(NC)"

typecheck-python-ty: ## Type check Python with ty (verify expected errors only)
@echo -e "$(CYAN)Type checking Python with ty...$(NC)"
5 changes: 5 additions & 0 deletions pyproject.toml
@@ -156,6 +156,7 @@ dev = [
"types-protobuf",
"types-psutil",
"types-python-dateutil",
"types-PyYAML",
"types-requests",
"types-setuptools",
"types-tabulate",
@@ -281,6 +282,10 @@ pypi = [
[tool.mypy]
plugins = ["strawberry.ext.mypy_plugin", "pydantic.mypy"]
strict = true
files = [
"src/",
"tests/",
]
exclude = [
"api_reference",
"dist/",
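With the `files` key set, a bare `uv run mypy` (or `make typecheck-python`) checks both trees with no extra arguments, which is what lets this PR drop the per-suite tox environments. A condensed sketch of the resulting config (elided keys omitted):

```toml
[tool.mypy]
plugins = ["strawberry.ext.mypy_plugin", "pydantic.mypy"]
strict = true
# Bare `mypy` now covers source and all tests in one invocation.
files = ["src/", "tests/"]
```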
2 changes: 1 addition & 1 deletion src/phoenix/db/engines.py
@@ -122,7 +122,7 @@ def async_creator() -> aiosqlite.Connection:
lambda: sqlean.connect(f"file:{database}", uri=True),
iter_chunk_size=64,
)
conn.daemon = True # type: ignore[attr-defined]
conn.daemon = True
return conn

engine = create_async_engine(
10 changes: 8 additions & 2 deletions src/phoenix/db/types/model_provider.py
@@ -670,12 +670,18 @@ def get_client_factory(
else None
)

def create_client() -> "AbstractAsyncContextManager[GoogleAsyncClient]":
# Wrapped with @asynccontextmanager because Google's AsyncClient has
# a non-standard __aexit__ signature that doesn't conform to
# AbstractAsyncContextManager.
@asynccontextmanager
async def create_client() -> "AsyncIterator[GoogleAsyncClient]":
with without_env_vars("GOOGLE_*", "GEMINI_*"):
return Client( # type: ignore[no-any-return]
client = Client(
api_key=api_key,
http_options=http_options,
).aio
async with client as client_:
yield client_

return create_client

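The wrapping pattern above can be sketched in isolation. Below is a minimal stdlib-only illustration, where `FakeClient` is hypothetical and stands in for Google's `AsyncClient`: wrapping a client whose `__aexit__` doesn't conform to `AbstractAsyncContextManager` in an `@asynccontextmanager` generator yields a factory with a conforming interface.

```python
import asyncio
from collections.abc import AsyncIterator
from contextlib import asynccontextmanager


class FakeClient:
    """Stand-in for a client with a non-standard __aexit__ (returns None)."""

    closed = False

    async def __aenter__(self) -> "FakeClient":
        return self

    async def __aexit__(self, *args: object) -> None:  # no `bool | None` return
        self.closed = True


@asynccontextmanager
async def create_client() -> AsyncIterator[FakeClient]:
    # The generator-based wrapper satisfies AbstractAsyncContextManager
    # regardless of the wrapped client's __aexit__ signature.
    async with FakeClient() as client:
        yield client


async def main() -> FakeClient:
    async with create_client() as client:
        assert not client.closed  # still open inside the block
        used = client
    return used  # __aexit__ ran on exit


print(asyncio.run(main()).closed)  # → True
```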
33 changes: 21 additions & 12 deletions src/phoenix/server/api/helpers/playground_clients.py
@@ -8,7 +8,7 @@
from abc import ABC, abstractmethod
from collections import defaultdict
from collections.abc import AsyncIterator, Callable, Iterator
from contextlib import AbstractAsyncContextManager
from contextlib import AbstractAsyncContextManager, asynccontextmanager
from functools import wraps
from itertools import chain
from secrets import token_hex
@@ -2143,13 +2149,19 @@ async def _chat_completion_create(
async for event in stream:
# Update token counts if usage_metadata is present
if event.usage_metadata:
span.set_attributes(
{
LLM_TOKEN_COUNT_PROMPT: event.usage_metadata.prompt_token_count,
LLM_TOKEN_COUNT_COMPLETION: event.usage_metadata.candidates_token_count,
LLM_TOKEN_COUNT_TOTAL: event.usage_metadata.total_token_count,
}
)
token_counts = {}
if event.usage_metadata.prompt_token_count is not None:
token_counts[LLM_TOKEN_COUNT_PROMPT] = (
event.usage_metadata.prompt_token_count
)
if event.usage_metadata.candidates_token_count is not None:
token_counts[LLM_TOKEN_COUNT_COMPLETION] = (
event.usage_metadata.candidates_token_count
)
if event.usage_metadata.total_token_count is not None:
token_counts[LLM_TOKEN_COUNT_TOTAL] = event.usage_metadata.total_token_count
if token_counts:
span.set_attributes(token_counts)

if event.candidates:
candidate = event.candidates[0]
@@ -2673,10 +2679,13 @@ def create_anthropic_client() -> anthropic.AsyncAnthropic:
"Set the GEMINI_API_KEY environment variable or use a custom provider."
)

# Create factory that returns fresh Google GenAI async client (native async context manager)
# Note: Client(api_key).aio returns the AsyncClient which is an async context manager
def create_google_client() -> "GoogleAsyncClient":
return GoogleGenAIClient(api_key=api_key).aio
# Wrapped with @asynccontextmanager because Google's AsyncClient has
# a non-standard __aexit__ signature that doesn't conform to
# AbstractAsyncContextManager (returns None instead of bool | None).
@asynccontextmanager
async def create_google_client() -> "AsyncIterator[GoogleAsyncClient]":
async with GoogleGenAIClient(api_key=api_key).aio as client:
yield client

client_factory = create_google_client
if model_name in GEMINI_2_0_MODELS:
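The token-count change above follows a general pattern: build the attribute dict while skipping `None` values, then call `set_attributes` only if anything remains. A minimal sketch of that filtering step (the helper name and attribute keys here are illustrative, not the actual span API):

```python
from typing import Optional


def build_token_attributes(
    prompt: Optional[int],
    completion: Optional[int],
    total: Optional[int],
) -> dict[str, int]:
    """Return only the token counts that are present, dropping None values."""
    candidates = {
        "llm.token_count.prompt": prompt,
        "llm.token_count.completion": completion,
        "llm.token_count.total": total,
    }
    return {key: value for key, value in candidates.items() if value is not None}


print(build_token_attributes(12, None, 12))
# → {'llm.token_count.prompt': 12, 'llm.token_count.total': 12}
```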
4 changes: 2 additions & 2 deletions src/phoenix/server/api/routers/utils.py
@@ -1,5 +1,5 @@
from datetime import datetime
from typing import Optional, cast
from typing import Optional

import pandas as pd
import pyarrow as pa
@@ -9,7 +9,7 @@ def table_to_bytes(table: pa.Table) -> bytes:
sink = pa.BufferOutputStream()
with pa.ipc.new_stream(sink, table.schema) as writer:
writer.write_table(table)
return cast(bytes, sink.getvalue().to_pybytes())
return sink.getvalue().to_pybytes()


def from_iso_format(value: Optional[str]) -> Optional[datetime]: