feat: Integrate Helicone Provider w LlamaIndex TS #2235

Open
H2Shami wants to merge 6 commits into run-llama:main from H2Shami:helicone-integration

Conversation

H2Shami commented Nov 4, 2025

No description provided.


changeset-bot bot commented Nov 4, 2025

🦋 Changeset detected

Latest commit: 4d01837

The changes in this PR will be included in the next version bump.


Copilot AI review requested due to automatic review settings December 8, 2025 19:19

Copilot AI left a comment


Pull request overview

This PR integrates the Helicone AI Gateway provider with LlamaIndex TypeScript, enabling users to route OpenAI-compatible requests through Helicone for observability, control, and provider routing.

Key Changes:

  • Added a new @llamaindex/helicone provider package that extends the OpenAI adapter
  • Implemented configuration for Helicone API key and base URL with environment variable support
  • Added comprehensive tests and documentation with usage examples
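The configuration layering described above (explicit option first, then environment variable) can be sketched as a small pure function. This is an illustrative assumption, not the PR's actual code: the name `resolveHeliconeConfig` and the placeholder default URL are invented for the sketch, and `env` is passed in as a parameter so the logic stays pure and testable.

```typescript
interface HeliconeInit {
  apiKey?: string;
  baseURL?: string;
}

// Resolve settings with the precedence: explicit option > environment variable > default.
// Hypothetical sketch; names and the default URL are not from the PR.
function resolveHeliconeConfig(
  init: HeliconeInit = {},
  env: Record<string, string | undefined> = {},
): Required<HeliconeInit> {
  const apiKey = init.apiKey ?? env.HELICONE_API_KEY;
  if (!apiKey) {
    throw new Error(
      "Helicone API key missing: pass apiKey or set HELICONE_API_KEY",
    );
  }
  const baseURL =
    init.baseURL ??
    env.HELICONE_BASE_URL ??
    "https://gateway.example.invalid/v1"; // placeholder, not the real endpoint
  return { apiKey, baseURL };
}
```

Keeping the environment lookup behind a parameter like this also makes the "missing key" error path easy to unit-test without touching `process.env`.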

Reviewed changes

Copilot reviewed 9 out of 11 changed files in this pull request and generated 7 comments.

Summary per file:

  • tsconfig.json: Added TypeScript project reference for the new Helicone provider package
  • pnpm-lock.yaml: Added dependency entries for the Helicone provider in the workspace
  • packages/providers/helicone/tsconfig.json: TypeScript configuration for the Helicone provider package
  • packages/providers/helicone/package.json: Package manifest defining the Helicone provider module structure and dependencies
  • packages/providers/helicone/src/llm.ts: Core implementation of the Helicone class extending OpenAI with gateway-specific configuration
  • packages/providers/helicone/src/index.ts: Package entry point exporting the Helicone LLM
  • packages/providers/helicone/tests/index.test.ts: Unit tests covering basic functionality, configuration, and error handling
  • examples/package.json: Added Helicone provider as a dependency for the examples package
  • examples/models/helicone-basic.ts: Basic usage example demonstrating Helicone integration with custom headers
  • docs/src/content/docs/framework/modules/models/llms/helicone.mdx: Documentation page explaining installation, usage, and configuration options
  • .changeset/free-months-lick.md: Changeset entry documenting the integration of Helicone with LlamaIndex
Files not reviewed (1)
  • pnpm-lock.yaml: Language not supported


"@llamaindex/openai": "^0.4.22",
"@llamaindex/ovhcloud": "^1.0.0",
"@llamaindex/perplexity": "^0.0.35",
"@llamaindex/helicone": "^0.1.0",

Copilot AI Dec 8, 2025


The version specified is "^0.1.0", but this package is being introduced in this PR and hasn't been published yet. This should use "workspace:*" like other provider packages in the monorepo (as seen in pnpm-lock.yaml line 412).

Suggested change
"@llamaindex/helicone": "^0.1.0",
"@llamaindex/helicone": "workspace:*",

Copilot uses AI. Check for mistakes.
Comment on lines +411 to +413
'@llamaindex/helicone':
specifier: workspace:*
version: link:../packages/providers/helicone

Copilot AI Dec 8, 2025


The @llamaindex/helicone entry is not in alphabetical order. It should be placed after @llamaindex/groq (line 410) and before @llamaindex/huggingface (line 414) to maintain consistency with the rest of the file.

Comment on lines +69 to +82
// ---- OR ----

// Use the Helicone AI Gateway
const llm = new Helicone({
model: "gpt-4o-mini",
apiKey: "<helicone-api-key>", //Use if HELICONE_API_KEY isn't set in your env
});

const res = await llm.chat({
messages: [{ role: "user", content: "Hello from Helicone!" }],
});

console.log("Helicone response:", res.message.content);
}

Copilot AI Dec 8, 2025


The comment "// ---- OR ----" on line 69 creates a code example that shows two different approaches but runs them sequentially in the same function, which is confusing. The second approach (starting at line 72) would duplicate the query execution. Consider restructuring this to show two separate, complete examples or clarify that only one approach should be used at a time.
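One way to restructure along the lines of this comment is to split the quoted excerpt into two complete, independent functions, so a reader runs exactly one path. This is a sketch assuming the PR's `Helicone` export and the chat API shown in the excerpt above; it is not runnable until the package is published.

```typescript
import { Helicone } from "@llamaindex/helicone";

// Option A: rely on HELICONE_API_KEY being set in the environment.
async function chatWithEnvKey() {
  const llm = new Helicone({ model: "gpt-4o-mini" });
  const res = await llm.chat({
    messages: [{ role: "user", content: "Hello from Helicone!" }],
  });
  console.log("Helicone response:", res.message.content);
}

// Option B: pass the API key explicitly.
async function chatWithExplicitKey() {
  const llm = new Helicone({
    model: "gpt-4o-mini",
    apiKey: "<helicone-api-key>",
  });
  const res = await llm.chat({
    messages: [{ role: "user", content: "Hello from Helicone!" }],
  });
  console.log("Helicone response:", res.message.content);
}
```

Each function is self-contained, so the "// ---- OR ----" marker and the duplicated query execution disappear.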

Comment on lines +37 to +63
const document = new Document({ text: essay, id_: "essay" });
const index = await VectorStoreIndex.fromDocuments([document]);
```

## Query

```ts
const queryEngine = index.asQueryEngine();
const response = await queryEngine.query({ query: "What is the meaning of life?" });
console.log(response.response);
```

## Full Example

```ts
import { Helicone } from "@llamaindex/helicone";
import { Document, Settings, VectorStoreIndex } from "llamaindex";


async function main() {
// Use the Helicone AI Gateway
Settings.llm = new Helicone({
model: "gpt-4o-mini",
apiKey: "<helicone-api-key>", //Use if HELICONE_API_KEY isn't set in your env
});

const document = new Document({ text: essay, id_: "essay" });

Copilot AI Dec 8, 2025


The documentation references an essay variable on lines 63 and 37 that is never defined in the examples. This will cause the code to fail if users try to run it. Consider either importing/defining the essay variable or using a placeholder string with a comment indicating where users should provide their own text.
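A minimal way to address this in the docs is to define `essay` before it is used, falling back to a placeholder when no file is supplied. The path below is illustrative, not from the PR:

```typescript
import { existsSync, readFileSync } from "node:fs";

// Illustrative path; users point this at their own document.
const ESSAY_PATH = "./data/essay.txt";

// Define the `essay` variable the examples index, so the snippet runs as written.
const essay: string = existsSync(ESSAY_PATH)
  ? readFileSync(ESSAY_PATH, "utf-8")
  : "Replace this placeholder with the document text you want to index.";
```

With this at the top of the example, both the indexing and query snippets run without an undefined-variable error.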

"@llamaindex/openai": "^0.4.22",
"@llamaindex/ovhcloud": "^1.0.0",
"@llamaindex/perplexity": "^0.0.35",
"@llamaindex/helicone": "^0.1.0",

Copilot AI Dec 8, 2025


The @llamaindex/helicone dependency is not in alphabetical order. It should be placed after @llamaindex/groq (line 31) and before @llamaindex/huggingface (line 32) to maintain consistency with the rest of the dependencies.

Comment on lines +35 to +37
* Convenience function to create a new HeliconeLLM instance.
* @param init - Optional initialization parameters for the HeliconeLLM instance.
* @returns A new HeliconeLLM instance.

Copilot AI Dec 8, 2025


The documentation comment refers to "HeliconeLLM" but the class is actually named "Helicone". This inconsistency should be corrected for clarity.

Suggested change
* Convenience function to create a new HeliconeLLM instance.
* @param init - Optional initialization parameters for the HeliconeLLM instance.
* @returns A new HeliconeLLM instance.
* Convenience function to create a new Helicone instance.
* @param init - Optional initialization parameters for the Helicone instance.
* @returns A new Helicone instance.

{
"name": "@llamaindex/helicone",
"description": "Helicone AI Gateway Adapter for LlamaIndex",
"version": "0.1.0",

Copilot AI Dec 8, 2025


[nitpick] The package version is set to "0.1.0" but the changeset indicates this is a "major" release. For a major release of a new package starting from 0.1.0, the version should typically be "1.0.0" (if following semantic versioning for stable releases) or remain at "0.1.0" if treating as pre-1.0 (where breaking changes are allowed). Consider aligning the version with the changeset type or updating the changeset to "minor" if this is an initial pre-1.0 release.

Suggested change
"version": "0.1.0",
"version": "1.0.0",

H2Shami (Author) commented Dec 8, 2025

Hey @marcusschiesser! I made a fix that should get the CI tests passing.

But it required setting the package version in examples/package.json to workspace:* to align with the pnpm-lock.yaml file, which means the package isn't released yet but will still work locally.

My understanding is that there's a CI workflow in this repo that will release the package, update the package versions, and then publish it. Am I mistaken?



3 participants