
@ksylvan ksylvan commented Dec 4, 2025

Add Abacus vendor for ChatLLM models with static model list

Summary

This PR adds support for the Abacus AI provider to the OpenAI-compatible plugin, including a new static model list mechanism for providers that don't support dynamic model discovery.

Related Issues

Closes #1800

Files Changed

.vscode/settings.json

  • Added two new dictionary entries: "kimi" and "qwen" to the cSpell.words array
  • These additions support spell-checking for new model names and providers referenced in the codebase

internal/plugins/ai/openai_compatible/providers_config.go

  • Added new Abacus provider configuration to ProviderMap
  • Introduced getStaticModels() method to handle static model lists
  • Modified ListModels() method to check for static model list prefix ("static:")
  • Added fmt package import for error formatting

internal/plugins/ai/openai_compatible/providers_config_test.go

  • Added test case for the new Abacus provider to verify it exists in the provider map

Code Changes

Static Model List Mechanism

The most significant change is the introduction of a static model list handler in ListModels():

// If a custom models URL is provided, handle it
if c.modelsURL != "" {
    // Check for static model list
    if strings.HasPrefix(c.modelsURL, "static:") {
        return c.getStaticModels(c.modelsURL)
    }
    // ... existing code for dynamic model fetching
}

This enables providers to specify a static list of models using a special static: prefix instead of requiring an API endpoint for model discovery.

New getStaticModels Method

A new method was added to retrieve predefined model lists:

func (c *Client) getStaticModels(modelsKey string) ([]string, error) {
    switch modelsKey {
    case "static:abacus":
        return []string{
            "route-llm",
            "gpt-4o-2024-11-20",
            // ... 50+ additional models
        }, nil
    default:
        return nil, fmt.Errorf("unknown static model list: %s", modelsKey)
    }
}
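Putting the two pieces together, the dispatch can be sketched as a standalone program. This is a simplified sketch: a plain map stands in for the real Client method and its switch statement, and the model names shown are only a subset of the curated Abacus list.

```go
package main

import (
	"fmt"
	"strings"
)

// staticModels maps a "static:" key to its curated model list.
// Illustrative only; the real list in the PR has 50+ entries.
var staticModels = map[string][]string{
	"static:abacus": {"route-llm", "gpt-4o-2024-11-20"},
}

// listModels mimics the dispatch in ListModels: a "static:" prefix
// returns the curated list; anything else would be fetched over HTTP.
func listModels(modelsURL string) ([]string, error) {
	if strings.HasPrefix(modelsURL, "static:") {
		models, ok := staticModels[modelsURL]
		if !ok {
			return nil, fmt.Errorf("unknown static model list: %s", modelsURL)
		}
		return models, nil
	}
	// Dynamic discovery (an HTTP GET against modelsURL) would go here.
	return nil, fmt.Errorf("dynamic discovery not implemented in this sketch")
}

func main() {
	models, err := listModels("static:abacus")
	fmt.Println(models, err)
}
```

The same lookup-by-key shape makes adding another static provider a one-line change: register a new "static:" key with its list.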

Abacus Provider Configuration

Added new provider entry in ProviderMap:

"Abacus": {
    Name:                "Abacus",
    BaseURL:             "https://routellm.abacus.ai/v1/",
    ModelsURL:           "static:abacus",
    ImplementsResponses: false,
},

Reason for Changes

The changes were made to:

  1. Support Abacus AI provider: Integrate Abacus AI's RouteLLM service, which provides access to multiple AI models through a unified API
  2. Handle providers without model discovery APIs: Some providers don't expose a /models endpoint or have non-standard model listing mechanisms. The static model list approach provides a fallback solution
  3. Maintain code consistency: Abacus follows the same pattern as other OpenAI-compatible providers in the codebase

Impact of Changes

Positive Impacts

  • Users can now access 50+ AI models through the Abacus AI provider, including models from OpenAI, Anthropic, Meta, Google, Alibaba, DeepSeek, and others
  • The static model list mechanism is extensible and can be reused for other providers that don't support dynamic model discovery
  • No breaking changes to existing functionality

Potential Issues

  1. Model list maintenance: The static model list needs manual updates when Abacus adds or removes models. This could lead to outdated model references if not maintained regularly
  2. No validation: The code doesn't validate whether the static models are actually available through the Abacus API at runtime
  3. Hardcoded models: Having 50+ hardcoded model names increases the binary size slightly and creates a maintenance burden
  4. Error handling: If a user tries to use a model from the static list that Abacus no longer supports, they'll receive an error at request time rather than during model listing
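Issue 2 could be mitigated with an offline check that diffs the static list against whatever the provider actually serves. A minimal sketch, under the assumption that a live model list can be obtained somehow (here it is passed in rather than fetched, and validateStaticList is a hypothetical helper, not part of the PR):

```go
package main

import "fmt"

// validateStaticList reports static models missing from a live list.
// The live list would come from the provider's /models endpoint or a
// periodic health probe; it is a parameter so this sketch stays offline.
func validateStaticList(static, live []string) []string {
	liveSet := make(map[string]bool, len(live))
	for _, m := range live {
		liveSet[m] = true
	}
	var missing []string
	for _, m := range static {
		if !liveSet[m] {
			missing = append(missing, m)
		}
	}
	return missing
}

func main() {
	static := []string{"route-llm", "gpt-4o-2024-11-20", "retired-model"}
	live := []string{"route-llm", "gpt-4o-2024-11-20"}
	fmt.Println(validateStaticList(static, live)) // [retired-model]
}
```

Run as a CI step or scheduled job, this would surface stale entries before users hit request-time errors.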

Test Plan

  1. Unit test coverage: Added test case in providers_config_test.go verifying the Abacus provider exists in the provider map
  2. Manual testing should include:
    • Selecting the Abacus provider and verifying the static model list is returned
    • Attempting to use various models from the static list with actual API calls
    • Verifying error handling when an invalid static key is provided
  3. Integration testing: Test the complete flow of selecting Abacus provider, choosing a model, and making API requests

Additional Notes

  • The Abacus provider uses ImplementsResponses: false, which means it follows the legacy response handling approach
  • Future enhancement: Could implement a hybrid approach that tries dynamic discovery first and falls back to the static list
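That hybrid fallback might look like the following sketch, where fetchDynamic is a hypothetical stand-in for the HTTP call to the provider's /models endpoint; it always fails here so the fallback path is exercised.

```go
package main

import "fmt"

// fetchDynamic stands in for an HTTP GET against the provider's
// /models endpoint. Hypothetical: it fails unconditionally in this
// sketch to demonstrate the fallback.
func fetchDynamic(url string) ([]string, error) {
	return nil, fmt.Errorf("provider %s has no /models endpoint", url)
}

// listModelsHybrid tries dynamic discovery first and falls back to a
// static list on failure -- the hybrid approach suggested above.
func listModelsHybrid(url string, static []string) ([]string, error) {
	if models, err := fetchDynamic(url); err == nil {
		return models, nil
	}
	if len(static) == 0 {
		return nil, fmt.Errorf("no models available for %s", url)
	}
	return static, nil
}

func main() {
	models, _ := listModelsHybrid("https://routellm.abacus.ai/v1/models",
		[]string{"route-llm", "gpt-4o-2024-11-20"})
	fmt.Println(models)
}
```

This would let providers that later gain a /models endpoint pick it up automatically while keeping the static list as a safety net.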

@ksylvan ksylvan self-assigned this Dec 4, 2025
CHANGES

- feat: detect modelsURL values starting with 'static:' and route them to the static handler
- feat: implement getStaticModels returning curated Abacus model list
- feat: register Abacus provider with ModelsURL 'static:abacus'
- chore: add fmt import for error formatting in provider code
- test: extend provider tests to include Abacus existence
- chore: update .vscode settings to add 'kimi' and 'qwen' to cSpell words
@ksylvan ksylvan force-pushed the kayvan/add-abacus-provider-for-chatllm-models branch from e366217 to 894459d on December 4, 2025 13:23
@ksylvan ksylvan merged commit 78fd836 into danielmiessler:main Dec 4, 2025
1 check passed
@ksylvan ksylvan deleted the kayvan/add-abacus-provider-for-chatllm-models branch December 4, 2025 19:54
Merging this pull request may close: [Question]: New AI vendor integration Chatllm