Seamlessly integrate AI capabilities directly into your Laravel Eloquent models. Generate vector embeddings, perform semantic search, and augment data on-the-fly using AI.
- Features
- Installation
- Configuration
- Quick Start
- Advanced Features
- Database Support
- AI Service Providers
- API Reference
- Testing
- Contributing
- License
## Features
- 🤖 Multiple AI Providers: Support for OpenAI, Anthropic (Claude), Google Gemini, Ollama (local LLMs), and custom providers
- 🔍 Semantic Search: Find records based on conceptual similarity
- 📊 Automatic Embeddings: Generate and store vector embeddings automatically
- ⚡ On-the-Fly Augmentation: Generate summaries, keywords, and more dynamically
- 🏭 AI-Enhanced Factories: Generate realistic test data with AI
- 🎯 Trait-Based Integration: Add AI capabilities with a single trait
- 🔧 Extensible Architecture: Easily add support for any LLM provider
## Installation

You can install the package via Composer:

```bash
composer require dgtlss/synapse
```

Publish the configuration file:

```bash
php artisan vendor:publish --provider="Dgtlss\Synapse\SynapseServiceProvider" --tag="synapse-config"
```

## Configuration

After publishing the configuration file, you can configure your AI services by setting environment variables in your `.env` file:
```env
# Set your default AI service
SYNAPSE_DEFAULT_SERVICE=openai

# OpenAI Configuration
OPENAI_API_KEY=your_openai_api_key
OPENAI_EMBEDDING_MODEL=text-embedding-3-small
OPENAI_CHAT_MODEL=gpt-4o-mini

# Anthropic Configuration
ANTHROPIC_API_KEY=your_anthropic_api_key
ANTHROPIC_CHAT_MODEL=claude-3-sonnet-20240229

# Google Gemini Configuration
GEMINI_API_KEY=your_gemini_api_key
GEMINI_EMBEDDING_MODEL=text-embedding-004
GEMINI_CHAT_MODEL=gemini-1.5-flash

# Ollama Configuration (for local LLMs)
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_EMBEDDING_MODEL=nomic-embed-text
OLLAMA_CHAT_MODEL=llama3.1
```

Note: You only need to configure the environment variables for the AI service(s) you plan to use. See the AI Service Providers section for detailed configuration options.
## Quick Start

Follow these steps to get started with Synapse in your Laravel application.

Add the `HasAiFeatures` trait to any Eloquent model you want to enhance with AI capabilities:
```php
<?php

namespace App\Models;

use Illuminate\Database\Eloquent\Model;
use Dgtlss\Synapse\Traits\HasAiFeatures;

class Post extends Model
{
    use HasAiFeatures;

    protected $fillable = ['title', 'content'];
}
```

Define which attributes should be embedded and where to store the vector data:
```php
// In your Post model - array-of-arrays format (recommended)
protected $aiEmbeddable = [
    [
        'column' => 'content_embedding', // Database column for the vector
        'source' => 'content',           // Source attribute to embed
    ],
    // You can add multiple embeddings:
    // [
    //     'column' => 'title_embedding',
    //     'source' => 'title',
    // ],
];

// Alternative: flat array format (backward compatible)
// protected $aiEmbeddable = [
//     'column' => 'content_embedding',
//     'source' => 'content',
// ];
```

Create a migration to add the vector column for storing embeddings:
```php
<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;
use Dgtlss\Synapse\Database\MigrationHelper;

return new class extends Migration
{
    public function up(): void
    {
        Schema::create('posts', function (Blueprint $table) {
            $table->id();
            $table->string('title');
            $table->text('content');

            // Add vector column for embeddings (1536 dimensions for OpenAI)
            MigrationHelper::addVectorColumn($table, 'content_embedding', 1536);

            $table->timestamps();
        });
    }

    public function down(): void
    {
        Schema::dropIfExists('posts');
    }
};
```

Now you can perform semantic searches on your model:
```php
// Find posts similar to a text query
$similarPosts = Post::searchSimilar('artificial intelligence trends', 5)->get();

// Find posts similar to an existing post
$post = Post::find(1);
$similarPosts = $post->searchSimilar(3);
```

Tip: Embeddings are generated automatically whenever you save or update your model records.
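Conceptually, semantic search ranks records by how close their stored embedding vectors are to the query's embedding. A minimal sketch of the usual metric, cosine similarity, in plain PHP (the `cosineSimilarity` helper is illustrative only, not part of the package's API; in production the comparison typically runs in-database via pgvector):

```php
<?php

// Cosine similarity: 1.0 means identical direction, 0.0 means unrelated.
// Records whose embeddings score highest against the query embedding
// are returned first by a semantic search.
function cosineSimilarity(array $a, array $b): float
{
    $dot = 0.0;
    $normA = 0.0;
    $normB = 0.0;

    foreach ($a as $i => $value) {
        $dot   += $value * $b[$i];
        $normA += $value ** 2;
        $normB += $b[$i] ** 2;
    }

    // Assumes non-zero vectors; real embeddings are never all-zero.
    return $dot / (sqrt($normA) * sqrt($normB));
}
```

This is why "artificial intelligence trends" can match a post that never contains those exact words: the comparison happens in embedding space, not on keywords.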
## Advanced Features

### On-the-Fly Augmentation

Generate additional attributes dynamically without storing them in the database. This is perfect for creating summaries, extracting keywords, or analyzing sentiment on demand.
Configuration:
```php
// In your Post model
protected string $aiAppendableSource = 'content';

protected array $aiAppendable = [
    'summary' => 'Summarize the following article in two sentences: {self}',
    'keywords' => 'Extract the 5 most important keywords from the following text as a comma-separated list: {self}',
    'sentiment' => 'What is the sentiment of the following text (positive, neutral, negative)?: {self}',
];
```

Usage:

```php
$post = Post::find(1);

echo $post->summary;   // "AI is transforming Laravel development..."
echo $post->keywords;  // "laravel, ai, packages, eloquent, development"
echo $post->sentiment; // "positive"
```

Note: The `{self}` placeholder is replaced with the content of the `$aiAppendableSource` attribute.
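The prompt-template substitution behind `{self}` amounts to a simple string replacement before the prompt is sent to the chat model. A rough sketch (the `fillPrompt` function name is hypothetical; the package performs this step internally):

```php
<?php

// Resolve a {self} template against the model's source attribute.
// Illustrative only - mirrors what happens before the chat model is called.
function fillPrompt(string $template, string $source): string
{
    return str_replace('{self}', $source, $template);
}

echo fillPrompt(
    'Summarize the following article in two sentences: {self}',
    'Laravel makes building web applications enjoyable...'
);
```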
### AI-Enhanced Factories

Generate realistic test data using AI for your model factories:
```php
<?php

namespace Database\Factories;

use Illuminate\Database\Eloquent\Factories\Factory;
use Dgtlss\Synapse\Facades\AIFactory;

class PostFactory extends Factory
{
    public function definition(): array
    {
        $title = fake()->sentence(6);

        return [
            'title' => $title,
            'content' => AIFactory::generate("Write a 200-word blog post about '{$title}'"),
        ];
    }
}
```

Usage in tests:

```php
// Generate a single post with AI content
$post = Post::factory()->create();

// Generate multiple posts
$posts = Post::factory()->count(10)->create();
```

## Database Support

For optimal performance with vector operations, use PostgreSQL with the pgvector extension:
```sql
-- Install the pgvector extension
CREATE EXTENSION vector;
```

The `MigrationHelper` will then create proper vector columns:

```php
// Creates a native vector column in PostgreSQL
MigrationHelper::addVectorColumn($table, 'content_embedding', 1536);
```

For MySQL and SQLite, the package automatically falls back to JSON/TEXT columns for storing vector data:

```php
// Automatically handled by MigrationHelper:
// creates a JSON column in MySQL and a TEXT column in SQLite
MigrationHelper::addVectorColumn($table, 'content_embedding', 1536);
```

Performance Note: While PostgreSQL with pgvector provides the best performance for vector operations, the MySQL and SQLite implementations work for smaller datasets and development environments.
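The JSON/TEXT fallback boils down to serializing the embedding array on write and decoding it on read, which is why similarity comparisons on MySQL/SQLite happen in PHP rather than in the database. A sketch of the round trip (illustrative, not the package's internal code):

```php
<?php

// On MySQL/SQLite the vector has no native column type, so it is
// stored as serialized JSON and decoded back into floats on read.
$embedding = [0.12, -0.08, 0.33];

$stored   = json_encode($embedding);    // what lands in the JSON/TEXT column
$restored = json_decode($stored, true); // what the package reads back
```

This is the key performance difference: pgvector can index and compare vectors in-database, while the fallback must decode every candidate row before comparing.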
## AI Service Providers

Synapse supports multiple AI providers. Configure your preferred service in the `.env` file:
### OpenAI

```env
# .env
SYNAPSE_DEFAULT_SERVICE=openai
OPENAI_API_KEY=your_api_key
OPENAI_EMBEDDING_MODEL=text-embedding-3-small
OPENAI_CHAT_MODEL=gpt-4o-mini
```

Models:

- Embeddings: `text-embedding-3-small`, `text-embedding-3-large`, `text-embedding-ada-002`
- Chat: `gpt-4o`, `gpt-4o-mini`, `gpt-3.5-turbo`
### Anthropic (Claude)

```env
# .env
SYNAPSE_DEFAULT_SERVICE=anthropic
ANTHROPIC_API_KEY=your_api_key
ANTHROPIC_CHAT_MODEL=claude-3-sonnet-20240229
```

Models:

- Chat: `claude-3-opus-20240229`, `claude-3-sonnet-20240229`, `claude-3-haiku-20240307`
### Google Gemini

```env
# .env
SYNAPSE_DEFAULT_SERVICE=gemini
GEMINI_API_KEY=your_api_key
GEMINI_EMBEDDING_MODEL=text-embedding-004
GEMINI_CHAT_MODEL=gemini-1.5-flash
```

Models:

- Embeddings: `text-embedding-004`
- Chat: `gemini-1.5-pro`, `gemini-1.5-flash`, `gemini-pro`
### Ollama (Local LLMs)

```env
# .env
SYNAPSE_DEFAULT_SERVICE=ollama
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_EMBEDDING_MODEL=nomic-embed-text
OLLAMA_CHAT_MODEL=llama3.1
```

Popular models:

- Embeddings: `nomic-embed-text`, `mxbai-embed-large`
- Chat: `llama3.1`, `llama3.2`, `mistral`, `codellama`
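Before pointing Synapse at Ollama, make sure the models referenced in the config above are available locally. These are standard Ollama CLI commands (assuming Ollama is installed and the daemon is running):

```shell
# Pull the embedding and chat models referenced above
ollama pull nomic-embed-text
ollama pull llama3.1

# Confirm which models are available locally
ollama list
```

Unlike the hosted providers, no API key is needed; the package only needs to reach the server at `OLLAMA_BASE_URL`.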
### Custom Providers

You can easily add support for any LLM provider by implementing the `CustomAIServiceInterface`:
```php
<?php

namespace App\Services;

use Dgtlss\Synapse\Contracts\CustomAIServiceInterface;

class MyCustomAIService implements CustomAIServiceInterface
{
    private array $config = [];

    public function generateEmbedding($text): array
    {
        // Your embedding logic here.
        // Return an array of float values.
    }

    public function generateCompletion(string $prompt, array $options = []): string
    {
        // Your completion logic here.
        // Return the generated text.
    }

    public function getName(): string
    {
        return 'my-custom-service';
    }

    public function isConfigured(): bool
    {
        // Check that your service is properly configured
        return !empty($this->config['api_key']);
    }

    public function getConfig(): array
    {
        return $this->config;
    }

    public function setConfig(array $config): void
    {
        $this->config = $config;
    }
}
```

Register and use:

```php
// Register your custom service
app(\Dgtlss\Synapse\Services\AIServiceManager::class)
    ->registerCustomService('my-custom', MyCustomAIService::class);

// Use it like any other service
$result = Synapse::service('my-custom')->generateCompletion('Hello!');
```

## API Reference

### HasAiFeatures Trait

The `HasAiFeatures` trait adds AI capabilities to your Eloquent models.
| Property | Type | Description |
|---|---|---|
| `$aiEmbeddable` | `array` | Configuration for automatic embedding generation (array of arrays or flat array) |
| `$aiAppendable` | `array` | Configuration for on-the-fly attribute augmentation |
| `$aiAppendableSource` | `string` | Source attribute for appendable features |
| Method | Description | Parameters |
|---|---|---|
| `searchSimilar(string $query, int $count = 5)` | Static scope for semantic search | `$query`: search text, `$count`: number of results |
| `searchSimilar(int $count = 5)` | Instance method to find similar records | `$count`: number of results |
Example:
```php
// Static search
$posts = Post::searchSimilar('artificial intelligence', 10)->get();

// Instance search
$post = Post::find(1);
$similarPosts = $post->searchSimilar(5);
```

### Synapse Facade

The main facade for interacting with AI services:
```php
use Dgtlss\Synapse\Facades\Synapse;

// Generate embeddings
$embedding = Synapse::generateEmbedding('Your text here');

// Generate completions
$result = Synapse::generateCompletion('Your prompt here');

// Use a specific service
$result = Synapse::service('ollama')->generateCompletion('Your prompt');

// Check whether a service is configured
$isConfigured = Synapse::service('openai')->isConfigured();
```

### AIFactory Facade

Generate AI content for your model factories:
```php
use Dgtlss\Synapse\Facades\AIFactory;

// Generate content for factories
$content = AIFactory::generate('Write a blog post about Laravel');

// Generate with a specific service
$content = AIFactory::service('anthropic')->generate('Write a product description');
```

## License

The MIT License (MIT). Please see the License File for more information.