
[FEATURE] Ollama local AI model support. #287

@AJaySi

Description


@Ratna-Babu

1). Ollama will help ALwrity run open-source AI models on end users' laptops.
This provides free access to AI models like Llama, Gemma, DeepSeek etc.
This also provides data privacy and cost savings.

2). Since ALwrity is analytics-data driven, a lot of external API and AI calls happen. Commercial AI models will prove to be very expensive, very soon.

3). One AI model should not be used for all tasks. AI data-driven analysis can happen on smaller AI models, while complex tasks go to reasoning models.
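One way to implement this split is a simple task-to-model routing table in the backend. A minimal sketch, assuming Ollama-style model tags; the model names and task categories here are illustrative only, not decided:

```python
# Hypothetical routing table: send each ALwrity task to an appropriately
# sized local model instead of one model for everything.
TASK_MODEL_MAP = {
    "analytics_summary": "gemma2:2b",       # small model: structured data analysis
    "keyword_research": "llama3.1:8b",      # mid-size model: general text work
    "content_strategy": "deepseek-r1:14b",  # reasoning model: complex planning
}

DEFAULT_MODEL = "llama3.1:8b"


def pick_model(task: str) -> str:
    """Return the local model suited to a task, falling back to a default."""
    return TASK_MODEL_MAP.get(task, DEFAULT_MODEL)
```

Keeping the table in one place would also make it easy to re-tune which tasks get which model as benchmarks come in.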

4). As our target audience is content creators and digital marketing professionals, we will need to abstract all these details away. The end user will only tick an option 'Run Free AI models Locally', and we will set up everything for them in the backend.
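The backend side of that tick box could start as a check-and-pull step: verify the `ollama` binary is on the user's PATH, then pull the chosen model. A minimal sketch; `ensure_local_model` and its default model tag are hypothetical names, and a real backend would replace the early `return False` with a guided installer flow:

```python
import shutil
import subprocess


def binary_available(name: str) -> bool:
    """Check whether an executable is on the user's PATH."""
    return shutil.which(name) is not None


def ensure_local_model(model: str = "llama3.1:8b") -> bool:
    """When the user ticks 'Run Free AI models Locally', make sure Ollama
    and the chosen model are present; return True on success."""
    if not binary_available("ollama"):
        # A real backend would trigger a guided Ollama install here.
        return False
    # 'ollama pull' only downloads the model if it is not already present.
    result = subprocess.run(["ollama", "pull", model], capture_output=True)
    return result.returncode == 0
```

Everything the user sees is the single checkbox; the detection, install, and model download all happen behind it.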

5). We also need to explore Ollama Cloud and LiteLLM; if they provide a generous number of free API calls per month, using them will be easier for setup and maintenance.
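Whichever route we pick, the request shape stays similar, so the abstraction layer can be thin. As a sketch, this builds the JSON body for Ollama's default local `/api/generate` endpoint without making a network call (the endpoint URL is Ollama's documented default; the helper name is ours):

```python
import json

# Ollama's default local REST endpoint for one-shot generation.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_generate_request(model: str, prompt: str) -> str:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False requests a single complete response instead of
    a stream of partial tokens.
    """
    return json.dumps({"model": model, "prompt": prompt, "stream": False})
```

Swapping the target URL (local daemon vs. a hosted gateway) would then be a config change rather than a code change.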

6). Also, well-crafted AI prompts with the right context can generate comparable, if not better, results compared to the best commercial AI models.

7). Later on, Ollama + Unsloth will also become our gateway to fine-tuning open-source AI models for our end users. This would mean providing new AI models trained on end users' data, social media, and blogs, acting as an SME in digital marketing. So, we will end up making our own ALwrity AI model for digital marketing.

8). LLMs are overkill for most tasks, as they are generic and cater to all domains of knowledge; ALwrity only needs an AI model trained as an SME on the end user's digital presence plus digital marketing knowledge.

Metadata


Labels

enhancement (New feature or request)
