A hub for setting AI providers (OpenAI-like, Ollama and more) in one place.
⚠️ Important Note: This plugin is a configuration tool — wait, rather: it helps you manage your AI settings in one place.
Think of it like a control panel where you can:

- store API keys for your AI services,
- configure connection settings (provider URL, model),
- share those settings with other plugins.

The plugin itself doesn't do any AI processing; it just helps other plugins connect to AI services more easily.
This plugin is available in the Obsidian community plugin store: https://obsidian.md/plugins?id=ai-providers

You can also install it via BRAT: `pfrankov/obsidian-ai-providers`.
**Ollama**

- Run `ollama pull gemma2` or pull any preferred model from the library.
- Select "Ollama" in "Provider type".
- Select a model (e.g. `gemma2`).
- Additional: if you have issues with streaming completion with Ollama, try setting the environment variable `OLLAMA_ORIGINS` to `*`. On macOS: `launchctl setenv OLLAMA_ORIGINS "*"`.

**OpenAI**

- Select "OpenAI" in "Provider type".
- Set "Provider URL" to `https://api.openai.com/v1`.
- Set the API key from the API keys page.
- Select a model (e.g. `gpt-4o`).

There are several options to run a local OpenAI-like server.

**OpenRouter**

- Select "OpenRouter" in "Provider type".
- Set "Provider URL" to `https://openrouter.ai/api/v1`.
- Set the API key from the API keys page.
- Select a model (e.g. `anthropic/claude-3.7-sonnet`).

**Google Gemini**

- Select "Google Gemini" in "Provider type".
- Set "Provider URL" to `https://generativelanguage.googleapis.com/v1beta/openai`.
- Set the API key from the API keys page.
- Select a model (e.g. `gemini-1.5-flash`).

**LM Studio**

- Select "LM Studio" in "Provider type".
- Set "Provider URL" to `http://localhost:1234/v1`.
- Select a model (e.g. `gemma2`).

**Groq**

- Select "Groq" in "Provider type".
- Set "Provider URL" to `https://api.groq.com/openai/v1`.
- Set the API key from the API keys page.
- Select a model (e.g. `llama3-70b-8192`).

Docs: How to integrate AI Providers in your plugin.
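All of the providers above are configured with the same few fields. A hypothetical TypeScript sketch of what one entry holds (the field names here are illustrative assumptions, not the plugin's actual schema):

```typescript
// Illustrative shape of a provider entry; the real plugin schema may differ.
interface ProviderConfig {
  type: "ollama" | "openai" | "openrouter" | "gemini" | "lmstudio" | "groq";
  url: string;      // "Provider URL", e.g. "https://api.groq.com/openai/v1"
  apiKey?: string;  // not needed for local servers such as Ollama or LM Studio
  model: string;    // e.g. "llama3-70b-8192"
}

// Example entry using the Groq values from the steps above.
const groq: ProviderConfig = {
  type: "groq",
  url: "https://api.groq.com/openai/v1",
  apiKey: "YOUR_API_KEY",
  model: "llama3-70b-8192",
};
```

Local servers (Ollama, LM Studio) differ only in using a localhost URL and omitting the API key.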
Quick reference (details in SDK docs):
```typescript
try {
  const finalText = await aiProviders.execute({
    provider,
    prompt: "Hello",
    onProgress: (chunk, full) => {/* stream UI update */},
    abortController
  });
  // use finalText
} catch (e) {
  // handle error / abort
}
```
Tool-calling loops are available via `toolsExecute()` (SDK 1.7.0 / Service API v4). It is message-only and uses top-level `tools` / `tool_choice`, returning an OpenAI-style assistant message (`role`, `content`, `tool_calls`) that you can append directly to the `messages` history.
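Because the returned assistant message is OpenAI-style, the bookkeeping around it is ordinary message handling. A self-contained sketch of that part (the `toolsExecute()` call itself is elided; `runTool` is a hypothetical dispatcher standing in for your plugin's own tool implementations):

```typescript
// Minimal subset of OpenAI-style message shapes.
type ToolCall = { id: string; function: { name: string; arguments: string } };
type Message = {
  role: "user" | "assistant" | "tool";
  content: string | null;
  tool_calls?: ToolCall[];
  tool_call_id?: string;
};

// Hypothetical local tool dispatcher — your code, not part of the SDK.
function runTool(name: string, args: Record<string, unknown>): string {
  if (name === "add") return String((args.a as number) + (args.b as number));
  throw new Error(`unknown tool: ${name}`);
}

// Append the assistant message returned by toolsExecute(), then answer each
// tool call with a role:"tool" message keyed by its tool_call_id.
function handleAssistantMessage(history: Message[], assistant: Message): Message[] {
  history.push(assistant);
  for (const call of assistant.tool_calls ?? []) {
    history.push({
      role: "tool",
      tool_call_id: call.id,
      content: runTool(call.function.name, JSON.parse(call.function.arguments)),
    });
  }
  return history; // ready to send back in the next round
}
```

Whether your plugin or the SDK drives the surrounding loop is spelled out in the SDK docs; the sketch only shows the history-keeping the quoted API shape implies.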
Removed callbacks: `onEnd` / `onError`. Promise resolve/reject covers both, so only `onProgress` remains for streaming. The legacy chainable handler is also deprecated.
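With the callbacks gone, completion and failure (including aborts) both surface through the promise. A self-contained sketch with a stand-in `execute` (a mock illustrating the pattern, not the real SDK call):

```typescript
// Stand-in for aiProviders.execute(): streams chunks via onProgress and
// throws if the AbortController has fired. Mock for illustration only.
async function execute(opts: {
  prompt: string;
  onProgress?: (chunk: string, full: string) => void;
  abortController?: AbortController;
}): Promise<string> {
  let full = "";
  for (const chunk of ["Hel", "lo"]) {
    if (opts.abortController?.signal.aborted) throw new Error("aborted");
    full += chunk;
    opts.onProgress?.(chunk, full);
  }
  return full;
}

async function demo(): Promise<string> {
  const abortController = new AbortController();
  try {
    // No onEnd: the resolved value is the final text.
    // No onError: failures and aborts land in catch.
    return await execute({ prompt: "Hello", abortController });
  } catch (e) {
    return `failed: ${(e as Error).message}`;
  }
}
```

Calling `abortController.abort()` during streaming simply routes the run into the `catch` branch, same as any other error.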