AI Providers

pfrankov · 45k downloads

A hub for setting AI providers (OpenAI-like, Ollama and more) in one place.


⚠️ Important Note: This plugin is a configuration tool - it helps you manage your AI settings in one place.

Think of it like a control panel where you can:

  • Store your API keys and settings for AI services
  • Share these settings with other Obsidian plugins
  • Avoid entering the same AI settings multiple times

The plugin itself doesn't do any AI processing - it just helps other plugins connect to AI services more easily.

Required by plugins

  • Local GPT

Supported providers

  • OpenAI
  • OpenRouter
  • Anthropic
  • Google Gemini
  • Mistral AI
  • Together AI
  • Fireworks AI
  • Perplexity AI
  • DeepSeek
  • xAI (Grok)
  • Cerebras
  • Z.AI
  • Groq
  • 302.AI
  • Novita AI
  • DeepInfra
  • SambaNova
  • LM Studio
  • Ollama (and Open WebUI)
  • OpenAI compatible API

Features

  • Fully encapsulated API for working with AI providers
  • Develop AI plugins faster without dealing directly with provider-specific APIs
  • Easily extend support for additional AI providers in your plugin
  • Available in 11 languages: English, Spanish, French, Italian, Portuguese, German, Russian, Chinese, Japanese, Korean, and Dutch

Installation

Obsidian plugin store (recommended)

This plugin is available in the Obsidian community plugin store: https://obsidian.md/plugins?id=ai-providers

BRAT

You can install this plugin via BRAT: pfrankov/obsidian-ai-providers

Create AI provider

Ollama

  1. Install Ollama.
  2. Install Gemma 2 with ollama pull gemma2, or any other preferred model from the library.
  3. Select Ollama in Provider type.
  4. Click the refresh button and select the model that suits your needs (e.g. gemma2).

Note: if you have issues with streaming completions in Ollama, try setting the environment variable OLLAMA_ORIGINS to *:

  • On macOS, run launchctl setenv OLLAMA_ORIGINS "*".
  • On Linux and Windows, check the docs.

OpenAI

  1. Select OpenAI in Provider type.
  2. Set Provider URL to https://api.openai.com/v1
  3. Retrieve and paste your API key from the API keys page.
  4. Click the refresh button and select the model that suits your needs (e.g. gpt-4o).

OpenAI compatible server

There are several options for running a local OpenAI-like server:

  • Open WebUI
  • llama.cpp
  • llama-cpp-python
  • LocalAI
  • Oobabooga Text generation web UI
  • LM Studio
  • ...maybe more

OpenRouter

  1. Select OpenRouter in Provider type.
  2. Set Provider URL to https://openrouter.ai/api/v1
  3. Retrieve and paste your API key from the API keys page.
  4. Click the refresh button and select the model that suits your needs (e.g. anthropic/claude-3.7-sonnet).

Google Gemini

  1. Select Google Gemini in Provider type.
  2. Set Provider URL to https://generativelanguage.googleapis.com/v1beta/openai
  3. Retrieve and paste your API key from the API keys page.
  4. Click the refresh button and select the model that suits your needs (e.g. gemini-1.5-flash).

LM Studio

  1. Select LM Studio in Provider type.
  2. Set Provider URL to http://localhost:1234/v1
  3. Click the refresh button and select the model that suits your needs (e.g. gemma2).

Groq

  1. Select Groq in Provider type.
  2. Set Provider URL to https://api.groq.com/openai/v1
  3. Retrieve and paste your API key from the API keys page.
  4. Click the refresh button and select the model that suits your needs (e.g. llama3-70b-8192).

For plugin developers

Docs: How to integrate AI Providers in your plugin.

Quick reference (details in SDK docs):

try {
    const finalText = await aiProviders.execute({
        provider,          // a provider configured in AI Providers settings
        prompt: "Hello",
        onProgress: (chunk, full) => {/* stream UI update */},
        abortController    // optional: lets the caller cancel the request
    });
    // use finalText
} catch (e) {
    // handle error / abort (promise rejection replaces the removed onError callback)
}

Tool-calling loops are available via toolsExecute() (SDK 1.7.0 / Service API v4). It is message-only, uses top-level tools / tool_choice, and returns an OpenAI-style assistant message (role, content, tool_calls) that you can append directly to the messages history.
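A tool-calling loop built around toolsExecute() can be sketched as follows. This is an illustration, not the SDK itself: the aiProviders object is a hand-written stub standing in for the real service, and the get_time tool, its canned replies, and the Message/ToolCall types are assumptions modeled on the OpenAI message shape described above.

```typescript
// Sketch of a toolsExecute() loop. `aiProviders` is a stub standing in for
// the real SDK object; the tool and canned replies are illustrative only.

type ToolCall = { id: string; function: { name: string; arguments: string } };
type Message = {
  role: "user" | "assistant" | "tool";
  content: string;
  tool_calls?: ToolCall[];
  tool_call_id?: string;
};

// Stub: the first turn requests the tool, the second answers using its result.
const aiProviders = {
  async toolsExecute(req: { messages: Message[]; tools: unknown[] }): Promise<Message> {
    if (!req.messages.some((m) => m.role === "tool")) {
      return {
        role: "assistant",
        content: "",
        tool_calls: [{ id: "call_1", function: { name: "get_time", arguments: "{}" } }],
      };
    }
    return { role: "assistant", content: "It is 12:00." };
  },
};

async function runToolLoop(): Promise<string> {
  const tools = [{ type: "function", function: { name: "get_time", description: "Current time" } }];
  const messages: Message[] = [{ role: "user", content: "What time is it?" }];

  for (;;) {
    // The reply is an OpenAI-style assistant message, so it can be
    // appended to the history as-is.
    const reply = await aiProviders.toolsExecute({ messages, tools });
    messages.push(reply);
    if (!reply.tool_calls?.length) return reply.content;

    // Run each requested tool and feed the result back as a `tool` message.
    for (const call of reply.tool_calls) {
      messages.push({ role: "tool", tool_call_id: call.id, content: "12:00" });
    }
  }
}

runToolLoop().then((answer) => console.log(answer));
```

The loop exits when an assistant reply arrives with no tool_calls; with the stub above it prints "It is 12:00.".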

Removed callbacks: onEnd and onError; promise resolve/reject covers both, so only onProgress remains for streaming. The legacy chainable handler is also deprecated.

Roadmap

  • Docs for devs
  • Ollama context optimizations
  • German translations
  • Chinese translations
  • Update to latest OpenAI version and embedding models
  • Russian translations
  • Groq Provider support
  • Passing messages instead of one prompt
  • Anthropic Provider support
  • Shared embeddings to avoid re-embedding the same documents multiple times
  • Spanish, Italian, French, Dutch, Portuguese, Japanese, Korean translations
  • Encapsulated basic RAG search with optional BM25 search

My other Obsidian plugins

  • Local GPT that assists with local AI for maximum privacy and offline access.
  • Colored Tags that colorizes tags in distinguishable colors.
About
Manage AI provider settings and API keys from a single control panel. The plugin stores and shares credentials and configuration for multiple AI services so other Obsidian plugins can connect without re-entering details. It performs no AI processing itself; it serves as a centralized, extendable provider hub compatible with OpenAI-style APIs.
AI · Integrations · Developers
Details
Current version: 1.10.0
Last updated: 3 weeks ago
Created: last year
Updates: 21 releases
Downloads: 45k
Compatible with: Obsidian 0.15.0+
License: MIT
Author
pfrankov
github.com/pfrankov

Related plugins

BRAT

Easily install a beta version of a plugin for testing.

Smart Composer

AI chat with note context, smart writing assistance, and one-click edits for your vault.

MCP Tools

Securely connect Claude Desktop to your vault with semantic search, templates, and file management capabilities.

Terminal

Integrate consoles, shells, and terminals.

Local REST API

Unlock your automation needs by interacting with your notes over a secure REST API.

BMO Chatbot

Generate and brainstorm ideas while creating your notes using Large Language Models (LLMs) such as OpenAI's "gpt-3.5-turbo" and "gpt-4".

Global Proxy

Configure network proxies for users in areas with restricted networks.

Ollama

Enable the usage of Ollama within your notes.

Smart Connections

AI link discovery copilot. See related notes as you write. Lookup using semantic (vector) search across your vault. Zero-setup local model for embeddings, no API keys, private.

Copilot

Your AI Copilot: Chat with Your Second Brain, Learn Faster, Work Smarter.