Apr 17, 2026 by Arnaud Stoz | 218 views
https://cylab.be/blog/502/neovim-teaming-up-codecompanion-with-a-remote-open-webui-instance
If you are using Neovim, you have likely come across the plugin Code Companion. This plugin allows you to integrate Large Language Models (LLMs) into Neovim, much like you would expect from any modern IDE.
While Code Companion offers a plethora of features, this blog post will focus on a specific use case: configuring the plugin to work with a self-hosted instance of Open WebUI. Open WebUI is a popular, open-source web interface for interacting with LLMs, and integrating it with Neovim can provide a seamless and powerful coding experience.
This blog post assumes you are using LazyVim. If you’re not, please refer to the official Code Companion documentation for alternative installation methods.
To install the plugin with LazyVim, add the following code to your configuration (for example, in a new file under lua/plugins/, which LazyVim loads automatically):
return {
  "olimorris/codecompanion.nvim",
  version = "^19.0.0",
  opts = {},
  dependencies = {
    "nvim-lua/plenary.nvim",
    "nvim-treesitter/nvim-treesitter",
  },
}
Congratulations, the plugin is now installed! You can proceed to configure it to use your remote Open WebUI instance.
Configuring Code Companion to work with Open WebUI involves three main steps: generating an API key, writing a custom adapter, and defining your interactions.
First, generate an API key from your Open WebUI instance: open the Settings menu, go to the Account section, and create an API key there. Copy the value of the key (it should start with sk-...).
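To check that the key works before touching your Neovim configuration, you can query your instance's API with curl. This is a quick sketch: the host http://localhost:3000 is a placeholder, and the /api/models path is taken from Open WebUI's API documentation; adjust both to match your deployment.

```shell
# Placeholder host; replace with the address of your Open WebUI instance,
# and sk-... with the key you just copied.
curl -s http://localhost:3000/api/models \
  -H "Authorization: Bearer sk-..."
```

A JSON list of the models available on your instance indicates the key is valid.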
Unfortunately, there is no out-of-the-box adapter to connect to a WebUI instance. You will need to write a custom one.
In codecompanion.nvim, an adapter is an abstraction layer that translates the plugin’s generic instructions into the specific API format required by a particular Large Language Model (LLM) provider. In simpler terms, this is what connects you to your AI provider.
Since Open WebUI follows the OpenAI API specification, you will need to create a new adapter called openwebui that extends the openai_compatible adapter from Code Companion. You will then need to provide your API key and the URL of your Open WebUI instance. This new definition needs to be done in the opts table.
The adapters definition is shown below.
adapters = {
  http = {
    openwebui = function()
      return require("codecompanion.adapters").extend("openai_compatible", {
        env = {
          url = "OPEN_WEBUI_URL",
          api_key = "OPEN_WEBUI_API",
        },
      })
    end,
  },
},
As you can see, the URL and the API key are not hardcoded into the configuration file directly. They are stored in two environment variables which will be automatically read by the plugin.
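For example, you might define these variables in your shell profile. The values below are placeholders; use your own instance's URL and the key you copied earlier.

```shell
# In ~/.bashrc, ~/.zshrc, or similar; placeholder values.
export OPEN_WEBUI_URL="https://webui.example.com"
export OPEN_WEBUI_API="sk-..."
```

Remember to re-source your profile (or open a new terminal) before starting Neovim, so the plugin can read the variables.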
You can now use this adapter to set up your interactions with Code Companion.
If you want to know more about what interactions are, refer to the official documentation.
When you define an interaction, the two main things you need to specify are the adapter to use and the model to query.
Add the following code in the opts table after the adapters definition.
Feel free to use whichever model you want.
interactions = {
  chat = {
    adapter = "openwebui",
    model = "qwen3-coder:30b",
  },
  inline = {
    adapter = "openwebui",
    model = "qwen3-coder:30b",
  },
},
The complete configuration file looks like the one below:
return {
  "olimorris/codecompanion.nvim",
  version = "^19.0.0",
  keys = {
    { "<leader>cc", "<cmd>CodeCompanionChat<cr>", desc = "CodeCompanion Chat" },
    { "<leader>ca", "<cmd>CodeCompanionActions<cr>", desc = "CodeCompanion Actions" },
  },
  opts = {
    adapters = {
      http = {
        openwebui = function()
          return require("codecompanion.adapters").extend("openai_compatible", {
            env = {
              url = "OPEN_WEBUI_URL",
              api_key = "OPEN_WEBUI_API",
            },
          })
        end,
      },
    },
    interactions = {
      chat = {
        adapter = "openwebui",
        model = "qwen3-coder:30b",
      },
      inline = {
        adapter = "openwebui",
        model = "qwen3-coder:30b",
      },
    },
  },
  dependencies = {
    "nvim-lua/plenary.nvim",
    "nvim-treesitter/nvim-treesitter",
  },
}
Congratulations, you have now set up Code Companion to use your self-hosted Open WebUI instance.
Take a look at the official documentation, as there are many other possible tweaks.
This blog post is licensed under CC BY-SA 4.0.