The latest release adds MCP support — the emerging standard for connecting AI assistants to external tools. This means Claude, ChatGPT, and other MCP-compatible assistants can now talk directly to your Roon system.
How it works
The extension exposes these capabilities to AI assistants:
Search your library, TIDAL, or Qobuz
Play or queue what it finds
Control playback (play, pause, next, previous)
Adjust volume up, down, or to a specific level
The AI handles the natural language part. The extension handles talking to Roon.
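Under the hood, MCP tool calls are JSON-RPC 2.0 requests. A minimal sketch of what an assistant might send to the extension — note that the tool names and argument shapes here (`search_library`, `set_volume`, `zone`, `level`) are illustrative assumptions, not the extension's documented API:

```python
import json

def mcp_tool_call(tool_name, arguments, request_id=1):
    """Build a JSON-RPC 2.0 'tools/call' request, the envelope MCP uses."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical tool names -- the extension's actual tools may be named differently.
search_req = mcp_tool_call("search_library", {"query": "Kind of Blue", "service": "tidal"})
volume_req = mcp_tool_call("set_volume", {"zone": "Living Room", "level": 40}, request_id=2)

print(json.dumps(search_req, indent=2))
```

The assistant translates "play Kind of Blue from TIDAL in the living room" into a sequence of calls like these; the extension maps each one onto the Roon API.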
Hi @Muness. Just connected your MCP server to OpenClaw. It works great; I'm impressed. Being able to play TIDAL songs without first adding them to my library is a great feature. This isn't possible with RoonCommandLine, which I used before.
Yes, resolved. I was first testing it in a shell, which raised some errors. After I switched to Telegram and let the LLM figure out the MCP queries, it works great.
Note: Full MCP (Model Context Protocol) support, including modify/write actions, is rolling out in beta to ChatGPT Business, Enterprise, and Edu plans. Functionality, UI, and permissions may change as we iterate.
On my (Business) account I can’t use MCP (though I thought I could before).
I've been able to get Open WebUI to connect to the bridge. Using llama3.1 and gemma3, it will list Roon zones, but when I try to play something I get "Play error: Browse request timed out". Everything works fine with Codex, so is this an Open WebUI problem?
Hmmm. Having some trouble getting the OpenAI API working in Open WebUI, but I think you're right that it's the LLM. However, Gemma3 tells me local LLMs are getting better all the time!