AI Assistant Integration for Roon

Control your Roon system with natural language — ask Claude or ChatGPT to play music, build queues, and adjust volume.

What you can do:

  • “Play some Keith Jarrett”
  • “Queue Hotel California after this”
  • “Search for albums by Patricia Barber”
  • “Turn the volume down a bit”
  • “What’s playing right now?”
Backstory

A few months ago I released firmware that turns a physical Waveshare knob into a controller with an album art display. To make it work, I built open-horizon-labs/unified-hifi-control, a source-agnostic hi-fi control bridge: a Roon extension designed to be generic enough for other controllers.

The latest release adds MCP support — the emerging standard for connecting AI assistants to external tools. This means Claude, ChatGPT, and other MCP-compatible assistants can now talk directly to your Roon system.

How it works

The extension exposes these capabilities to AI assistants:

  • Search your library, TIDAL, or Qobuz
  • Play or queue what it finds
  • Control playback (play, pause, next, previous)
  • Adjust volume up, down, or to a specific level

The AI handles the natural language part. The extension handles talking to Roon.
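Under the hood, MCP clients talk to the bridge over JSON-RPC. As a rough sketch of what the assistant sends (the standard `tools/list` and `tools/call` methods come from the MCP spec; the tool name `search_and_play` and its arguments are hypothetical, not taken from the extension's source):

```python
import json

def mcp_request(method: str, params: dict, req_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 request body, as used by MCP's HTTP transport."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    })

# Ask the bridge which tools it exposes (standard MCP method).
list_body = mcp_request("tools/list", {})

# Invoke a tool. "search_and_play" and its argument names are assumptions
# for illustration; the real tool names may differ.
play_body = mcp_request("tools/call", {
    "name": "search_and_play",
    "arguments": {"query": "Keith Jarrett", "zone": "Living Room"},
})

print(play_body)
```

The assistant's LLM decides which tool to call and with what arguments; the bridge only ever sees structured requests like these.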

Setup

  1. Install or update open-horizon-labs/unified-hifi-control (Docker, Synology, QNAP, or binary)
  2. Add to your AI assistant’s MCP config:
  {
    "mcpServers": {
      "unified-hifi-control": {
        "type": "http",
        "url": "http://<your-bridge-ip>:8088/mcp"
      }
    }
  }
  3. Ask it to play something!
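If the assistant can't see the server, it helps to check the endpoint directly before debugging the MCP config. A minimal sketch, assuming the default port from the config above (the `initialize` handshake and the `Accept` header requirement come from the MCP spec; the IP is a placeholder you'd replace with your bridge's address):

```python
import json
import urllib.request

BRIDGE_URL = "http://192.168.1.50:8088/mcp"  # placeholder: use your bridge's IP

def initialize_request() -> bytes:
    """JSON-RPC body for MCP's standard `initialize` handshake."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2025-03-26",  # an MCP protocol revision
            "capabilities": {},
            "clientInfo": {"name": "probe", "version": "0.1"},
        },
    }).encode()

def probe(url: str = BRIDGE_URL) -> int:
    """POST the handshake and return the HTTP status code."""
    req = urllib.request.Request(
        url,
        data=initialize_request(),
        headers={
            "Content-Type": "application/json",
            # MCP's streamable HTTP transport expects both types accepted.
            "Accept": "application/json, text/event-stream",
        },
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status

# Running probe() against a live bridge should return 200;
# a connection error means the assistant won't reach it either.
```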

Tested with: Claude Code, ChatGPT desktop, BoltAI

Limitations: Search and play are Roon-only for now. Transport controls work with all adapters.

Happy to answer questions.

From Bolt AI: (screenshot)

Claude Code: (screenshot)

Hi @Muness. Just connected your MCP server to OpenClaw. It works great :+1: I’m impressed. Being able to play Tidal songs without having to add them to my library is a great feature. This isn’t possible with RoonCommandLine, which I used before.

Amazing! Thank you for the feedback. Looks like the issues earlier (deleted) were resolved?

Yes, resolved. I was first testing it in a shell, which raised some errors. After I switched to Telegram and let the LLM figure out the MCP queries, it works great.

Oh yes, laundry isn’t so boring with an AI assistant playing the tunes…

How do I add Unified Hi-Fi Control MCP server to ChatGPT Desktop? ChatGPT tells me I can’t.

You can’t add Unified Hi-Fi Control’s MCP server directly to ChatGPT Desktop.

ChatGPT Desktop is not an MCP client and does not support MCP configuration.

I wonder if they’ve removed that feature from ChatGPT Desktop.

Developer mode, and MCP apps in ChatGPT [beta] | OpenAI Help Center says:

Note: Full MCP (Model Context Protocol) support, including modify/write actions, is rolling out in beta to ChatGPT Business, Enterprise, and Edu plans. Functionality, UI, and permissions may change as we iterate.

On my (Business) account I can’t use MCP (though I thought I could before).

Fortunately, Codex has a section for MCP servers:

I am using that with GPT-5.2.

Thanks, I’ll give that a look.

-J

I had Codex set up in VS Code and was able to add your MCP server there. Worked. Then tried the CLI and the desktop version. Worked there as well.

Tried to get it working with Clive using llama and gemma3 but no joy.

I’ve been able to get Open WebUI to connect to the bridge. Using llama3.1 and gemma3, it will list Roon zones, but when I try to play something I get “Play error: Browse request timed out”. Everything works fine with Codex, so is this an Open WebUI problem?

Without looking, my guess is that those LLMs are having trouble sending the request correctly.

Can you configure it to use ChatGPT models? That would help show whether it’s the app or the model.

Hmmm. Having some trouble getting the OpenAI API working in Open WebUI, but I think you’re right that it’s the LLM. However, Gemma3 tells me local LLMs are getting better all the time! :grinning_face_with_smiling_eyes: