So, what you’re going to have to do is go into the port forwarding settings in your router and you want to forward port 55000 to the IP address of your Unraid server. You may just have to look up how to do that for your particular router. If that doesn’t work for you and you have something called CGNAT (which happens a lot with T-Mobile Home Internet for example), you can use Tailscale (which is built into Unraid with an official plugin) and install the Tailscale app on your phone and just leave it on as an always-on VPN. That will get ARC to work for you.
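If you want to sanity-check the forward once it’s set up, one quick way (assuming 55000 is still the port shown in Roon’s ARC settings, and using YOUR_PUBLIC_IP as a stand-in for your actual WAN address) is to test the port from a network outside your LAN, for example with netcat from a remote shell or a phone on cellular:

```bash
# Run this from OUTSIDE your LAN. Replace YOUR_PUBLIC_IP with your real WAN address.
# A successful connection suggests the forward to 55000 is working;
# a timeout usually means the rule is wrong or you're behind CGNAT.
nc -vz YOUR_PUBLIC_IP 55000
```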
By the way, I’d really appreciate it if you could share here in a week or so whether you had to do any restarts. I’m curious if the lower memory usage prevents that, or if perhaps an out-of-date dependency was causing some issues with Roon in your case. I think the main issue with Steef’s container is that it hasn’t been updated since 2023, so a lot of the dependencies that Roon requires to run are just much older than they should be. I was personally running into some really transient issues where tracks would just not play on ARC when they were being transcoded. It would be the same song at the same point, and then it would just stop happening suddenly. It was really weird. Ever since I made my own container with FFmpeg updated, that has stopped.
Thanks for the tip on Roon ARC. I will give it a try.
As for restarting there is no need to wait a week since I had to restart Roon just this morning.
First, the CPU and memory loads were both increasing - CPU up to around 8% and memory up to around 29 GB. After restarting, the loads are less than 1% CPU and 23 GB of memory.
Second, as is often the case, Roon was very slowly adding a new favorite from Qobuz to my library. Once this happens, Roon just slows to a crawl when performing any library function. Playback is not affected.
Third, as usual, I was doing some library maintenance, in this case trying to manually identify some unidentified albums. I have over 11K unidentified albums.
I’m thinking that these performance issues have little or nothing to do with the Roon container and more to do with the way Roon handles its database. My large library issues are not unique to me, as I see similar issues happening to other Roon users with large libraries (over one million tracks). In other words, not your problem.
Are you running Roon on an NVMe drive, and particularly a very fast NVMe drive? Like, top of the line. You may want to consider getting a very fast NVMe drive, making a dedicated BTRFS pool just for Roon, and pointing all of Roon’s files and assets in the container to that specific drive. You may also want to consider simply running a regular Linux distro and running Roon natively on Linux. Since you have a very large library, it might be better to run Roon on bare metal.
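For the dedicated-pool idea, the change is really just where the container’s host paths point. In Unraid you’d normally edit those paths in the container template, but conceptually it’s the equivalent of something like this (the pool mount point, container name, image, and internal paths here are all illustrative, not your actual template values):

```bash
# Sketch only - adjust names and paths to your own template.
# Assumes a dedicated NVMe pool mounted at /mnt/roonpool on the Unraid host,
# and a container that keeps its database under /data and music under /music.
docker run -d --name roonserver \
  --network host \
  -v /mnt/roonpool/appdata/roonserver:/data \
  -v /mnt/user/music:/music \
  your-roon-image
```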
Roon on fast NVMe drive - yes
Roon’s database is also on an NVMe drive.
As for running Roon directly on Linux, that’s way, way, way above my pay grade.
Plus that might require having to use Roon’s unreliable backups, so that’s a non-starter for this gun-shy Roon user. You know, fool me once…
You can set it to run on a custom schedule, and this crontab will run it every day at midnight.
0 0 * * *
That way Roon will restart automatically, and you won’t have to touch it. Just make sure it’s not happening at the same time as your Appdata backup plugin. You might have to adjust the crontab, and if you just go into ChatGPT and ask it to set a crontab for whatever time or schedule you want, it’ll just spit it out for you.
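If it helps, the script that the schedule fires can be a one-liner - I’m assuming something like Unraid’s User Scripts plugin running it, and that your container is named roonserver (match it to whatever your Docker tab shows):

```bash
#!/bin/bash
# Restart the Roon container; the cron entry above handles the nightly schedule.
# "roonserver" is an assumed name - use the one from your Docker tab.
docker restart roonserver
```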
@mackid1993 - do you somehow have control of how much RAM is allocated to the roonserver container on UnRaid?
The amount of RAM being used by @Jazzfan_NJ’s container is very large. Suspiciously so. I would expect it to be under 8GB even with a huge library.
Roon is written in C#. Allowing Roon to run with a very large amount of memory will cause fewer garbage collections to happen, but when they do happen they will take much longer and can potentially cause behavioral issues in the app. You’re not doing Roon any favors by allowing it to have this much memory.
That said, I closely track the amount of memory Roon uses over time using a combination of Glances (for telemetry), InfluxDB (time-series data storage), and Grafana (visual display). I don’t post about this elsewhere because I don’t want to debate it with people, but the May 13 release of Roon caused Roon’s base memory usage to be a little higher and pretty clearly introduced a memory leak that causes memory usage to grow over time. What I mean by this is that you can see in a memory usage graph over time when garbage collection happens - it’s a sawtooth profile where memory goes up, hits a peak, and comes down. When there’s a memory leak in a managed app, the peaks increase and so do the low points after GC. So I do see a memory leak over time. It’s a slow leak - not the kind of thing I would expect to cause the need for daily restarts.
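For anyone curious about replicating that kind of tracking, the rough shape of my setup is below. The config section and option names are from the Glances docs for its InfluxDB 1.x exporter, so treat this as a sketch and check them against your installed version:

```bash
# Minimal sketch of the Glances -> InfluxDB -> Grafana pipeline, not a guide.
# 1. Tell Glances where InfluxDB lives (credentials and db name are examples).
cat >> ~/.config/glances/glances.conf <<'EOF'
[influxdb]
host=localhost
port=8086
user=glances
password=glances
db=glances
EOF

# 2. Run Glances headless, exporting metrics (including per-process memory)
#    to InfluxDB; Grafana then graphs that database to show the sawtooth.
glances --export influxdb --quiet
```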
I think the real question is why this container is using so much memory in the first place.
You can add extra parameters to limit the usage. I don’t really believe it’s an Unraid thing, but it can be done through the GUI, and you just add a flag. This Reddit post explains how to do it.
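For what it’s worth, the flag in question is Docker’s standard memory limit. In Unraid it goes into the container’s Extra Parameters field; the same thing from the command line looks like this (8g is just an example ceiling, not a recommendation, and roonserver is an assumed container name):

```bash
# In the Unraid GUI: edit the container and add "--memory=8g" to Extra Parameters.
# Equivalent on an already-running container from the CLI:
docker update --memory=8g --memory-swap=8g roonserver
```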
It also sounds like he has a ridiculously large library, but even so, I think it’s probably corrupted in some way. There’s something wrong there. I’ve also never had an issue with Roon’s native backup functionality, so I would imagine there’s definitely something wrong with his database. It might be time for a fresh start.
For reference, mine is below. After a restart, Roon sits somewhere in the neighborhood of 2.5 GB.
I don’t know what issue @Jazzfan_NJ has with backups but I, and numerous others, have had backups go corrupt. In my case, that showed up as Roon suddenly being unwilling to open its database and, when attempting to restore, backups were corrupt going back a few weeks. I was able to go far enough back to find a backup that worked because I keep Roon’s backups in my offsite (Backblaze) archive.
You advised him to use UnRaid to back up. The primary LevelDB fork does not support live backups, which means that if UnRaid backed up LevelDB files in use, there is no guarantee that they will be functional or internally consistent. If you want to back up Roon’s database yourself, stop the server first, do the backup, then start it.
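If someone does want to roll their own copy of the database outside Roon’s backup system, that stop/copy/start sequence is simple to script. A minimal sketch, assuming the container is named roonserver and its appdata lives under /mnt/user/appdata/roonserver (adjust both to your setup):

```bash
#!/bin/bash
# Cold-copy Roon's LevelDB files so nothing can change mid-backup.
# Container name, appdata path, and destination are assumptions.
docker stop roonserver
tar -czf /mnt/user/backups/roon-$(date +%F).tar.gz \
    -C /mnt/user/appdata roonserver
docker start roonserver
```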
You gentlemen are starting to sound like modern tech support: blame the user, not the software or hardware (unless it’s from another vendor).
The memory usage was high even after a fresh start, which I just had to do back in January when I first started using the server.
The more time I spend with a large library, the more I believe that Roon just struggles with libraries over a certain size but, like all software companies, just tries to shift the blame elsewhere - like onto the user who pays them for their broken software.
When I moved my Roon core over to the UnRaid server I tried to restore from backup, only to have the restore fail. Roon support identified the cause of the failure to be a missing “file” from the backup. What really upset me (besides losing all the edits I made over five-plus years to my Roon database) was that there was no indication from Roon that the backup sets were corrupt. Again, a software error, not a user error.
Not me, my friend. Roon chose a database platform (LevelDB) that is prone to corruption. That was a bad choice. Roon does a poor job of recognizing corruption when it occurs, and happily goes on “backing up” corrupt databases. Database “restore” can fail when the backup is of a damaged database. It’s a mess.
I’m not blaming you for your memory usage or database issues - I’m just trying to help in spite of those problems.
Music library: over a million tracks, with about 80% local files and 20% Tidal/Qobuz
I didn’t mean to disparage you or Mackid, it was just my overall frustration coming out. By the way, “Roon chose a database platform (LevelDB) that is prone to corruption.” is the first time that I’ve seen anyone call out Roon for their poor choice of a database platform. Basically by continuing to use LevelDB Roon is making their paying users with large libraries jump through hoops. Roon has two choices:
Port everything over to a database platform without these corruption and backup issues
Admit that their software has limitations on the size of one’s library.
Instead, like any self-respecting software company, they blame the user and/or the user’s hardware. I had similar issues when I was using Squeezeboxes and Logitech Media Server, but at least the software was FREE.
I totally understand your puzzlement with Roon’s memory load on my server. I also run Plex on the same server and both Plex and Roon share the same local music files, plus Plex also deals with all my video files (which are also substantial). As you can see below Roon is just a mess while Plex is really wonderful. Granted Roon is quite different from Plex and has lots of additional features and functionality but 1GB versus 30GB, really???
They clearly state in the specifications that they don’t test with databases larger than 250k tracks and that they expect larger databases to work by throwing lots of hardware, including RAM, at it, but essentially that’s a best guess and users are on their own.
Of course, support is usually helpful with analyzing issues in larger libraries anyway, but the description in the specs wouldn’t make me very confident when going to more than 4 times that size. At least I would expect that I need very high performance hardware.
Well I stand corrected. By the way I do have lots of RAM and high performance hardware.
Look, it’s not that Roon doesn’t work with large libraries, it’s just that the larger the library, the more performance suffers. Yes, I restart Roon once the performance and speed slowdowns become noticeable, and yes, these performance issues only affect the library user interface. There are no issues with playback, DSP, audio zones, etc., just with interacting with the library, as in long waits on searches and slow response times when editing the library. I love all the playback flexibility and the DSP functions, so I will continue to use Roon and I will continue to push the limits on library size; however, I will do my best to limit my complaining.
Plus the Roon Community is filled with some really knowledgeable and very helpful people.
That’s many times larger than anything I have experience with.
Oh this has been discussed quite a bit on the forum. Just search for “LevelDB”.
I hear you. In my experience, Roon works best when you adjust your usage, equipment, library, and expectations to match its abilities and to avoid its shortcomings and issues. Your library size is almost certainly a challenge for it, and if you have unidentified tracks, which you probably do with a library of that size, that will just create more problems. I’m not suggesting you change anything - this is my form of commiserating.
You forgot option three, which is “selective deafness”. That’s the thing where you pretend that you can’t hear it when people say things you don’t like.
Unraid Appdata Backup stops the container before it backs up /data. It’s not live. I see under 2 GB of usage. Also, Roon is not meant to deal with a million tracks at all. I mean, that’s like an insane number, so I’m not surprised he’s using so much RAM and it’s locking up. It’s just not designed for that. At a library of that scale, this should be running on a dedicated Linux-based server on bare metal, not in Docker. Part of the problem is the containerization, I would say.
@Jazzfan_NJ what you can do is go in and rename the data folder to data.old with the Docker container stopped, and then start it up again, which will create essentially a fresh database. Let it load in all of your tracks and see if the memory and CPU usage is lower. Renaming the folder to data.old obviously won’t hurt your old database. When you want to switch back, just delete the new one that you created and rename data.old back to data. That’ll rule out whether your database is hosed, at least. Obviously, there’s no risk because you’re not deleting anything, you’re just renaming a folder. Moreover, you have backups, so you should be fine.
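In concrete terms, the test looks like this - the appdata path and container name below are assumptions, so match them to your own template:

```bash
# Swap in a fresh database without touching the old one.
docker stop roonserver
mv /mnt/user/appdata/roonserver/data /mnt/user/appdata/roonserver/data.old
docker start roonserver   # Roon builds a fresh database on first start

# To switch back later: stop the container, delete the new data folder,
# rename data.old back to data, then start it again.
```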
Regarding unidentified tracks (actually “albums”, since that’s the level at which Roon identifies items), why do you feel that unidentified albums “create more problems”? The way I understand things, an identified album creates more internal links (credits and other release information) than an unidentified album, which creates far fewer internal links. Or am I not understanding the differences between identified and unidentified albums with respect to their impact on Roon’s database?
And then there are all the albums, mostly newer releases, which are identified by Roon but contain absolutely no useful metadata. Plus there are those albums which I’ve tagged with useful metadata, only to have that useful metadata replaced with garbage metadata once Roon identifies them.
Here’s what I mean (not a real album, just making things up for this example):
Album: Joe Blow Quartet - Hot Air and Other Favorites
In the “artists” file tag field: Joe Blow, Tom Wind, Alex Breeze, Fred Gust
Composer tag also filled on a track level
When unidentified, Roon shows all four members of this blowing quartet under the “credits” section.
Once this album is identified by Roon, the “credits” section shows only “Joe Blow Quartet” and all the composer credits are gone.
So which version of the album, “identified” or “unidentified”, has more of an impact on Roon’s database?
I’m not trying to be a troll, but the handling of identified versus unidentified albums with respect to Roon’s database has never been clearly defined. I’ve also addressed the lack of metadata for new releases elsewhere on this Community. Please do not suggest that I manually add metadata to some online resource so that Roon can then access and use that metadata, because I don’t volunteer my time to a for-profit company. I’m paying Roon to do that work.
The theory (perhaps established) is that Roon pathologically tries to identify unidentified albums. The consequence is high CPU utilization. It also could impact memory consumption.
I’m not the expert on this problem - I don’t have it personally because I don’t have unidentified albums. Some people are very confident that it is a source of issues.
If you’re unfamiliar with this issue, just search for “unidentified albums”.
Well, if that is the case, then Roon does a terrible job of identifying unidentified albums, since many of the over 11K unidentified albums in my library can be identified manually. In other words, these albums are in Roon’s “master” identified album database, but Roon still fails to automatically identify them. So much for artificial intelligence.
I just finished manually identifying over 400 unidentified Phish live downloads that matched the identified albums 100% - same title, tracks, and times - but Roon failed to identify them in spite of over 5 months of “trying”.
Roon checks the online metadata for unidentified albums periodically, and this seems to be quite costly in resources. Not a problem if you have a few of them, but can bog down Roon if there are many. This has been well documented.
Unidentified albums can have as many and more links as identified ones, depending on how much work the user puts into tagging personnel.