Huge 100K albums database (Roon Requirements)

… Or Linux.
I’d seriously consider whether Roon is the right tool for the job; disabling disks to make it usable sounds like a workaround from day one.
Look at LMS (Lyrion Music Server). It’s free and, whilst not as polished, works well with correctly tagged files and sounds as good as Roon, IMO.
It depends on your endpoints. I use a WiiM streamer, which supports LMS natively, and play through DLNA to a Chord Mojo 2/Poly and a KEF active system; it works great.
LMS is free, so you can try it and dump it if it’s not for you or doesn’t work with your large library.

1 Like

Ubuntu Server, an i9-12900K, 128GB RAM (RoonServer “only” uses between 30 and 45GB, but it is definitely more stable than with 64GB), a very fast NVMe SSD, fewer than 10,000 unidentified albums, and daily restarts of RoonServer will make Roon quite usable in a case comparable to the original poster’s, which I have seen.

And there is the proof that it’s not up to the job, as we have said. Having to restart a database every day to keep it working isn’t right, and it shows there is a serious problem dealing with large libraries.

2 Likes

And how many albums overall?

If I remember correctly, around 110k albums.

Thanks. And here we are at a size approx. 5 times larger than Roon expects, where even the unidentified albums alone number as many as what Roon considers a “normal” library, for software that is a music player and not the repository of humanity’s aggregate music output. Therefore, literally, this is correct…

… but I still think you can break anything this way. It may not be up to a job it wasn’t made for, but even Google would break if you fed it the internet not once but ten times.

You do realize the actual volume of music releases, right? You’re sometimes looking at 75–150k unique releases in a given day.

Roon is unfortunately poorly optimized for scale, but in fairness, only a very small percentage of users ever reach anywhere near the point where it becomes untenable, IMO.

The “output of humanity” was a joke. The “you can break anything if you use it in unintended ways” was serious. The expected library size is stated in the specification. I wouldn’t expect it to work very well with 5 times more.

Sorry, I was just having this conversation with my buddy. Basically he was saying “there’s no way streaming services have over 100 million songs; do 100 million songs even exist?” and it’s like… you have no clue how much music drops every day. It’s insane, and AI-generated stuff has inflated the numbers pretty heavily in the last ~year.
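The back-of-envelope numbers bear this out. A rough sketch (the per-day release count comes from the figure cited earlier in the thread; the tracks-per-release average is my own assumption):

```python
# Rough back-of-envelope: how fast the global catalogue grows.
releases_per_day = 100_000   # midpoint of the 75-150k/day figure cited above
tracks_per_release = 8       # assumed average; singles pull this down, albums up
tracks_per_year = releases_per_day * tracks_per_release * 365
print(f"~{tracks_per_year:,} new tracks per year")  # ~292,000,000
```

At that rate, a 100-million-song streaming catalogue is only a few months of new releases, before you even count the back catalogue.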

As for the expected library size, I was recently talking to some guys (Roon users and Roon-curious) and I was pretty shocked to hear that most of them were looking for something that would handle under 1,000 to a few thousand albums. I really think this ‘massive library’ thing is a niche of users within an already niche product.

I would delete 90 percent of those albums. Nobody can listen to 75,000 hours of music.
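For what it’s worth, the 75,000-hour figure checks out as a back-of-envelope estimate (the ~45-minute average album length is my assumption):

```python
# Sanity-check the "75,000 hours" claim for a ~100k-album library.
albums = 100_000
minutes_per_album = 45  # assumed average album length
hours = albums * minutes_per_album / 60
years = hours / 24 / 365
print(f"{hours:,.0f} hours \u2248 {years:.1f} years of nonstop playback")
```

That’s roughly eight and a half years of continuous, sleepless listening.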

Yeah, but they cause a disproportionate amount of complaints.

1 Like

I never bothered making many complaints. It became clear that the scaling problems I was experiencing were very much ‘on me’ due to being well outside the average requirements of a Roon user.
It’s somewhat absurd to expect them to support some tiny minority of users. If it’s that big a deal, build your own toys lmao.

1 Like

In fairness, those complaints are often met with incomprehensible, misguided theories about how “your network isn’t up to the task”, or ramblings about the “complexity” of Roon’s metadata, or some nonsense about incapable hardware :roll_eyes:. It’s not rocket science; it’s just architected in a manner that doesn’t scale particularly well.

shrug

6 Likes

Well, networks and hardware often ARE not up to the task, in addition to the inherent issues.

I dunno, someone who has like 50–100k albums… they don’t have a reliable home network (at that point I kind of expect a bit of enterprise gear, too)? Seems unlikely, but then again, I’ve fixed up some really disastrously laid-out client networks before, in places where you’d really expect someone to do things at least… somewhat correctly lmao.

1 Like

To be honest, I don’t know how many of the 500k+ folks specifically have been told it’s their network, unless their server was on WiFi. But we have definitely seen some woefully underpowered computers compared to the official recommendations.

I have had some real head-scratchers over the years. I liked the cascading 5-port gigabit switch fiasco: ~24 5-port unmanaged switches just… there, daisy-chained (so you got however many ports out of the main routing solution and daisy-chained onward from there) in a machine room, covering the networking for an entire production house.

3 Likes

Hi, my last two cents for this thread, as a quite experienced IT pro and consultant.

My experience with other applications is that, for very large databases, the Roon developers would have to consider offering the option to connect to an external database server, such as a dedicated MariaDB server, which has many tuning options for large databases.

With a 10 Gb connection between Roon and the MariaDB server (unmanaged 10 Gb LAN switches are available in the meantime), this should scale for a while, even if you spent the rest of a long life never listening to a track twice.

Even better than a database could be to integrate part of the information into a directory service with an LDAP interface or something similar; directory services are optimized for fast reads of millions of records, like searching for one name in the phone book of Mexico City.
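Whether the backing store is relational or a directory service, the win is the same: indexed reads stay fast at this scale. A minimal sketch using Python’s stdlib sqlite3 (not MariaDB or LDAP, and the schema is invented purely for illustration; Roon’s real internals are not public):

```python
import sqlite3
import time

# Hypothetical schema and data, purely to illustrate that an indexed lookup
# over ~100k album rows is trivial for a relational store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE albums (id INTEGER PRIMARY KEY, title TEXT, artist TEXT)")
conn.executemany(
    "INSERT INTO albums (title, artist) VALUES (?, ?)",
    ((f"Album {i}", f"Artist {i % 5000}") for i in range(100_000)),
)
conn.execute("CREATE INDEX idx_albums_title ON albums (title)")

start = time.perf_counter()
row = conn.execute(
    "SELECT id, artist FROM albums WHERE title = ?", ("Album 73210",)
).fetchone()
elapsed_ms = (time.perf_counter() - start) * 1000
print(row, f"lookup took {elapsed_ms:.3f} ms")  # indexed: typically well under 1 ms
```

Even a single-file embedded database answers an indexed title lookup over 100k rows in a fraction of a millisecond; a tuned MariaDB or LDAP server mainly adds value for concurrent access and much larger working sets.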

1 Like

I don’t understand the drama some people make out of such a simple question. From my point of view, I can only say: yes, it makes sense to use Roon, no matter how big the library is. You just have to expect that it will take a few weeks for Roon to read your archive. Also, like me, you will need a lot of time to make manual corrections. But anyone who has created such a large collection will also have the desire and time to make the data suitable for Roon…

2 Likes

I read these discussions to try to help where I think I can, but I also have to believe it’s often a case of non-optimized setups, server and network.

I run a 100,000+ track Roon setup on a 12-year-old CPU hosting Synology DSM 7.x with 16GB RAM, a 3-drive hybrid array of modern 16TB SATA drives (RAID 5), plus a slow 256GB SATA SSD (also from 10 years ago) as a read-only cache. All spares from years gone by, save for the 16TB drives. It flat out screams, in that it’s perfectly responsive; I couldn’t ask for faster, except when the Roon-cloud calamity strikes during searches and, for no apparent reason, causes my Roon Core to begin ingesting/refreshing metadata that hasn’t changed on my end. A quick restart stops the agony, and things resume extra speedy.

I’m being dramatic; the system is still perfectly usable, but I sense the increased latency, particularly when adding a new album from Qobuz. Then again, Qobuz itself can be quick or latent, so it’s hard to isolate the two, aka not possible.

The second part is a solid network. I only use 1Gb unmanaged switches, along with an old 3-node Orbi mesh network AND a few TP-Link WiFi extenders. I don’t use the extenders to extend coverage, though, only to provide non-WiFi devices with a wired Ethernet port.

My headphone streamer uses one of these TP-Link Ethernet bridges and has yet to experience any dropouts, etc. I run it at its max supported 176.4kHz/192kHz, but I’ve also used an HQP NAA endpoint to feed it via USB, and the WiFi extender has zero issues supporting a DSD512 stream of ~50Mbps! WiFi can work brilliantly if tuned for the local environment. There are some inexpensive ways to determine the best channel to use and whether 2.4GHz, 5GHz, or now 6GHz is preferred. Yep, all of these things require knowledge, testing and documentation.

I also host my own caching DNS servers. Consistent <1ms responses from DNS are critical to reducing a plethora of network latency issues. Pi-hole is a great option that runs happily on an RPi4 and/or in Docker on Synology. Caching and extensive filtering… but I digress.

However, I spent my career selecting, implementing, designing, managing and tuning large-scale enterprise networks, storage systems, cyber-security systems, VPNs, virtual environments, cloud-hosted systems, et al. But one does NOT need all this experience, just good, solid troubleshooting skills and the ability to document changes and results. Both can be learned through practice, and perhaps a little study early on, so as not to commit the cardinal sin of making more than one change at a time.

My one piece of guidance to all suffering with performance issues: you must reduce and simplify the variables to find and fix the root cause. Break it down to the simplest setup possible, then add another layer… live with it for 10 days, add another layer… and document your findings along the way, because you will forget the details of what you did and what you experienced. Don’t cascade too many switches, and don’t forget to check and optimize your wireless network from time to time, as the environment will likely change around you. Scheduling a restart of Roon Server during off hours also goes a long way toward avoiding issues. Sure, one shouldn’t have to do this, but it’s free and ‘easy’, and it completely mitigates memory leaks or other ‘funky’ things that occur with flat-file databases living on less-than-stellar storage systems.

I hope this helps just one person have a better experience with Roon…and all your home server/storage and networking trials and tribulations.

4 Likes