My hopes and fears are similar.
Commonalities (with many other customers):
Problems with a large library, though not always these unbearable delays. The problems disappear temporarily, but they come back. A fresh database doesn't really help once the old mass of data gets back in, and likes and playlists disappear. The remote still responds, but it too becomes sluggish whenever the core hangs or shows major delays. New database entries and audio analysis slow everything down. With 16 GB of RAM, the start-up phase of Roon Core (1.4 million titles) on a Windows 10 machine still succeeds at 96% RAM utilisation; in continuous operation it then sits constantly around 70%, and even so I have had plenty of joy playing, searching and finding. Just not permanently and free of problems.
Differences:
In my case, everything is connected directly to the router and I don’t operate a high-performance NAS or a modern player.
My analysis strategy for finding root causes:
Use several operating systems, several core machines and several libraries alternately in the same network. Turn off IPv6 and follow any other network advice from this forum. Run nothing else on the Roon Core machine at the same time. Use powerful processors and plenty of RAM.
Results for 60,000 to 70,000 selected titles (favourites):
Small libraries respond quickly under macOS, Windows 10 (Home and Pro), Linux (Manjaro, Ubuntu, Mint…) and Android (tablet and phone). Manual title changes in rapid succession slow the system down when Qobuz or Tidal are involved.
Building a large library starts out very fast from the first directly connected hard drive, regardless of whether 300,000 or 400,000 titles are read in. With each additional hard drive, performance drops considerably. From the fourth hard disk onwards it becomes a waiting game that can take weeks. This important observation applies equally to Roon 1.7 and 1.8. The new metadata optimiser seems to need even more resources.
I do not observe runaway memory consumption during operation myself; other users (on macOS and on Linux based on Debian/Ubuntu) do report it.
Contrary to what is described here, I have no NAS, no switch and no modern network player or high-performance DAC in use. The good old sound comes directly from the PC jack, or via a €30 USB DAC, to old hi-fi components (Sonoro, Grundig V7000, Beyerdynamic DT 770 Pro). I can always add such endpoints later, once the big system runs smoothly. Fascination and frustration sit close together, but the problem description and its solution only emerge in dialogue with affected customers, and that dialogue is conducted here convincingly and patiently.
Certainly the server capacity, the database size, and the object and analysis concept with its powerful but processing-intensive link references all play an important role in finding a solution. The time and duration of use (perhaps hinting at Roon Valence) do not seem uninvolved either.
For the time being, I have resigned myself to the fact that major fluctuations in performance will continue, and I am temporarily getting better responsiveness on my own second system by deregistering the large library and registering the smaller one. I wonder whether this will end well today.
I would be delighted if it succeeds and I do not have to start all over again every day.
The principle proposed here of renaming the Roon data folder to Roon_old and restarting can also be extended to Roon_big and Roon_fast. The same applies to each data source, which could become its own database: hard disk 1 = Roon_1, hard disk 2 = Roon_2… If you keep all the data of an artist, genre or other desired classification on one hard disk, you can then use these databases alternately in a more responsive way, and likes and playlists stay with each of them without performance problems.
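The swap described above can be sketched as a small shell script. This is only an illustration of the folder-renaming idea, not Roon's own tooling: the demo path, the `activate` helper and the `.label` marker file are my assumptions, Roon's real database folder lives in a different, OS-specific location, and Roon must be stopped before any renaming.

```shell
#!/bin/sh
# Sketch of the Roon_big / Roon_fast swap idea. Assumptions:
#   - ROON_DIR is a demo directory, NOT Roon's real database location
#     (on Windows that is under %LOCALAPPDATA%, elsewhere it varies).
#   - Roon (the program) is stopped while folders are renamed.
#   - A hypothetical ".label" file remembers which library a folder holds.
ROON_DIR="${ROON_DIR:-$HOME/roon-demo}"
mkdir -p "$ROON_DIR/Roon_big" "$ROON_DIR/Roon_fast"

activate() {
  # $1 = which library to make active, e.g. "big" or "fast"
  cd "$ROON_DIR" || exit 1
  # Park the currently active database under its own name first.
  if [ -d Roon ]; then
    if [ -f Roon/.label ]; then
      mv Roon "Roon_$(cat Roon/.label)"
    else
      mv Roon Roon_old
    fi
  fi
  # Bring the requested database into the active "Roon" slot.
  mv "Roon_$1" Roon
  echo "$1" > Roon/.label
}

activate big    # large library is now the active "Roon" folder
activate fast   # big is parked as Roon_big; fast is now active
```

The same pattern extends to Roon_1, Roon_2… for per-disk databases: each call to `activate` parks the current database intact, so its likes and playlists are preserved until it is swapped back in.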
Translated with www.DeepL.com/Translator (free version)