Do regular (daily) restarts of Roon have a negative impact?

Yes, I agree, but I suspect you would also agree (?) that if something is on a streaming service, it’s a heck of a lot more likely to be in a metadata service than, say, something like this, of which I have a handful, but nothing of magnitude (of my 4,593 albums, only 161 are not identifiable):

It’s mysterious to me how this works. There’s lots of weird stuff on streaming that most likely isn’t on MusicBrainz/TiVo, and yet it shows as identified in Roon.

I just think it skips the identification process for streaming content but obviously attempts matches to add material from AllMusic. Even if I add the Qobuz version to my library and group them, it doesn’t change the fact that my own version is unidentified. But the question is: if there is no AllMusic match, does Roon get into the same rescanning loop it does with local files? If not, why did they choose this logic for local files?

Audio analysis is done and the resulting data goes somewhere: to Roon’s internal database. And so does all the metadata, text snippets, references to artists, composers, other tracks, albums and the like of all your tagged, non-identified tracks. That’s a lot of data and zillions of references.

The moment you click on the composition list or album list of an artist, or type a word into the search window, your local server (from how I understand the algorithm) has to search this complete database for similarities, references and priorities. It does this in full for almost every single command you issue to Roon.

I understand your frustration with the user experience, but I think it is easy to see why this is demanding: whenever you issue a command like ‘Play’, the server has to crawl and search the database of your 150k unidentified tracks alone, match everything against Roon’s cloud-based references, and deal with the fact that some of the underlying data is missing. That is too much for any hardware, and for software which is not meant for this purpose.
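
To make the scaling argument concrete, here is a toy Python sketch. It is not Roon’s actual code; the track data and matching logic are invented. It contrasts an unindexed linear scan over 150k entries (what the local server supposedly does for unidentified files) with a lookup against a pre-built index (roughly what a cloud-side search amounts to):

```python
import time

# Invented stand-in for a library's track metadata (not Roon's schema).
tracks = [f"Artist {i % 500} - Track {i}" for i in range(150_000)]

def linear_search(query):
    # Unindexed case: every command re-scans all 150k entries.
    return [t for t in tracks if query in t]

# Indexed case: a one-time build, then near-constant-time lookups.
index = {}
for t in tracks:
    for word in t.lower().split():
        index.setdefault(word, []).append(t)

def indexed_search(word):
    return index.get(word.lower(), [])

start = time.perf_counter()
hits = linear_search("Track 42")
print(f"linear scan:    {len(hits)} hits in {time.perf_counter() - start:.4f}s")

start = time.perf_counter()
hits = indexed_search("track")
print(f"indexed lookup: {len(hits)} hits in {time.perf_counter() - start:.6f}s")
```

Run repeatedly, the linear scan pays its full cost on every query, while the index pays once at build time; that difference is the heart of the argument above.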

As mentioned before, manually helping Roon identify more albums, plus migrating particularly problematic albums (such as bootlegs, huge boxes with >1000 tracks per album and the like) to a folder which I usually keep deactivated, solved the issue in my case.


Certainly true, but not really a problem: content that Roon knows, with transparent metadata on their servers, seems to cause little trouble as long as you have a capable server. Some people have libraries of 500k albums and more, and it runs smoothly.

I do not know for sure, but when I added a lot from Qobuz it did not seem to affect performance, as if Roon has direct access to the metadata and analysis even when the albums are not on MusicBrainz and the like. And there were definitely several albums with full metadata from Qobuz that I could not get identified as local files, no matter how hard I tried.

Another thing I have learned: the automatic recognition and identification process definitely has limits. If one takes the time, in many cases it is easy to manually identify an album which Roon has not identified automatically. That is especially true of ripped vinyl and SACDs whose track durations deviate slightly from the original CD or download metadata. With DVDs and Blu-rays it is slightly more complicated, as disc numbers, titles and artists often do not match either.


Still doesn’t really explain why its handling of local files cannot be the same. Qobuz isn’t a metadata service; for albums like this, it carries no more metadata than my local files. In fact, some of mine were downloaded from Qobuz… It’s only as good as what the labels provide, and they are equally poor at maintaining it. My own files of these examples contain more accurate metadata than Qobuz’s, since I added it myself, yet they are spurned by Roon. They should not be treated any differently, and if they are, then Roon needs to sort it out.

Enough of the excuses for these shortcomings, which they only seem to be owning up to as a problem after nine years of existence. There have been Roon users with large libraries for a long time, but it has only started to be an issue more recently.

Anyway, this topic isn’t going anywhere, and none of us know the inner workings or why Roon treats Qobuz differently, so I’m leaving it now. If people want to accept Roon’s excuse, so be it. I will not, but it’s unlikely they will change their stance.


I am past frustration, actually. The heading of this thread points to the solution I found. All I wanted to know was whether restarts do any harm to (the integrity of) Roon. I posted a script elsewhere that does this on a daily basis; a minimal sketch of the idea follows below. With it in place, Roon works for me.
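
For reference, a minimal sketch of what such a daily restart can look like. It assumes a Linux machine running Roon Server as a systemd unit named roonserver; the unit name, path and schedule are assumptions, and this is not the exact script posted elsewhere:

```python
#!/usr/bin/env python3
"""Restart Roon Server once a day via systemd.

Assumes a systemd unit named "roonserver" (the name may differ on your
install). Schedule with cron, e.g.:
    0 4 * * * /usr/bin/python3 /opt/scripts/restart_roon.py
"""
import subprocess
import sys

def restart_roon(unit: str = "roonserver") -> None:
    # "systemctl restart" stops the service cleanly, giving Roon a chance
    # to close its database properly, before starting it again.
    result = subprocess.run(
        ["systemctl", "restart", unit],
        capture_output=True,
        text=True,
    )
    if result.returncode != 0:
        print(f"restart of {unit} failed: {result.stderr.strip()}", file=sys.stderr)
        sys.exit(result.returncode)
    print(f"{unit} restarted")

if __name__ == "__main__":
    restart_roon()
```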

Instead of answering that simple question, Roon opted for a different kind of reply, acknowledging severe limitations of the system for the first time.

That’s an easy one.

  • acknowledge there is a problem
  • gather all available information on problem
  • analyze problem
  • find solution
  • fix problem

The first step is leaving the state of denial. I don’t see that happening here.

Problem: A perfectly normal operation (1) brings down the system (2) after a given amount of time operating perfectly well (3).

(1) Lookup of unidentified albums/tracks
(2) Unresponsive UI, slow searches, stuttering playback
(3) >= 24 hours

That means all components of the system are just fine until something happens. This something may be known to Roon. My best guess would be throttled communication with Roon’s servers due to repeated high load / unsuccessful queries, combined with a system architecture that relies heavily on comms with the mothership. Resilience was not part of the design targets, or so it seems.
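
That guess is pure speculation, but the failure mode it predicts is easy to model. A toy Python sketch (the numbers and backoff policy are invented, not Roon’s): a client that blocks on cloud lookups and retries with exponential backoff looks healthy until the server starts rejecting repeated queries, after which every command stalls:

```python
import random

def cloud_lookup(throttled: bool) -> bool:
    # Hypothetical cloud call: succeeds normally, mostly fails once the
    # server starts throttling repeated unsuccessful queries.
    return (not throttled) or random.random() < 0.1

def ui_command(throttled: bool, base_delay: float = 0.5, retries: int = 5) -> float:
    """Return how long the UI blocks waiting on the cloud."""
    delay, waited = base_delay, 0.0
    for _ in range(retries):
        if cloud_lookup(throttled):
            return waited
        waited += delay
        delay *= 2  # exponential backoff after each failure
    return waited   # gave up; the UI was blocked the whole time

print(f"normal:    ~{ui_command(False):.1f}s blocked per command")
print(f"throttled: ~{ui_command(True):.1f}s blocked per command")
```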

If there were frustration, it would be about the way Roon handles the problem. Which is none of my business; it is theirs. Roon has to acquire and retain customers, not me. So no frustration. :slight_smile:

The “capable server” part is what Roon used to tell the world. There was no word about “…with transparent metadata on their servers…”

The issue we are discussing here cannot be cured by any kind of hardware. It’s a software design problem (God forbid, a fundamental software architecture problem). Roon just does not scale well.

Think of it this way: no reasonable developer would ask for more memory to fix a memory leak. (Not that a leak is the culprit here.)


From how I understand it, the main difference is whether all this data, including all the interdependencies, audio analysis results and references, is on Roon’s own servers or not. If it is, the cloud can carry the load, which is the case with all identified albums as well as those from Qobuz or Tidal. Roon knows nothing about your local unidentified files, so your local machine has to crawl everything, search for matches and calculate relevancy every time you do anything in Roon.

If I am not mistaken, this was revealed when the software architecture changed, moving more things like search (and probably relevancy calculation) to the cloud. Unfortunately, that seems to have side effects for some people: those with big or inconsistent libraries, as well as those with underpowered hardware or connections.

All this started roughly a year ago, and I immediately noticed problems, as my hardware back then was insufficient and my local library was inconsistent, i.e. it had a high number of unidentified albums. Before that, everything had been running smoothly, so I knew I had to do something.

I am not so much interested in accepting excuses as in finding a potential solution or workaround, hopefully for everyone. Just by doing experiments and exchanging hardware, I managed to find a solution for myself last year. Everyone who is willing to do the same is welcome to exchange thoughts. Admittedly, a certain degree of effort in terms of tagging, identifying and trimming the library is inevitable. I am personally pretty happy with the result: Roon runs faster than ever, the core library is well organized, and the limitations are minimal, if not a net positive.

AFAIK it’s always been the case: Roon will occasionally update albums where metadata has become available. So it must do a rescan to determine whether any unidentified albums can now be identified.
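
If that is right, the cost compounds. A speculative sketch (invented numbers, not Roon’s code) of why a retry-everything-on-every-change policy scales with the number of unidentified albums times the number of library edits:

```python
lookups = 0

def try_identify(album: str):
    global lookups
    lookups += 1   # stands in for an expensive cloud metadata query
    return None    # obscure albums keep failing to match

# 2,000 stubbornly unidentified albums, re-tried after each library change.
unidentified = [f"bootleg-{i}" for i in range(2_000)]
for _change in range(10):  # ten library edits
    unidentified = [a for a in unidentified if try_identify(a) is None]

print(f"{lookups} lookups for 2,000 unidentified albums and 10 edits")
```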

One of the hazards of ‘Obscure albums’

At last count I had 100 or so, most of which I can explain as “splits” from box sets.

That’s not really a significant number of unidentified albums, although it might be slowing things down.

Lost discs from box sets can usually be identified and fixed easily by renumbering the disc to its correct number and then merging the now-separated single-disc album back with the rest of the box; a scripted version of the renumbering step follows below. I have the feeling that box sets in general contribute to Roon’s sluggishness due to the high number of tracks and references; at least that was my experience with an underpowered machine and lots of ridiculously fat boxes (“Bach’s complete works”, “Karajan’s complete recordings” and the like). Several of them easily exceeded 300 discs and 2,000 tracks per album.
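
The renumbering step can be scripted. A minimal sketch using the mutagen library (pip install mutagen); the folder path, disc numbers and FLAC-only assumption are examples to adapt to your own files:

```python
from pathlib import Path

from mutagen.flac import FLAC

def renumber_disc(folder: str, disc_number: int, disc_total: int) -> None:
    """Set DISCNUMBER/DISCTOTAL on every FLAC in a split-off disc's folder
    so the album can be merged back into its parent box set."""
    for path in sorted(Path(folder).glob("*.flac")):
        audio = FLAC(str(path))
        audio["discnumber"] = str(disc_number)
        audio["disctotal"] = str(disc_total)
        audio.save()
        print(f"{path.name}: disc {disc_number}/{disc_total}")

# Example: a stray disc that actually belongs at position 37 of a 60-disc box.
renumber_disc("/music/Big Box/Disc 37", 37, 60)
```

After retagging, a rescan plus Roon’s merge-albums edit should let the disc rejoin the box.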

I initially had in the region of 2,000 unidentified albums when my previous machine collapsed. After migrating the aforementioned “boxes of ridiculousness” or splitting them into their standard albums (as I did with “Verdi operas on 120 discs”), and doing some detective work on the core collection, I am down to 52 unidentified albums, and everything is smooth as butter.

Box sets are an Achilles heel. The revitalised Focus helped, but a box of >50 discs (a nominal number in my experience) starts to slow things down. Boxes issued in volumes, like the Karajan DG sets or Bach 2000, mean album disc counts of a manageable size (unfortunately you chose bad examples).

Brendel 114 is all but unusable, and Mozart 225 equally so, but Roon’s metadata sources show a volumed version with low disc counts.

So the number of real showstoppers is actually quite low.

The structure to allow proper handling of boxes is simply not on Roon’s radar and I doubt it ever will be.

The best advice is to leave boxes alone and not try to split them; you end up with unidentifiable albums.

If any of your boxes exceed 50 discs, you may or may not struggle, depending on the server hardware and how many of them you have. They were certainly causing search to struggle.


100% agreed about the flag to skip the identification process. I understand that some albums will not be identified by Roon’s metadata resources, and that is OK. I have the ability to tag my music files using MP3Tag, and the music itself using Roon’s available tools. I should be able to tell Roon to stop trying to identify an album, since those repeated attempts slow the performance of the software as a whole (see the sketch below for what such a flag might look like).
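
To make the request concrete, here is a hypothetical sketch of such a flag. None of this exists in Roon today; the data layout and names are invented. The point is simply that a user-set flag would remove an album from every future identification pass:

```python
def identification_pass(albums, identify):
    for album in albums:
        if album.get("skip_identification"):
            continue  # user said: leave it unidentified, stop trying
        if album.get("identified"):
            continue  # already matched, nothing to do
        if identify(album) is not None:  # the expensive cloud lookup
            album["identified"] = True

albums = [
    {"title": "Bootleg 1977-06-21", "skip_identification": True},
    {"title": "Ripped SACD with odd track times"},
]
# With the flag set, only the second album ever costs a lookup.
identification_pass(albums, identify=lambda a: None)
```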


Why was this thread moved to Tinkering? It’s not about tinkering; it’s about a problem with Roon that has persisted for years. First Roon moves the Feedback section far down the page; now individual threads are being obscured too. The point being, the operation of this forum isn’t at all objective and sometimes feels like an attempt to hide, or at least obscure, negative feedback.

EDIT: I am going to add something I deleted earlier because I didn’t want to start a complaint firestorm. But with the thread being moved down, I want to point out that it feels like Roon is hiding this issue.

Just how long has Roon known that unidentified albums and their own metadata process are a primary cause of server slowdowns? I have spent, as many have, literally thousands of dollars (and who knows how many hours) trying to address server slowdowns.

Roon support pointed us in every direction but this one. So just how long have we been barking up the wrong tree, trying to solve a problem that is actually solvable by Roon with a change in architecture? Perhaps not every change to the library needs to result in a rescan for metadata purposes? Maybe Roon could ask the user first?

Roon uses this community, and its user base generally, as its beta testers. Many have experienced the downside of that. But this isn’t even beta: if the metadata scan/update that happens with every change to the library really is the primary cause, with unidentified albums as the major processing burden, then this fundamental issue, experienced over and over by so many users, being kept hidden by Roon is to me a large trust issue.


Roon answered your question in this thread (about 15 days ago). You probably won’t be happy with it, because it is rather defensive and uncooperative, and it blames the user.

I am equally frustrated and wondering why this is only coming to light now. I went back and forth with support for weeks while they supposedly looked at my logs, and there was no mention of unidentified albums being a cause. See the response I got to the new support topic I posted this week, when I included the info about my unidentified tracks:

Nucleus is very slow and takes several minutes to reboot

This was never an issue before the advent of Roon 2.0 (at least not in my case; I had plenty of unidentified music before and no performance issues). The product was certainly able to manage this well from the beginning, until they went to cloud-based services. So while “Roon wasn’t ever built to be a file management system to handle large amounts of non-identifiable tracks”, it actually was able to at one time, just maybe not intentionally. And there’s no mention of this in the marketing materials or FAQs. So I can’t help but think that they are justifying an issue which they only just found out about; or else, why wouldn’t they have made this point at the launch of 2.0?

Not suspicious at all that neither “Feedback” nor “Tinkering” topics show up under “Latest”… :rofl:


@benjamin and some people at Roon seem to care. That’s great news.

Ah, thank you. I didn’t recall that the original email disclosing that Roon’s architecture is the problem coincided with the thread being moved to Tinkering. So, now, having a large library is “tinkering.”

What’s next? Wanting boxed sets to be organized well is “tinkering?” Adding albums is “tinkering?”


Yes, they are acknowledging it, and hopefully it can be addressed in some way. I still think that a flag we can set to say “this album is unidentified and never will be, so leave it that way” would exclude that type of content and address the issue.