5 posts were split to a new topic: Issue with skipping tracks with large library
Thanks for your observations and suggestions. I’ll contact Andrew but my budget is currently “burnt”. If I can glean ideas I’d prefer to try say a different OS (Linux) rather than commission another system build.
Yes, it’s disheartening not solving the problem, but I loved the technical challenges. My fear is that it’s not currently resolvable with this release of Roon: its current core architecture may not use the additional system resources available to combat software bottlenecks in its mainly single-threaded software design.
I’m tempted to go back to the beginning with Roon Rock setup and leave Tidal out!
It might be worth putting in a Support ticket to Roon. If nothing else they may be able to see something happening in your logs that points to the nature of the slowdown. No doubt Roon would be interested in knowing what is going on, if they are able to see it. You have nothing to lose in doing so, and it may not only help you but might point to an issue Roon needs to look at.
There do seem to be some odd issues occasionally where Roon starts to leak resources. What you are experiencing might be an extreme example.
Yes, I always intended to contact support but thought I’d get answers here. Maybe now is the time, as you suggest.
I still think you should not be striping your SSDs; if anything, that is likely to be detracting from performance.
Your comments re your RAM and CPU usage would, to my thinking, point to how Roon manages its resource workload.
That’s an easy thing to try, I’ll re-configure and try that at the weekend.
Do you have any more detail on the operation of Roon apart from what has been released on the Roon technical repository?
I emailed support yesterday, I’m still awaiting a reply.
I’m corresponding with Andrew who has provided some information. If I do find a solution I’ll post it here for interested parties.
Don’t email - post your request in the Support category of this forum, where it will be seen and responded to by the support team.
I hadn’t realised that, thanks I’ll do that now.
It’s been a while, and I came across this by chance. But reading it makes me a little uncertain about how roon’s core functions are actually implemented. I still don’t see what should be so difficult about handling a million or more tracks.
For example, roon strongly recommends using a separate SSD for the library database. So it seems there is a problem in the database system itself if it demands so much I/O performance. And roon always points to the structure and amount of metadata as what makes it so demanding.
But, to be honest, structured and full-text databases and retrieval have been well-known technology for at least 20 years - a time of slower spindles and CPUs, less memory, and no SSDs ;-).
Does anyone know if the library core is self-made or a third-party product? It’s hard to believe that the technical base is an adequate implementation here.
And what difference would it make either way? The database is what it is.
If you search the forum this old pony has been trotted out many times.
It would make a difference if roon listened and invested more effort here. Instead of solving the problem in their S/W they tell customers to invest in more H/W…
My understanding is they are using a NoSQL object store (as opposed to a relational database). I believe they’ve mentioned the name of the database engine they use elsewhere in these forums, but it’s been a while. Maybe your search-fu is better than mine.
a million tracks? no issues there.
the problem comes when each of those tracks has credits with roles and those credits have connections to other albums, tracks, and compositions. Also, all those artists who are related to other artists, and how. Plus the links in bios and reviews. Plus album level credits and all the connections at that level.
Suddenly your million tracks turns into a hundred million objects with even more links. You are thinking about this problem like a traditional music player, when you should be thinking about it like wikipedia.
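A rough back-of-the-envelope model makes the explosion concrete. The per-track averages below are purely illustrative assumptions, not Roon’s real schema or numbers:

```python
# Illustrative only: hypothetical per-track averages, not Roon's actual data model.
tracks = 1_000_000

# Assumed averages: each track carries a handful of credits (performer,
# composer, engineer, ...), and each credit links out to other objects
# (artist, role, composition).
credits_per_track = 5
links_per_credit = 3

credit_objects = tracks * credits_per_track          # 5,000,000
link_objects = credit_objects * links_per_credit     # 15,000,000

total_objects = tracks + credit_objects + link_objects
print(f"{total_objects:,}")  # 21,000,000
```

Even with these modest assumed averages, a million tracks becomes tens of millions of objects before you count album-level credits, bios, reviews, and artist-to-artist relations - which is how you get to the "hundred million objects" scale described above.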
You remember the past very differently than I do. Things used to be slow and very limited by RAM capacity. SSDs changed that considerably.
Today, no one would ever run their “structured and full-text databases” on spinning disks. Spinning disks are a relic of the past, used for archival and low-touch data. To romanticize the past 20 years in this manner is deceptive and misleading. Seriously, what application did you use on your PC 20 years ago that handled a million tracks gracefully?
Looking at an article from 2011 that compares the performance of one of the fastest spinning drives to a lame SSD from that era, which is only 9 years ago, we are looking at the spinning disk being 60x slower.
Given that the price of the SSD today that we ask for is under $25 shipped, I’m going to stick with the idea that it’s a fair thing to ask that you not use antiques to run modern software.
It is self-made. The disk storage is based on LevelDB, and the rest is in-memory indices.
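As a toy sketch of that split - records persisted in an on-disk key-value store, with secondary lookups served by indices rebuilt in memory - here is a minimal version using Python’s stdlib `dbm` module standing in for LevelDB. The schema and field names are invented for illustration:

```python
import dbm
import json
import os
import tempfile
from collections import defaultdict

# On-disk key-value store (dbm standing in for LevelDB): values are
# opaque serialized blobs keyed by object id.
path = os.path.join(tempfile.mkdtemp(), "library.db")

with dbm.open(path, "c") as store:
    store[b"track:1"] = json.dumps({"title": "So What", "artist": "Miles Davis"}).encode()
    store[b"track:2"] = json.dumps({"title": "Freddie Freeloader", "artist": "Miles Davis"}).encode()

# At startup, scan the store once and build an in-memory secondary index;
# queries then never touch the disk store for lookups by artist.
by_artist = defaultdict(list)
with dbm.open(path, "r") as store:
    for key in store.keys():
        record = json.loads(store[key])
        by_artist[record["artist"]].append(key.decode())

print(sorted(by_artist["Miles Davis"]))  # ['track:1', 'track:2']
```

The design choice this illustrates: the key-value store only needs fast gets and puts by primary key, while all the richer query shapes live in memory - which is consistent with the heavy I/O and RAM recommendations discussed earlier in the thread.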
I’m curious why you think it’s hard to believe?
Roon has always been about having decent hardware and focusing on pushing that hardware with software. It’s never been interesting to us to run on the hardware of generations past to save a few bucks.
Thanks for your response.
Certainly, it’s not only a single track’s information. It’s rather similar to linked web pages.
Many companies still use spindles for such tasks. I do not romanticize the past 20 years; in the IT business it was, and still is, hard work to overcome boundaries. What I mean is that, especially with limited H/W resources, it’s always a challenging job to get things done. But in many cases great things have been achieved despite this. Today, IT S/W development in general - outside of embedded solutions - tends to shift things like performance from the S/W solution to growing H/W requirements. This is not an argument against using current technologies.
With roon you have entered the private PC as a host platform. And such PCs aren’t always top of the line. You can make high H/W recommendations here, but you can’t control the environment as you can with an embedded solution. And what works very well in an embedded system may have its problems when transferred to a PC platform, or vice versa.
And that’s what at least some of your customers obviously feel. And it’s not only about money, but simply about the need to upgrade their hardware. Maybe such customers are not your key customers, but at least this makes it more difficult to convince a broader range of customers.
When you read about the roon universe, it sounds so nice and easy. But if you look deeper for a concrete personal solution, things can get really complicated very quickly. And most people are simply looking for solutions; they usually don’t want to spend much time working them out.
But please, don’t get me wrong. I like roon for organizing and presenting music. And in general it’s amazing how many possibilities you have to build your own roon universe. But we should never forget that it’s about organizing, presenting, and listening to music - no more and no less.
As something of an IT nerd, I like to walk through all the possibilities to get a working, slim, and easy solution. But that’s nerdy, not the regular case.
Agreed with everything you said. To clarify our positions:
We’d rather build for the future and not the past… we could spend resources on supporting older hardware, but it’s not where we think our time is best spent.
Regarding customer acquisition, fewer and fewer people have this problem as hardware gets upgraded over time. It’s hard to quantify how many customers we could acquire by supporting older gear, but our feeling is that we have more growth opportunity by moving forward.
Additionally, more and more people are finding that general purpose computers are pretty annoying and they’d rather have an appliance-like experience. Yes, they must buy hardware for ROCK or Nucleus, but that’s what it takes to get that experience.
Just a bit of perspective: I WAS running the core on an antique (a 2010 Mac mini Core 2 Duo with an SSD) during my trial period, and it ran really well with my 35K tracks.
I assume you meant to send this to @Matthias_Dauelsberg, since your mention of “with an SSD” is exactly the perspective I have. An SSD makes more of a difference than anything else.
Very curious: have you ever experimented (or considered experimenting) with how this knowledge of networked relationships would integrate and perform using a graph database?
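For anyone unfamiliar with what a graph database buys you here: the credit/artist/composition links described earlier are naturally an adjacency structure, and queries like “everything within two hops of this artist” become simple traversals. A minimal sketch with invented data (not Roon’s actual model, and a real graph database would add persistence and indexing on top):

```python
from collections import deque

# Invented toy graph: nodes are artists, albums, and tracks; edges are
# the "credited on" / "related to" links discussed above.
edges = {
    "Miles Davis": ["Kind of Blue", "John Coltrane"],
    "John Coltrane": ["Kind of Blue", "Giant Steps"],
    "Kind of Blue": ["So What"],
    "Giant Steps": [],
    "So What": [],
}

def related(start, max_depth):
    """Breadth-first traversal: every node within max_depth hops of start."""
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        node, depth = queue.popleft()
        if depth == max_depth:
            continue  # don't expand past the hop limit
        for neighbor in edges.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, depth + 1))
    return seen - {start}

print(sorted(related("Miles Davis", 2)))
# ['Giant Steps', 'John Coltrane', 'Kind of Blue', 'So What']
```

Whether this beats LevelDB plus in-memory indices in practice would depend on access patterns; the in-memory indices described above can serve the same traversals, which may be why a separate graph engine was never needed.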