RoonServer memory monitoring

I was reading this thread and thought I’d set up a simple monitoring job on a recent NUC build.

  • 8th gen i7 NUC
  • 16 GB RAM
  • Debian buster running the official realtime kernel and stripped of cruft using this guide
  • RoonServer installation running as root
  • that’s it, the box is headless and dedicated

Simple approach: a shell script running hourly as a cron job that appends to a naive log file (a sketch of the script follows the sample output below):

2020-08-12T02:00:01
PID       OWN            MEM            CMD
611       root           6644K          /bin/bash
622       root           417468K        /opt/RoonServer/RoonMono/bin/RoonServer
692       root           5587400K       /opt/RoonServer/RoonMono/bin/RoonAppliance
693       root           3692K          /opt/RoonServer/Server/processreaper
755       root           908672K        /opt/RoonServer/RoonMono/bin/RAATServer
18515     root                          grep
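
For the curious, the script is nothing clever. A minimal sketch of the sort of thing I mean; the exact ps fields and the crontab entry here are illustrative, not necessarily what I run:

#!/bin/sh
# Hourly snapshot of anything Roon-related, appended to a log via cron:
#   0 * * * * /usr/local/bin/roonmem.sh >> /var/log/roonmem.log
date +%Y-%m-%dT%H:%M:%S
printf '%-9s %-14s %-14s %s\n' PID OWN MEM CMD
# vsz is the virtual allocation in KB, hence the K suffix in the log
ps -eo pid=,user=,vsz=,args= | grep -i roon | \
    awk '{ printf "%-9s %-14s %-14s %s\n", $1, $2, $3 "K", $4 }'
echo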

When I’m awake I’m usually listening to something, so it gets some hammer. Now to watch and wait.


Roon emits memory usage to its log file every fifteen seconds… Granted, that’s just the RoonServer process and not RAAT. But do the numbers jibe with what your script is recording? (On my Roon Core, the numbers Roon records match what the OS says RoonAppliance is consuming. But I also haven’t observed runaway memory leaks, either.)

Also, your output suggests you are reporting the virtual allocation, not the actual physical memory used. On my Roon Core, the virtual allocation is ~10x the resident physical usage…

(FWIW, I run my Roon Core on a CentOS 7 system.)

Hi @cwichura and thanks for putting my lazy effort on the spot :wink:

My figures are indeed the virtual allocation and I see the log figures you mention:
08/12 13:27:10 Info: [stats] 5459mb Virtual, 1228mb Physical, 403mb Managed, 0 Handles, 72 Threads
The log number for Virtual tracks my RoonAppliance figure; virtual memory for RoonServer and RAATServer has been constant so far. I have a TODO or two in the script anyway, and I had been thinking of scraping the logs to piece together activity. Now I’m thinking that might be all I need. Expect a better effort before the weekend.
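
Since the format of those lines is fixed, scraping them back out is a one-liner. Assuming the default Linux data root of /var/roon (adjust if your install differs):

grep -h '\[stats\]' /var/roon/RoonServer/Logs/RoonServer_log*.txt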


So I’m going to write a better memory reporter this evening; I’m a good chunk of the way there already. There is already enough from the original to make some observations. Of the four Roon processes, three seem to be basically well behaved: RoonServer, processreaper, and RAATServer have the same virtual memory allocation as they did 36 hours ago. The same can’t be said for the RoonAppliance process, which started at 5575056K and is now at 6563156K. While the figure does fall back on occasion, there appears to be an upward trend.

That’s exactly my observation. What I haven’t been able to determine is exactly which activity on the server has the biggest impact… plain playback, following cross-referenced metadata, adding music to the library, etc.

I think there may be a garbage collector at work when the memory allocation falls back, but nevertheless, the longer it runs, the more memory is allocated. The other processes don’t vary by a byte…

I can confirm that it seems to lose a little size while I’m asleep, but that’s a pretty small window. While it’s in use there’s a gradual accumulation.

One thing I’ve noticed is that when playing from Qobuz, Roon downloads the entire track into memory and then frees it when it moves to the next track. So depending on the length of the track, you get ~100MB pendulum swings as each track plays. But it very clearly releases the memory for each track when it’s done playing it. (I suspect Roon probably loads the full track into memory no matter what the source is, even when coming from local storage.)

If it’s the garbage collector at work, you’d see the “Managed” number in the log output increase and then drop when the GC runs.
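
That’s easy to check from a log scrape. Going by the [stats] line quoted above, the date, time, and Managed figure are whitespace fields 1, 2, and 9 (same assumed /var/roon log path as earlier in the thread):

awk '/\[stats\]/ { print $1, $2, $9 }' /var/roon/RoonServer/Logs/RoonServer_log*.txt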

top -p `ps -ax |grep "RoonAppliance" | grep -v 'grep' |sed 's/^\s*\([0-9]\+\).*/\1/'`

will get you real-time monitoring of the memory usage of the RoonAppliance process. Or

top -p `ps -ax |grep "Roon" | grep -v 'grep' |sed ':x {s/\(\([0-9]\+,\?\)\+\).*/\1/; N; s/\s*\n\s*/,/; bx};'`

will do the same for all the Roon processes.
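
(If pgrep is available, which it is anywhere procps is installed, an equivalent without the grep/sed gymnastics would be:

top -p $(pgrep -d, -f '/opt/RoonServer')

since pgrep -d, joins the matching PIDs with commas, which is exactly the list format top -p expects.)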

P.S.: There’s absolutely no question that the RoonAppliance process leaks memory.

(Edit: Simplified a Regexp.)


Done the work to update the memory reporting and tidy the list of processes reported on; not done with the log scraping yet, but I’ll get some time soon. It now looks like this:

2020-08-14T05:26:43
PROC        TOT MB    RES MB    CMD
root:604    408       42        /opt/RoonServer/RoonMono/bin/RoonServer
root:623    6249      2771      /opt/RoonServer/RoonMono/bin/RoonAppliance
root:624    3         0         /opt/RoonServer/Server/processreaper
root:683    953       29        /opt/RoonServer/RoonMono/bin/RAATServer

@cwichura these are now identical to the log figures, but I’m parsing them from /proc/<pid>/statm rather than the log, as that gives independent verification. I’m pretty sure my pendulum’s not true.
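
For anyone who wants to replicate it, the core of the reporter is only a few lines. This is a stripped-down sketch rather than my full script, and it assumes the stock /opt/RoonServer install path:

#!/bin/sh
# /proc/<pid>/statm reports sizes in pages; work out the page size once
PAGE_KB=$(( $(getconf PAGESIZE) / 1024 ))
date +%Y-%m-%dT%H:%M:%S
printf '%-11s %-9s %-9s %s\n' PROC 'TOT MB' 'RES MB' CMD
for pid in $(pgrep -f '/opt/RoonServer'); do
    # first two statm fields: total program size and resident set, in pages
    read -r size resident _ < "/proc/$pid/statm" || continue
    owner=$(stat -c %U "/proc/$pid")
    cmd=$(tr '\0' ' ' < "/proc/$pid/cmdline" | cut -d' ' -f1)
    printf '%-11s %-9s %-9s %s\n' "$owner:$pid" \
        $(( size * PAGE_KB / 1024 )) $(( resident * PAGE_KB / 1024 )) "$cmd"
done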

@Jacques_Distler thanks for the input, I’ve already seen the figure tick up long enough to be fairly sure. I’m curious to see if it’s possible to narrow down the activities that cause it to happen.

Now to sleep for a few hours’ rest, for both me and RoonAppliance.

One advantage of watching the memory usage in real-time is that you can watch RoonAppliance gradually chew up more memory, even when it is purportedly idle (no music playing, no tracks being analyzed, …).


I get the impression I could get a fair way down this road and you’d be there waiting :wink:

Looking at the logs, there’s a fair bit of metadata munging and churning going on…

:slight_smile:

Replace “top” with “top -b -d 3600” in the above, and redirect the output to a file. You’ll get a snapshot of RoonAppliance’s memory usage once every hour. (Change “3600” to some other interval if you want more or less frequent measurements.)
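
Put together, something like this (the log file name is arbitrary, and -n 24 caps it at a day’s worth of samples so it doesn’t run indefinitely):

top -b -d 3600 -n 24 -p $(pgrep -d, -f RoonAppliance) >> roon-mem.log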
