How Big Is Your Library

That’s an architectural issue. Roon builds its database from what it’s been allowed to scan. Once scanned, it’ll find the stuff. It will not, however, traverse your network without being given permission to do so.


I’ve not had any luck with batch conversion at all. There is always something that needs to be post-edited, and I just find it impossible to keep track of a big batch. I find it easier to do it one by one in CUETools, but it will take forever, so I have given up. We just use several players.

Well, I’ll just take my 2 TB library and slink away…


Just add 1 TB a year and in 48 years you’ll be up to 50 TB!

I tend to do the split before I add to the library; slow, but it doesn’t mess stuff up.

CUETools works fine for me.

Just a pain if you’re trying to catch up.

Mike

There are pretty easy means of tagging using info in the cuesheet. What anomalies could that yield?

There is no one-to-one mapping between the CUE sheet data model and the Roon data model.
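To make that concrete, here is a minimal, hypothetical CUE sheet fragment. A CUE sheet only offers flat disc- and track-level fields like TITLE, PERFORMER, and SONGWRITER; there is nothing in it that can express Roon’s richer composer/work/part/artist hierarchy:

```text
PERFORMER "Karl Richter"
TITLE "Bach: Cantatas"
FILE "album.flac" WAVE
  TRACK 01 AUDIO
    TITLE "Wachet auf, ruft uns die Stimme"
    PERFORMER "Karl Richter"
    SONGWRITER "J. S. Bach"
    INDEX 01 00:00:00
```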

That is seriously more than impressive. What an amount of work it must have taken to rip, catalog, tag, etc. your whole collection. Do you care to give more details about your PC setup?
Hope you do frequent backups though 🙂

Thanks, I have 2 1/2 sets of backups. The half is the SHR Synology (similar to RAID5) for my main storage (actually a 12-bay Synology NAS with twelve 8 TB drives). The second is a bunch of 4 TB external HDs which are kept separately at home, and the third is a bunch of 8 TB HDs in two safe deposit boxes at the local bank. In addition, my vinyl and tape rips are mirrored on another Synology NAS drive, which doesn’t have the capacity for my other downloads.

I started the project a couple of years after I retired a decade ago. My wife brought up the subject of my record and tape collection — 15K records and 1K tapes (almost all classical) — and what would happen when I died. We decided on a legacy project for our daughter and her family (now including 2 grandkids). I decided that there was a core of about 10K records and tapes that I wanted to preserve (including very complete collections of Decca (London in the US) and EMI, RCA, Mercury, etc., almost all originals). Since everything would be recorded and ripped in real time, plus additional processing and cataloguing, I estimated it would take about 5 years (it turned out to take six).

I first went to see what the situation was for high-quality ripping. I found an excellent consultant, Tim Marutani, who worked with me to choose the equipment I would use. We worked with some top-flight recording and mastering engineers to pick the best available equipment, including doing a shootout in 2009 with the top equipment and software available at the time. I settled on a pro system centered on the Pacific Microsonics Model Two, which can rip at 192/24 (using dual wire), and the Merging Technologies Pyramix software and Mykerinos card. We added iZotope RX software for post-processing (declicking, etc.), which evolved during the project to RX3 Advanced. Art Kelm installed his Ground One system for my electronics, and I had Bottlehead build a custom balanced phono preamp with variable EQ to feed the Model Two. It was and is not a cheap system, but I would be spending 6 years doing this, so I didn’t want to do it more than once. I also bought a flatbed scanner (Plustek OpticPro A320) so I could scan the 10,000 album covers and backs, and most of the inserts, in hi-rez.

For playback, the Model Two is superb, but it is definitely not consumer friendly. It is a pro machine, and can do just about anything that a recording or mastering engineer would want to do — but not for a person who just wants to sit back and push the play button or tap an icon. My playback software and DAC evolved through the years as I incorporated mch files and DSD into my system. I first met Chris Connaker of Computer Audiophile a decade ago, and he has built two computers for me. The first was the fanless ripping computer in 2010, with Windows XP Pro, Pyramix software, and the Mykerinos card. It is still running today for the occasional rip. The second was in 2017, when I needed a high-powered computer to play mch files, upconverting 6 channels to DSD256 for HQP and my mch NADAC. Working with Jussi (Miska) of HQP, he built the PC I currently use for playback (prices from Spring of 2017):

Chris highly recommended Roon to work with HQP (I hadn’t used Roon before, and I am very happy with it). We ended up not using AO.

CPU $1,649.99 i7-6950X
MB $218.86 GA-X99M-Gaming 5 (rev. 1.1)
RAM $300.99 Savage Memory Black - 32GB Kit (4x8GB) - DDR4 2400MHz Intel XMP CL12 DIMM
Video $545.84 ROG STRIX-GTX1080-A8G-GAMING
PSU $199.99 Seasonic SSR-850TD PRIME 850 W Titanium
Disk $629.99 SSD 960 PRO NVMe M.2 MZ-V6P1T0BW
Case $137.66 Fractal Design Define R5
Cooler $89.93 Noctua NH D15
Fans $16.99 Phanteks 140mm Cooling Fan (PH-F140SP_BK)
OS $139.00 Microsoft Windows 10 Pro OEM (64-bit)
Apps $129.00 Audiophile Optimizer 2.0 beta 15 (using “strip down Windows 10” service tool option)
$120.00 Roon server
$150.78 HQPlayer (sends to NAA)


My goodness, Larry, what a huge project! Respect, lots of respect!

Can I be adopted? 😀


What tags do you typically have in your cue files?

There are a few others but the basic ones I have tended to use are:

PERFORMER
SONGWRITER
TITLE
TRACK

But I think we might be talking at cross-purposes. For me, getting a CUE into Roon is a four-stage end-to-end process with a lot of feedback loops and dependencies.

  1. Conversion (CUETools)
  2. ID3v2 TAG edits (mp3tag)
  3. Import (album identification) (roon)
  4. Merge (composition identification) (roon)

There are usually a lot of manual steps after (1), so personally I would rather just work off the CUEs one by one than work through another directory of split files one by one.
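As a rough illustration of what stage (1) has to carry over into stage (2), here is a minimal sketch, in Python, of pulling the flat per-track TITLE/PERFORMER/SONGWRITER fields out of a cue sheet. This is an assumption-laden sketch, not a full CUE parser (INDEX, FLAGS, REM, and multi-FILE sheets are ignored), and the sample sheet is hypothetical:

```python
import re

def parse_cue(text):
    """Collect the flat per-track tags (TITLE, PERFORMER, SONGWRITER)
    from a CUE sheet. Tracks inherit disc-level values they don't
    override. Sketch only: INDEX, FLAGS, REM, etc. are ignored."""
    album, tracks = {}, []
    current = album  # disc-level tags appear before the first TRACK line
    for line in text.splitlines():
        line = line.strip()
        m = re.match(r'TRACK (\d+) AUDIO', line)
        if m:
            current = {"TRACK": int(m.group(1))}
            tracks.append(current)
            continue
        m = re.match(r'(TITLE|PERFORMER|SONGWRITER) "(.*)"', line)
        if m:
            current[m.group(1)] = m.group(2)
    # Merge disc-level tags under each track's own tags.
    return [{**album, **t} for t in tracks]

SAMPLE = '''PERFORMER "Karl Richter"
TITLE "Bach: Cantatas"
FILE "album.flac" WAVE
  TRACK 01 AUDIO
    TITLE "Wachet auf, ruft uns die Stimme"
    SONGWRITER "J. S. Bach"
    INDEX 01 00:00:00'''

TRACKS = parse_cue(SAMPLE)
```

Anything that stage (1) drops here (SONGWRITER in particular) is exactly what has to be re-typed by hand in stage (2).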

When you split using CUETools, does it not write the tag metadata from the cue to the FLAC files?

Well, the short answer is that in the simplest case, of course, CUETools writes the tag metadata to the FLAC files.

The problem is that in many cases that simple default is not the best option in terms of work saved. There are often better options, but those have consequences. In particular, if you have a SONGWRITER tag, then in many conversion scenarios with CUETools neither composers nor artists will be preserved. The root of the problem is that CUE sheets have no concept of a COMPOSER. As a result, many people put the composer in the SONGWRITER field and the artists in the PERFORMER field. Many players, including JRiver, recognise this convention. But not all conversion processes respect the convention in reverse.

There are many examples, but a common one in Europe is that your composers, compositions, artists, etc. are tagged in the original language. I know from experience that Roon prefers anglicized versions, and album and composition identification will often fail without them. It is a lot of work to re-tag a complex multi-work, multi-part album (a Bach cantata originally tagged in German, for example), so what I normally prefer to do is a lookup in freedb or CTDB in CUETools prior to conversion. Often a version is found that saves all that work. However, now the SONGWRITER tag will be ignored, and it is hit and miss where the composer ends up. Worst case, CUETools overwrites all your performers with the composer in the ARTIST tag. Sometimes you end up with both performers and composers in the ARTIST tag. Either way you must re-edit, but it’s the best of a bad job, and better than the alternative of anglicizing all the composition titles.
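The JRiver-style convention described above can be written down as a tiny mapping. This is a hedged Python sketch of what a conversion that respects the convention would do; a converter that ignores SONGWRITER simply loses the COMPOSER line here (or, in the worst case described above, writes the composer into ARTIST instead):

```python
def cue_tags_to_vorbis(track):
    """Map CUE-convention fields onto Vorbis comments for a split FLAC:
    PERFORMER -> ARTIST, SONGWRITER -> COMPOSER (the convention many
    players, including JRiver, recognise). Sketch only."""
    tags = {}
    if "TITLE" in track:
        tags["TITLE"] = track["TITLE"]
    if "PERFORMER" in track:
        tags["ARTIST"] = track["PERFORMER"]
    if "SONGWRITER" in track:
        # Kept separate from ARTIST, so performers are not clobbered.
        tags["COMPOSER"] = track["SONGWRITER"]
    if "TRACK" in track:
        tags["TRACKNUMBER"] = str(track["TRACK"])
    return tags

EXAMPLE = cue_tags_to_vorbis({
    "TRACK": 1,
    "TITLE": "Wachet auf, ruft uns die Stimme",
    "PERFORMER": "Karl Richter",
    "SONGWRITER": "J. S. Bach",
})
```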

This is just one example to illustrate. Personally, I gave up trying to convert systematically some time ago.

Do you have a problem with scanning CD covers (for example) that have shiny or reflective sections? On my relatively cheapie HP Officejet Pro 8600, those areas come out black/grey in the resulting .jpg files (for example, Verdi’s Aida under James Levine on Sony; Wagner’s Das Rheingold under Levine on DGG; Poulenc’s Mass in G Minor + Motets under Robert Shaw on Telarc). I did some looking around for an answer, and all I found was a remark about “diffractive light” or something like that.

I haven’t found a simple solution. A very labor-intensive approach is to photograph the cover (Sony DSC-RX100), copy the resulting .jpg file to a desktop, and hack around with Corel Photo-Paint and the original HP scan of the cover until I get something presentable as a final .jpg file. It’s amusing to play with such a thing, but it’s not scalable.

Reading all that, I’m not surprised. If you have access to Linux, I could suggest a quick way to do all that, writing the tags just as they appear in the .cue files. You could do all the splitting and then ingest one album at a time. With a tag editor like Puddletag you can easily copy and paste values from one field to another, do search and replace, or even highlight as many rows in a field as you want to change, make the change once, and have it cascade across all the files you’ve selected. It might make the job easier for you, but it runs on Linux or Mac only.
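For what it’s worth, the usual Linux route is shnsplit (from shntool) to cut the image and cuetag.sh (from the cuetools package) to tag the pieces. If you would rather control the tagging yourself, a hedged sketch like the following can generate `metaflac --set-tag` command lines from already-parsed per-track CUE fields. The filename template and field mapping here are hypothetical conventions, not something any of these tools mandates:

```python
import shlex

def metaflac_commands(tracks, template="{n:02d} - {title}.flac"):
    """Emit one `metaflac --set-tag` command line per split track,
    writing the tags just as they appear in the .cue. `tracks` is a
    list of dicts with TRACK/TITLE/PERFORMER/SONGWRITER keys; the
    filename template is a hypothetical naming convention."""
    cmds = []
    for t in tracks:
        path = template.format(n=t["TRACK"], title=t.get("TITLE", "Unknown"))
        args = ["metaflac"]
        for cue_field, flac_field in [("TITLE", "TITLE"),
                                      ("PERFORMER", "ARTIST"),
                                      ("SONGWRITER", "COMPOSER")]:
            if cue_field in t:
                args.append(f"--set-tag={flac_field}={t[cue_field]}")
        args.append(f"--set-tag=TRACKNUMBER={t['TRACK']}")
        args.append(path)
        # shlex.quote keeps spaces in titles/paths shell-safe.
        cmds.append(" ".join(shlex.quote(a) for a in args))
    return cmds

CMDS = metaflac_commands([{
    "TRACK": 1,
    "TITLE": "Wachet auf",
    "PERFORMER": "Karl Richter",
    "SONGWRITER": "J. S. Bach",
}])
```

You would pipe the resulting lines into a shell (or run them via subprocess) after splitting, then tidy up the remaining fields in Puddletag.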

Enough from me on this topic.

I don’t scan CD covers, only record covers and some tape boxes. So far I’ve found that Roon does whatever it does to find cover art, either from the metadata in my FLAC rips of my CDs or from its database. Only about 10% of my albums are CDs. I do have some problems with very dark record album covers, but I put pieces of blank white paper above and below the covers and then crop the excess white area after the scan.

I have 25,602 albums and 253,140 tracks in my library, spread over 4 drives in my dual Xeon E5-2384V4 (16-core/32-thread) dedicated Roon Core server.

My only real frustration with Roon is the lack of MQA decoding.

Andrew

As an experiment, I just now scanned the back of a CD cover into a .pdf file, and I placed the file in the directory containing the music. In Roon, I now see “1 PDF.” If I click on that (in Roon), Roon causes the .pdf to be shown in Microsoft Edge. In Edge, I can zoom, and I can easily read the file (scanned at 600 dpi) even after some zooming.
