Will adding and removing a lot of duplicates leave a lot of debris in the database?

An oddball question; the answer will probably interest nobody else:

When I moved from my desktop to the newly built NUC, I took the opportunity to “tidy up” my content library, which had accumulated over the years with a messy directory structure, some of it exported from Sooloos, some not.

Well, the “tidying up” means I know I no longer have all of the old content; the album count is lower. I have done some manual checking and filled in holes, but it is a lot of work. So I came up with this idea:

  1. Copy the old desktop directory to a separate directory on the NUC
  2. Make Roon watch this directory
  3. This will create a lot of duplicates; what I want is to find the non-duplicates. So I focus on the new directory and filter to non-duplicates, which should number about 100 based on the album-count disparity
  4. Salvage those files over to the main directory, and clean up the directory structure at the same time
  5. Unwatch and delete the extra directory
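As a side note, step 3 (finding the files that exist only in the old directory) can also be done outside Roon by comparing file contents directly. This is a minimal sketch, assuming exact byte-for-byte duplicates; the directory paths are placeholders, and real audio files with re-encoded or retagged copies would not match this way.

```python
import hashlib
from pathlib import Path

def file_hashes(root):
    """Map content hash -> list of file paths under root."""
    hashes = {}
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            hashes.setdefault(digest, []).append(path)
    return hashes

def only_in_old(old_root, new_root):
    """Return files under old_root whose content appears nowhere under new_root."""
    new = file_hashes(new_root)
    return [p
            for digest, paths in file_hashes(old_root).items()
            if digest not in new
            for p in paths]
```

Pointing `only_in_old()` at the copied desktop directory and the tidied main directory would list the roughly 100 files worth salvaging, without waiting for Roon's duplicate detection.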

Given that Roon has added more advanced content recognition, will this process gum up the database with all those deleted duplicates (about 1400 albums)? Is that a problem in practice? I could snapshot the database before the whole process and restore it afterwards.

This is what the “Clean Up Library” button is for. It’s there for a reason…

Really? It will make Roon forget the data it uses to recognize albums when they show up in a new place? @mike ?

Correct – it removes the orphaned signatures from the database. Just be aware you’ll lose any edits/play counts/favorites/etc associated with those files.

Great. Thanks @mike and @Ludwig.
It’s a heavyweight approach, but I can’t think of a smarter way.
The price of messing up your library…