Mp3tag is wonderful for small to medium numbers of files, but at the magnitude of thousands it becomes rather unwieldy. Sometimes I want to do a batch action on all my files (>10000 tracks), say, backing up one field to another, and it takes a lot of time and attention on my part:
First, I can't feed all the files into Mp3tag at once, mostly for fear that any glitch would waste all the time spent waiting for the tag scan and make it hard to tell which files were processed and which were not, but also because of the huge memory consumption, which inevitably increases the chance of glitches.
Feeding files in smaller chunks has its own problems. Very small chunks, say 200 files at a time, work fine by themselves, but then I have to carefully keep track of which files were processed and which were not, and the number of clicks and keypresses gets very annoying and means a lot of work for me. Larger chunks (say, 1000) sacrifice efficiency, since I have to wait forever for the initial tag scan and keep paying attention to the progress.
So much for my complaints. I'm sure I'm not the only one with a large music collection, nor the only one who runs into Mp3tag's performance issues with large lists. I wonder if there could be a better way to deal with them?
Batch files with command-line options could work very well, but since command-line options are not part of the future development plan...
Maybe there could be a batch function that takes a list of filenames and an action/action group, so that the user can let the batch run in the background. Scan each file for tags only right before processing it and discard them afterward, so there hopefully won't be a memory hit. Write a log file so that the user can stop at any time and pick up the progress later on.
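To make the idea concrete, here's a rough Python sketch of the log-and-resume part. Everything here is hypothetical, not anything Mp3tag actually provides: `run_batch`, the one-path-per-line log format, and the stand-in action (a real tool would call a tag-editing library there).

```python
import os
import tempfile

def run_batch(file_list, action, log_path):
    """Process files one at a time, skipping those already logged,
    and append each finished file to the log so an interrupted run
    can be resumed later."""
    done = set()
    if os.path.exists(log_path):
        with open(log_path) as log:
            done = {line.rstrip("\n") for line in log}
    with open(log_path, "a") as log:
        for path in file_list:
            if path in done:
                continue  # already processed in an earlier run
            action(path)           # e.g. copy one tag field to another
            log.write(path + "\n")
            log.flush()            # so a crash mid-run loses at most one file
            done.add(path)

# Demo with a stand-in action that just records which files it saw.
processed = []
files = ["a.mp3", "b.mp3", "c.mp3"]
with tempfile.TemporaryDirectory() as tmp:
    log_file = os.path.join(tmp, "batch.log")
    # Pretend an earlier run stopped after a.mp3:
    with open(log_file, "w") as f:
        f.write("a.mp3\n")
    run_batch(files, processed.append, log_file)

print(processed)  # ['b.mp3', 'c.mp3'] -- only the files not yet logged
```

Since each file's tags are read just before the action and dropped right after, memory stays flat no matter how long the list is, and the log makes the whole run restartable.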
Just my 2 cents...