Improvement suggestion: optimization for large # of files


Mp3tag is wonderful for a small or medium number of files, but at the magnitude of thousands, Mp3tag becomes rather unwieldy. Sometimes I want to do a batch action for all my files (>10,000 tracks), say, backing up one field to another, and it takes a lot of time and attention on my part:

First, I can't feed all the files into Mp3tag at once. Mostly this is for fear of glitches: any hiccup wastes the whole wait for the tag scan, and it's hard to find out which files were processed and which were not. But it's also the huge memory consumption, which inevitably increases the chance of glitches.

Feeding files in smaller chunks has its own problems as well. Very small chunks, say 200 files at a time, work fine by themselves, but then I have to carefully keep track of what was processed and what wasn't, and the number of clicks and keypresses becomes very annoying and means a lot of work for me. Feeding in larger chunks (say, 1,000) sacrifices efficiency, since I have to wait forever for the initial tag scan and keep paying attention to the progress.

So much for my complaints. I'm sure I'm not the only one with a large music collection, nor the only one who has run into Mp3tag's performance problems with large lists. I wonder if there is a better way to deal with large lists?

Batch files with command-line options could work very well, but since command-line options are not part of the future development plan...

Maybe there could be a batch function that takes a list of filenames and an action/action group, so that the user can let the batch run in the background. Scan each file for tags only right before processing it, and discard the tags afterward, so there hopefully won't be a memory hit. Write a log file so that the user can stop at any time and pick up the progress later on.
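To illustrate the idea (this is just a sketch of the proposal, not anything Mp3tag provides — the function and file names are made up), a resumable batch runner could look like this: it streams the file list, applies one action per file, and appends each finished path to a log so an interrupted run can be resumed without redoing work.

```python
import os

def run_batch(list_file, action, log_file):
    """Apply `action` to every path listed in `list_file`, one file at a time.

    Paths already recorded in `log_file` are skipped, so an interrupted
    run can be resumed later. Tag data would only live as long as one
    file is being processed, so memory use stays flat.
    """
    done = set()
    if os.path.exists(log_file):
        with open(log_file, encoding="utf-8") as f:
            done = {line.rstrip("\n") for line in f}

    with open(list_file, encoding="utf-8") as todo, \
         open(log_file, "a", encoding="utf-8") as log:
        for line in todo:
            path = line.rstrip("\n")
            if not path or path in done:
                continue
            action(path)           # e.g. copy one tag field to another
            log.write(path + "\n")
            log.flush()            # make progress survive a crash mid-run
```

The key design point is that nothing is held for the whole collection except the set of already-done paths, so memory stays roughly constant no matter how many files are queued.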

Just my 2 cents... :sunglasses:


Are you running this on a 286 or something? :slight_smile: I have close to 16,000 MP3s and I load them all in at once (takes about 5 minutes) do whatever editing I need to, then save them (takes about 2 or 3 minutes, tops.) Exporting them is a b*tch, as it takes about 10 minutes... but I just go to the bathroom or something and come back.

What 'glitches' are you referring to? To be honest, I find MP3TAG to be AMAZINGLY fast, compared to the amount of time Winamp, iTunes or Amarok/Rhythmbox takes to initially scan my directory. I've never had a problem.


I also think Mp3tag is acceptably fast when it initially loads a list of 10,000 files.

However, it is incredibly slow when I want to export data that are all already displayed on screen. Not only slow, but also CPU-intensive, so I usually decrease its CPU priority to "Below Normal" so that I can continue doing other stuff while the export does its thing. <_<

It's even worse if the files reside on a network share, because all the files are read once again, generating network traffic just to obtain the same data a second time. This also makes it a pain when I want to verify that my active MP3 directory is a synced subset of my complete/backup MP3 directory: I have to run the Mp3tag export locally on each PC to get it finished "today"... :huh:


To sync a directory (with MP3s or whatever) with another directory, just use a nice sync tool like ROBOCOPY with the /MIR switch, or RSYNC. They do all the nice little things like comparing date/time, size, last modified or whatever you need.


I agree...
That would be a good improvement.



The only thing I noticed about loading 10,000+ files is that it takes a LOT of RAM. After editing this many files I looked at my usage and it was just under a gig of memory being used. This could be attributed in some way to how Vista handles memory, but I would like to see better resource management when editing a large # of files.


My files can have exactly the same date and size and still be different, because Mp3tag is kind enough to let me update ID3 tags within their padding (e.g. 2 KB of padding, so the file size stays the same — which is a performance trick in Mp3tag, by the way) and without changing the file date. That makes it a perfect companion to the 1by1 player, where I usually sort my files by date.

Just "save what you show", so "I get what I see" :stuck_out_tongue:
And I'm looking at MP3 tags, not file statistics...
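Since date and size can match while the tag contents differ, a subset check really has to look at the bytes themselves. A minimal sketch of that idea (my own helper names, nothing from Mp3tag — it hashes file contents, which is slower than a date/size compare but catches in-place tag edits):

```python
import hashlib
import os

def file_digest(path, chunk_size=1 << 20):
    """SHA-256 of a file's contents, read in chunks to keep memory flat."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

def missing_or_changed(active_dir, backup_dir):
    """Relative paths under `active_dir` that are absent from `backup_dir`
    or whose contents differ — regardless of matching date and size."""
    diffs = []
    for root, _dirs, files in os.walk(active_dir):
        for name in files:
            src = os.path.join(root, name)
            rel = os.path.relpath(src, active_dir)
            dst = os.path.join(backup_dir, rel)
            if not os.path.exists(dst) or file_digest(src) != file_digest(dst):
                diffs.append(rel)
    return diffs
```

Over a network share this still reads every file once, of course; the gain is only correctness, not speed.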


Yes, it takes only about 20 minutes to load my ~30,000 files, but it eats up tons of RAM, and honestly my Windows response time is SO slow that I really can't do anything at all, never mind editing the tags and saving them! I have 1 GB + 256 MB of RAM, which is by no means small. See my attachment: look specifically at the virtual memory (VM) column, notice that Mp3tag is unresponsive on screen as well, and see how huge the RAM consumption is.

I totally understand the RAM consumption (where else can that data go?), but it is totally inefficient when handling such a huge number of files. That's why I wonder if there can be a better solution. I don't use my computer for tagging only, and when I need to do another task I am obliged to terminate Mp3tag forcefully, since my computer is unresponsive due to RAM overload. Changing tags, which is the whole point of having Mp3tag scan my files, cannot be accomplished because it takes forever AND prevents me from using my computer normally.

And speaking of glitches, my computer threw a "delayed write" error for a music file (which had been, was being, or would be read by Mp3tag, as it was part of the folder being scanned) on my external drive. And voilà, I could no longer access the files on the drive and had to disconnect and reconnect it. Probably not Mp3tag's fault, but in a nearly unresponsive environment anything can happen.


I have been using Mp3tag with 15,000 files at a time on 1 GB of RAM, and I have been very impressed with the performance! But I guess there is a limit to everything, which in this case seems to be closer to 30,000 files...

But why not take the easy way and load your files in two batches, if you know your system chokes on everything at once?