SUGGESTION: Calculate the total length of files in accordance with reality

I ran a simple experiment concerning the way Mp3tag adds up the lengths of files.


I created a WAV file that was 44.1 kHz stereo 16-bit, made versions of it that differed only in length, then multiplied each one 60 times and loaded them all into Mp3tag. The results were:

A] 1.000 seconds long file x 60 = 60 seconds in Mp3tag
B] 1.001 seconds long file x 60 = 60 seconds in Mp3tag
C] 1.499 seconds long file x 60 = 120 seconds in Mp3tag
D] 1.501 seconds long file x 60 = 120 seconds in Mp3tag

This might seem unimportant, but not when you are working with something like hundreds of 4-6 second files that, stacked together, make up a playlist. Then you get a result that is off by whole minutes, which is plain wrong and misleading.


That little experiment of mine was conducted in Mp3tag 2.95 on Windows 7 x64.
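For anyone who wants to reproduce this, here is a rough sketch of how such test files can be generated with Python's standard wave module (the silent content and the file names are just placeholders; any 44.1 kHz stereo 16-bit audio of the right length would do):

```python
import wave

def write_test_wav(path, seconds, rate=44100):
    """Write a silent 16-bit stereo WAV of the given length (hypothetical helper)."""
    frames = round(seconds * rate)              # length is rounded to whole frames
    with wave.open(path, "wb") as w:
        w.setnchannels(2)                       # stereo
        w.setsampwidth(2)                       # 16-bit samples
        w.setframerate(rate)                    # 44.1 kHz
        w.writeframes(b"\x00" * (frames * 4))   # 2 bytes x 2 channels per frame

# One file per test case; duplicate each 60 times before loading into Mp3tag.
for secs in (1.000, 1.001, 1.499, 1.501):
    write_test_wav(f"test_{secs:.3f}s.wav", secs)
```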

Is this calculation correct?
Isn't that extra bit in case B just a thousandth of a second? So you would need 1,000 files to be 1 second off?!
So I wonder how you end up off by whole minutes.

I would think that you would need 60,000 files for a deviation of 1 minute.
Listening to 60,000 files with a length of 6 seconds would then lead to 360,000 seconds of listening time, or
100 hours,
or roughly 4 days.
I think that after that time the deviation is negligible.
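A quick back-of-the-envelope check in Python, using only the assumed numbers from above:

```python
files = 60_000     # files needed for ~1 minute of total deviation
extra = 0.001      # the 1 ms "extra bit" per file, as in case B
length = 6.0       # seconds per file

print(files * extra)           # 60.0  -> total deviation of 1 minute
print(files * length / 3600)   # 100.0 -> hours of listening time
```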

Are you sure of this calculation result? :wink:

I did some testing in v2.97. It seems that Mp3tag rounds off each file separately down to the nearest second before adding up the total time, which means that the total rounding error will increase with the number of files.

The error could be nearly 1 second for each file, which means that those 60,000 files could have an error of 60,000 seconds (> 16 hours) in theory.
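A minimal sketch of how that would accumulate, assuming Mp3tag really does floor each file's length before summing (the 1.499 s value matches case C from the first post; 1.999 s is the theoretical worst case):

```python
import math

def flooring_total(lengths):
    # Assumed behaviour: each file's length is rounded down to whole seconds first.
    return sum(math.floor(t) for t in lengths)

case_c = [1.499] * 60
print(flooring_total(case_c), sum(case_c))   # 60 vs ~89.94 -> about 30 s lost

worst = [1.999] * 60_000                     # nearly 1 s lost per file
print(sum(worst) - flooring_total(worst))    # ~59,940 s, i.e. more than 16 hours
```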

As the previous poster pointed out, those calculations are probably incorrect: 60 × 1.5 = 90, but when I tested, Mp3tag showed 60 seconds, not 120. Still a rounding error, though.

But personally, I don't see a big problem. In my library most tracks are several minutes long, so a 1-second rounding error per track (the worst case) is negligible.

The problem seems to be very short files, where naturally the deviation is biggest as a percentage of the total length.
But do other programs do it better?
I took the value displayed by Mp3tag, by Foobar in the list view, the raw Foobar value from the properties sheet, and the one from Windows Explorer.

Here is a selection:

File  Mp3tag  Foobar  Foobar raw                    Explorer
1     00:02   00:02   0:02.250 (99 207 samples)     00:02
2     00:04   00:04   0:04.213 (185 795 samples)    00:04
3     00:03   00:03   0:02.913 (128 467 samples)    00:03
4     00:03   00:03   0:02.740 (120 844 samples)    00:02

In total, the calculation is 116 thousandths of a second off the real value.
It looks as though Mp3tag does not always round down to the last whole second.
Actually, the Mp3tag data matches that of the other two programs fairly accurately.
Except ... that Windows Explorer shows 1 second less for the last file.
So I am pretty sure that the more files you have, the closer the result will be to the actual total time, as the errors will level out on average.
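For anyone who wants to check, the raw sample counts convert straight back to those figures at 44,100 samples per second:

```python
samples = [99_207, 185_795, 128_467, 120_844]   # from the Foobar properties sheet
rate = 44_100

exact = [n / rate for n in samples]
print([round(t, 3) for t in exact])   # [2.25, 4.213, 2.913, 2.74]
displayed = [2, 4, 3, 3]              # values shown by Mp3tag and Foobar
print(sum(exact) - sum(displayed))    # ~0.116 s, the 116 thousandths from above
```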

So the original statement that the result is "plain wrong and misleading" is not quite right, as apparently other programs do exactly the same thing.

Strange: when I tested with a bunch of short files, Mp3tag always rounded down.

Out of curiosity, I tried to match your files 3 and 4, except mine came out slightly longer.

Details in Foobar:

0:02.915 (128 542 samples)

0:02.740 (120 851 samples)

In spite of them being slightly longer than your 3-second files (3 and 4), Mp3tag nevertheless displayed them as only 00:02.
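Converting both sets of sample counts back to seconds shows how close the files really are, which makes the different rounding even stranger:

```python
rate = 44_100
mine = [128_542, 120_851]    # displayed as 00:02 in Mp3tag
yours = [128_467, 120_844]   # displayed as 00:03 in Mp3tag

for n in mine + yours:
    print(n, round(n / rate, 3))   # 2.915, 2.74, 2.913, 2.74
```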

I have no clue why your results differ from mine, but in reality it doesn't matter. Just odd.

Mine were all CBR; I don't know if that matters.

Possibly. I used WAV files.

Edit / update:

I converted the files from my previous post to compare file formats. For the nearly 3-second files, the FLAC and MP3 (CBR & VBR) versions were rounded up in Mp3tag, but the WAV files were rounded down. In Foobar, though, all file formats were rounded up.

A bit interesting, but nothing I would worry about.