WMA v WMA Lossless

Is there any way of inspecting the codec details to distinguish WMA from WMA Lossless?

%codec% unfortunately reports both as Windows Media Audio.

What does %_bitrate% say?

DD.20140917.0939.CEST

A number denoting the bitrate, from which the codec cannot be reliably inferred. E.g. encoded silence has a low bitrate regardless of codec.

I had hoped we could find a telltale difference between the constant and variable bit rates, for example constant 32, 48, 64, 96, 128 kbit/s versus variable 241, 309, 674, 933 kbit/s.
The numbers in the first series are always zero modulo 16.
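The mod-16 observation can be turned into a quick check, though only as a sketch over the sample values quoted above, not a safe detector:

```python
def looks_like_cbr(bitrate_kbps: int) -> bool:
    """Crude heuristic from the examples above: the quoted CBR bitrates
    (32, 48, 64, 96, 128 kbit/s) are all multiples of 16, while the
    quoted VBR/lossless rates (241, 309, 674, 933 kbit/s) are not."""
    return bitrate_kbps % 16 == 0

cbr_samples = [32, 48, 64, 96, 128]
vbr_samples = [241, 309, 674, 933]

print(all(looks_like_cbr(b) for b in cbr_samples))      # True
print(not any(looks_like_cbr(b) for b in vbr_samples))  # True
# ...but it is not reliable in general: a lossless track could still
# happen to report e.g. 320 kbit/s, which is also a multiple of 16.
```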

Do you have control over the encoding procedure?
If so, persuade the encoder to write its settings into a tag field, e.g. ENCODERSETTINGS.

I have some WMA files where I did so.
When encoding with LAME, I store the complete command line in ENCODERSETTINGS.
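For MP3 files, such a field ends up as a user-defined text frame (TXXX) in the ID3v2 tag; taggers like Mp3tag build it for you. Purely as an illustration of what that frame looks like on the wire, here is a minimal hand-built ID3v2.4 TXXX frame (the command-line value is a made-up example):

```python
def synchsafe(n: int) -> bytes:
    """ID3v2.4 frame sizes are 'synchsafe': 4 bytes of 7 bits each."""
    return bytes((n >> shift) & 0x7F for shift in (21, 14, 7, 0))

def build_txxx(desc: str, value: str) -> bytes:
    """Build a bare ID3v2.4 TXXX (user-defined text) frame.
    0x03 selects UTF-8 text encoding; the description and the value
    are separated by a NUL terminator."""
    body = b"\x03" + desc.encode("utf-8") + b"\x00" + value.encode("utf-8")
    # frame ID + synchsafe size + two flag bytes + body
    return b"TXXX" + synchsafe(len(body)) + b"\x00\x00" + body

frame = build_txxx("ENCODERSETTINGS", "lame -V2 in.wav out.mp3")
```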

DD.20140917.1636.CEST

Understood, but that's just the kind of uncertainty I am working to avoid, especially having just discovered that MediaMonkey's bitrate inspection has arithmetic errors that give false results.

Yes, but I do not want to re-encode those 22,000 tracks :slight_smile:

I am attempting to get Mp3tag to do that! :slight_smile: Or to make it unnecessary by reading the codec details directly.

You are right. FB2K can show such technical values; Mp3tag should be able to do it too.
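For what it's worth, the distinction is already in the file: the ASF Stream Properties Object embeds a WAVEFORMATEX structure whose 16-bit format tag identifies the codec (0x0161 is WMA Standard, 0x0162 WMA 9 Professional, 0x0163 WMA 9 Lossless). A sketch of mapping that tag to a name follows; the actual ASF parsing is omitted, and in practice a tag library such as mutagen exposes this for real files:

```python
# Registered Microsoft WAVE format tags for the WMA family
# (the wFormatTag field of the WAVEFORMATEX structure carried
# inside the ASF Stream Properties Object).
WMA_FORMAT_TAGS = {
    0x0160: "Windows Media Audio v1",
    0x0161: "Windows Media Audio (Standard, v7/8/9)",
    0x0162: "Windows Media Audio 9 Professional",
    0x0163: "Windows Media Audio 9 Lossless",
}

def wma_codec_name(format_tag: int) -> str:
    """Map a WAVEFORMATEX format tag to a human-readable codec name."""
    return WMA_FORMAT_TAGS.get(format_tag, f"unknown (0x{format_tag:04X})")

print(wma_codec_name(0x0163))  # Windows Media Audio 9 Lossless
```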

DD.20140917.1827.CEST

I made a feature request here: /t/16151/1.

Please contribute any thoughts.