Data fade countermeasures: does copying and opening data from an HDD / SSD make that data more secure by refreshing it?

Regarding countering data fade, I finally found time to try out a proper piece of software - and I also have a piece of hardware that I do not care about, so I can run various tests on it.

The first part of the above equation is freeware [at least for home usage] called Disk Fresh [by Puran Software] - while the second one is my old slow 8 TB HDD that I will be selling soon. The help file of Disk Fresh says that HDDs should be scanned and rewritten - while SSDs should only be scanned, so that they do not wear out unnecessarily. And that is when it hit me:

if rewriting the data requires a scanning process first - then does the simple act of copying such data refresh it in the same way, thus making the use of such software unnecessary?

As I said, I will be selling that HDD, but I want to retain the data it currently holds - so I have already copied all of that data to another drive. So if, after all, I were to keep that HDD instead of selling it, would scanning it now with software like Disk Fresh be totally pointless, because I already "scanned" it during the copying process? If yes - then is reading all of the data [i.e. all files] on some drive also equivalent to such a professional anti-data-fade scan?
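To make clear what I mean by "reading all of the data": something like this minimal Python sketch [the path is just a placeholder], which walks a drive and reads every byte of every file - so the disk has to return all the stored data, although [as far as I understand] nothing gets rewritten:

```python
import os

def read_all_bytes(root, chunk_size=1024 * 1024):
    """Walk a directory tree and read every file in full,
    so the drive returns each stored byte at least once."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    # Read in 1 MiB chunks until end of file.
                    while chunk := f.read(chunk_size):
                        total += len(chunk)
            except OSError:
                pass  # skip files that cannot be opened
    return total

print(read_all_bytes("D:/"))  # hypothetical drive letter
```

[One caveat I am aware of: the operating system may serve some of this from its cache rather than from the physical disk.]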

A real-life example:

Let's say I have an SSD that holds all of my music, which I listen to every day, both in randomly generated playlists and in some organized / systematic way. Would such reading of each and every file at some point in time [at least once every couple of months, when it finally gets sent to the audio player either randomly or by hand] do the trick for the tiny part of the drive holding those audio files? And to be very precise: depending on the audio format / player / plugin, it might be that such a file would have to actually be played from its very beginning to its end, and it would also need to have all of its tag fields [i.e. its whole metadata] read - so that every single byte of the file gets accessed, and thus the playback counts as a proxy anti-data-fade scan? [In other words: loading all of those files into e.g. Mp3tag would not count as a 100% scan, because Mp3tag reads only the metadata? And just loading all of those files into Winamp would refresh only a small part of the stored data?]

And one last logical question: do NVMe drives also succumb to data fade - and should they be treated like SSDs in this regard?

And also: I am eager to start using Disk Fresh - but at the same time I am afraid that I could somehow destroy terabytes of my data without realizing it until months or even years later. Because there is no way I will verify by some other means all of the files that get rewritten by software of this kind, right? Or am I wrong, because I can simply save a checksum of the whole drive or [to make it easier] of one folder at a time - and then compare that pre-test checksum with a checksum generated after Disk Fresh has finished rewriting? But then again: I would need a second drive hosting a copy of the data that is about to be rewritten [in case it gets damaged] - which would not only make the whole scanning and rewriting totally unnecessary, but on top of that would require twice the amount of archive storage space
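Here is roughly how I imagine that checksum comparison would work - a minimal Python sketch [the path is hypothetical], recording a per-file SHA-256 digest before the rewrite and comparing it against a second run afterwards:

```python
import hashlib
import os

def folder_checksums(root):
    """Return {relative path: SHA-256 hex digest} for every file under root."""
    sums = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            h = hashlib.sha256()
            with open(path, "rb") as f:
                # Hash the file in 1 MiB chunks to keep memory use flat.
                for chunk in iter(lambda: f.read(1024 * 1024), b""):
                    h.update(chunk)
            sums[os.path.relpath(path, root)] = h.hexdigest()
    return sums

# before = folder_checksums("D:/Music")   # run before Disk Fresh
# ...let Disk Fresh rewrite the drive...
# after = folder_checksums("D:/Music")    # run afterwards
# changed = {p for p in before if before[p] != after.get(p)}
```

[Though I realize checksums would only detect damage after the fact, not repair it.]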