I have a client that distributes large binary files internally. They also transmit MD5 checksums of the files and, as part of their workflow, apparently check each file against its checksum before using it.
However, they claim that "often" they encounter corrupted files for which the MD5 checksum still says the file is good.
All that I read suggests that this should be extremely unlikely.
Is that plausible? Would another hashing algorithm provide better results? Or do I really need to look at process problems, e.g. they claim to verify the checksum but don't actually do it?
NB: I do not yet know what "often" means in this context. They process hundreds of files per day, and I do not know whether this is a daily, monthly, or annual event.
md5 checksum
Gareth simpson