Compressed output differs between Go and Ruby implementations
I'm implementing a program that deflates a file into a git blob and stores it appropriately.
I'm attempting to implement this in Go here.
However, I'm running into an issue where the stored compressed data differs slightly with each implementation.
vbindiff shows that the first 2 bytes are identical (as run from this test script), if I'm reading this right. These bytes store the compression method and flags (CMF) and the flags (FLG), respectively (per https://tools.ietf.org/html/rfc1950). The third byte is where the difference begins; depending on the FLG byte this is either the start of the dictionary ID or the start of the compressed deflate data. The data remains similar until near the end of the file, where I'm assuming the difference is the ADLER32 checksum.
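For reference, the header fields described above can be decoded with a few bit operations. This is just a sketch: the bytes `0x78 0x9c` are the common "deflate, default compression" header, not necessarily what either program actually emitted.

```go
package main

import "fmt"

func main() {
	// Hypothetical first two bytes of a zlib stream (0x78 0x9c is the
	// common "deflate, default compression" header).
	cmf, flg := byte(0x78), byte(0x9c)

	fmt.Println("compression method:", cmf&0x0f) // 8 = deflate
	fmt.Println("window size: 2 ^", (cmf>>4)+8)  // CINFO field
	fmt.Println("FDICT set:", flg&0x20 != 0)     // preset dictionary present?

	// RFC 1950: (CMF*256 + FLG) must be a multiple of 31.
	fmt.Println("header check ok:", (uint16(cmf)*256+uint16(flg))%31 == 0)
}
```

If FDICT is unset, the compressed deflate data starts immediately at the third byte, which would explain where the two outputs begin to diverge.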
Once decompressed, the data appears identical.
I'm not sure if there's an implementation error in the libraries or if I'm just missing something.
Why are these outputs different?
The deflate algorithm defined in RFC 1951 (used in the zlib format of RFC 1950 and in gzip, RFC 1952) allows variation in how an implementation compresses, which can lead to different compressed output for the same input. All of these outputs still decompress to the same value. This leeway lets implementations trade compression time against compression ratio, and it also makes programs like zopfli possible, which achieve better compression than the original zlib library at the cost of significantly longer compression time.
Go uses its own implementation of the deflate algorithm, written in Go, while Ruby uses the zlib C library. This is why your examples create different compressed output for the same input. But if you take the output of either the Go or the Ruby program and decompress it (no matter whether with Ruby, Go, or any other standard-conforming implementation), the result will be exactly the same value.