Our company receives large amounts of JSON data, and we used Protocol Buffers (Protobuf) to persist it efficiently on disk. This came with a new challenge: Protobuf requires a schema definition, which has to be maintained. At first, we wrote the definitions by hand. Then we discovered the gob package from the Go standard library.
Gob is a very easy-to-use binary encoder. You pass in ordinary Go values and, in return, get the binary encoding of those values, with no separate schema to maintain.
An important factor for our company is the read/write speed of serialization. To measure it, I created benchmark tests comparing Protobuf reads and writes against gob, which I'd like to share: https://github.com/ndabAP/proto-vs-gob-bench
BenchmarkGobWrite-8         121    9679732 ns/op
BenchmarkGobRead-8           60   19264177 ns/op
BenchmarkProtobufWrite-8     13   88254390 ns/op
BenchmarkProtobufRead-8      26   46792472 ns/op