Why does a binary serialization of a float use 55 bytes?


I have a task to take millions of floats and store them in the database in batches of 5,000, as binary. This is forcing me to learn interesting things about serialization performance.

One of the things that surprises me is the size of the serialized data, which is roughly ten times what I expected. The test below shows that a four-byte float serializes to 55 bytes and an eight-byte double to 59 bytes.

What is happening here? I expected it to simply split the float value into its four bytes. What are the other 51 bytes?

private void SerializeFloat()
{
    Random rnd = new Random();
    IFormatter iFormatter = new BinaryFormatter();

    using (MemoryStream memoryStream = new MemoryStream(10000000))
    {
        memoryStream.Capacity = 0;
        iFormatter.Serialize(memoryStream, (Single)rnd.NextDouble());
        iFormatter.Serialize(memoryStream, rnd.NextDouble());
    }
}

Binary serialization is type-safe: it ensures that when you deserialize the data, you get the exact same object back.

To make that work, BinaryFormatter adds metadata describing the types of the objects you serialize. That metadata is the extra overhead you are seeing. You can inspect it by serializing to a FileStream and opening the generated file in a hex viewer: you'll see strings such as "System.Single", the type name, and "m_value", the name of the field where the value is stored. A good way to cut down on the overhead is to serialize an array instead, so the type information is written only once rather than once per value.
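A minimal sketch of the array approach, assuming the same batch size of 5,000 from the question: the type metadata is emitted once for the whole array, so the total size ends up close to 4 bytes per element plus a small fixed header, instead of 55 bytes per value.

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;

class ArrayOverheadDemo
{
    static void Main()
    {
        var rnd = new Random();
        float[] values = new float[5000];
        for (int i = 0; i < values.Length; i++)
            values[i] = (float)rnd.NextDouble();

        IFormatter formatter = new BinaryFormatter();
        using (var ms = new MemoryStream())
        {
            // Type information is written once for the whole array,
            // not once per element.
            formatter.Serialize(ms, values);

            // Roughly 5000 * 4 = 20000 bytes of payload plus a small header.
            Console.WriteLine(ms.Length);
        }
    }
}
```

Note that on modern .NET, BinaryFormatter is obsolete and disabled by default; this sketch assumes a runtime (such as .NET Framework) where it is still available.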

BinaryWriter is the exact opposite: very compact but not type-safe. Plenty of alternatives are available in between.
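For comparison, a sketch of the BinaryWriter approach with the question's batch size of 5,000: no type metadata is written at all, so each float costs exactly its four raw bytes.

```csharp
using System;
using System.IO;

class CompactDemo
{
    static void Main()
    {
        var rnd = new Random();
        using (var ms = new MemoryStream())
        using (var writer = new BinaryWriter(ms))
        {
            // Each Write(float) emits exactly 4 bytes, nothing else.
            for (int i = 0; i < 5000; i++)
                writer.Write((float)rnd.NextDouble());

            writer.Flush();
            Console.WriteLine(ms.Length); // 5000 values * 4 bytes = 20000
        }
    }
}
```

The trade-off is that the reader must already know the layout: BinaryReader has to call ReadSingle the same number of times, in the same order, or the bytes are misinterpreted.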