Packing/unpacking binary data in C - doubles, 64 bits

Ben Scott dragonhawk at gmail.com
Wed Sep 9 22:55:22 EDT 2009


On Wed, Sep 9, 2009 at 4:16 PM,  <bruce.labitt at autoliv.com> wrote:
> I think I may have been responsible for 90% of the message
> traffic for the last couple of months.

  But it's been an informative couple of months!  :)

>> The network node does the math real fast but is I/O poor.
>
> As for I/O, I have a mere 1Gbit/sec private LAN between my PC and the
> Cell.

  That's not so bad, but the lack of any kind of decent local disk I/O
has been a source of much difficulty for you, or so I gather.

> The key is to send large blocks of data, not to send line by line.

  That's a good rule of thumb to follow.  It's faster to do one big
operation once than a small operation many times.  A single large
write is a single buffer, a single library call, and (most likely) a
single system call.  Calling fprintf() repeatedly means multiple
buffers (or re-filling the same buffer), multiple function calls, and
likely multiple system calls.  System calls in particular suck: the
processor has to change all kinds of state to get from user mode to
kernel mode and back again.
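
  For instance, here's a minimal sketch of the difference in C (the
function names are made up for illustration; 'out' is assumed to be
an already-open stream, e.g. a socket wrapped with fdopen()):

    #include <stdio.h>

    /* Slow path: one fprintf() per value, so many library calls and
     * likely many small writes underneath. */
    static void write_one_at_a_time(FILE *out, const double *vals, size_t n)
    {
        for (size_t i = 0; i < n; i++)
            fprintf(out, "%.17g\n", vals[i]);
    }

    /* Faster path: one fwrite() of the whole block, so one buffer,
     * one library call, and at most a handful of system calls.
     * (This dumps raw host-order bytes; see the endianness caveat
     * below.) */
    static void write_as_block(FILE *out, const double *vals, size_t n)
    {
        fwrite(vals, sizeof(double), n, out);
    }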

>>  Sending binary doubles over the wire can be very complicated.
>
> Why?  Encode to a standard.

  Standards are great -- there are *so many* to choose from.  ;-)

>  IEEE-754 would be logical.

  IEEE-754 doesn't specify endianness (that would have been too easy,
I guess).  On some machines it's big-endian, on some it's
little-endian, just as with integers.
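
  If you do put raw doubles on the wire, the usual way around that is
to pick a wire order and pack the bits explicitly.  A sketch, assuming
'double' is the 64-bit IEEE-754 type on both ends:

    #include <stdint.h>
    #include <string.h>

    /* Serialize a double as big-endian bytes, independent of the
     * host's native byte order. */
    static void pack_double_be(double d, unsigned char out[8])
    {
        uint64_t bits;
        memcpy(&bits, &d, sizeof bits);   /* grab the raw bit pattern */
        for (int i = 0; i < 8; i++)
            out[i] = (unsigned char)(bits >> (56 - 8 * i));
    }

    static double unpack_double_be(const unsigned char in[8])
    {
        uint64_t bits = 0;
        for (int i = 0; i < 8; i++)
            bits = (bits << 8) | in[i];
        double d;
        memcpy(&d, &bits, sizeof d);
        return d;
    }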

>> 5:  If at all possible, consider encoding your protocol in ASCII text
>> rather than sending binary stuff on the wire.
>>
> Again, why?   Sure it is easier to debug.

  You just answered your own question.  :)

  ASCII is very portable: it's one byte per character and means the
same thing everywhere, and systems are full of routines to handle it
and transfer it properly.  Humans can look at it, easily check that
it's sane, and spot problems, and it's easy to fix up systemic
malfunctions (sometimes even by hand).

  Binary is opaque, every platform is different, and broken binary
data looks just like proper binary data until you try to decode it.
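
  To make the text route concrete, it's only a couple of lines of C
(a sketch; "%.17g" prints enough significant digits to round-trip a
64-bit IEEE-754 double exactly):

    #include <stdio.h>
    #include <stdlib.h>

    /* Encode one double as a line of ASCII text... */
    static int double_to_text(double d, char *buf, size_t len)
    {
        return snprintf(buf, len, "%.17g\n", d);
    }

    /* ...and parse it back on the other end. */
    static double text_to_double(const char *s)
    {
        return strtod(s, NULL);
    }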

> For performance, I always end up using binary.

  Yup.  Size and speed are the only things binary has going for it.
(Oh, that's all?)

-- Ben


