Sending a double over TCP from Java to C#

I have a Java server that sends a double to a C# client. The server sends doubles using DataOutputStream.writeDouble() , and the client reads doubles using BinaryReader.ReadDouble() . When I send dos.writeDouble(0.123456789); and flush it from the server, the client reads and outputs 3.1463026401691E+151, which is different from what I sent. Do C# and Java encode doubles differently?

+5
2 answers

In Java, DataOutputStream.writeDouble() converts the double to a long before sending and writes it high byte first (big-endian).

In C#, however, BinaryReader.ReadDouble() reads in little-endian format.

In other words: the byte orders differ, and changing the byte order on one of the two sides should fix your problem.

The easiest way to change the byte order in Java from big- to little-endian is to use a ByteBuffer, which lets you specify the byte order, e.g.:

 ByteBuffer buffer = ByteBuffer.allocate(Double.BYTES); // 8 bytes for one double
 buffer.order(ByteOrder.LITTLE_ENDIAN);
 buffer.putDouble(0.123456789); // add stuff to the buffer
 byte[] bytes = buffer.array();

Then send the bytes with DataOutputStream.write(bytes) instead of writeDouble().
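Putting the pieces together, here is a minimal sketch of the server side (the port number 9000 and the socket setup are placeholders for illustration, not from the original post):

 import java.io.DataOutputStream;
 import java.net.ServerSocket;
 import java.net.Socket;
 import java.nio.ByteBuffer;
 import java.nio.ByteOrder;

 public class LittleEndianDoubleServer {
     public static void main(String[] args) throws Exception {
         try (ServerSocket server = new ServerSocket(9000);
              Socket client = server.accept();
              DataOutputStream dos = new DataOutputStream(client.getOutputStream())) {
             // Encode the double as little-endian bytes so the C# BinaryReader
             // on the other end decodes it correctly.
             ByteBuffer buffer = ByteBuffer.allocate(Double.BYTES);
             buffer.order(ByteOrder.LITTLE_ENDIAN);
             buffer.putDouble(0.123456789);
             dos.write(buffer.array()); // raw bytes, NOT writeDouble()
             dos.flush();
         }
     }
 }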

+3

The problem is actually related to the encoding, in particular endianness. Java uses big-endian format, which is the standard network byte order, while your C# client uses little-endian.

So here's what happened: 0.123456789 is stored in IEEE 754 double-precision format as 0x3FBF9ADD3739635F. When it is read in C#, the byte order is swapped, so it is interpreted as 0x5F633937DD9ABF3F. This corresponds to the decimal number 3.14630264016909969143315814746e151.
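You can reproduce the corrupted value in a few lines of Java; this is a sketch for illustration, not part of the original answer:

 public class EndianDemo {
     public static void main(String[] args) {
         long bits = Double.doubleToLongBits(0.123456789);
         System.out.printf("big-endian bits:   0x%016X%n", bits);    // 0x3FBF9ADD3739635F

         long swapped = Long.reverseBytes(bits);                     // what the C# client sees
         System.out.printf("byte-swapped bits: 0x%016X%n", swapped); // 0x5F633937DD9ABF3F

         double wrong = Double.longBitsToDouble(swapped);
         System.out.println("decoded as double: " + wrong);          // ~3.1463026401691E151
     }
 }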

Check out this question to see how to change the byte order on the C# client side.

+2
