I am parsing NMEA GPS data from a device that sends timestamps without milliseconds. As far as I know, such devices use a specific trigger point for the sentence that carries the .000 timestamp - supposedly the $ of the GGA sentence.
So I parse the GGA sentence and take the system timestamp when I receive its $ (compensating, via the serial port speed, for any other characters read in the same operation).
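To make this concrete, the marking step works roughly like the following minimal sketch (Python with pyserial; the port name, 4800 baud and 8N1 framing are assumptions, not necessarily what my device uses):

    import time
    import serial

    PORT = "/dev/ttyUSB0"       # assumption: adjust to the actual device
    BAUD = 4800                 # assumption: typical NMEA 0183 rate
    BYTE_TIME = 10.0 / BAUD     # seconds per byte on the wire (1 start + 8 data + 1 stop bit)

    def wait_for_gga_mark(ser):
        """Return (system time at the GGA '$', full sentence text)."""
        while True:
            b = ser.read(1)
            if b != b"$":
                continue
            t_read = time.time()
            # Bytes already queued behind the '$' arrived after it, so pull the
            # mark back by their transmission time on the serial line.
            pending = ser.in_waiting
            t_mark = t_read - pending * BYTE_TIME
            rest = ser.read_until(b"\r\n")          # read the rest of the sentence
            sentence = (b + rest).decode("ascii", errors="replace")
            if sentence.startswith("$GPGGA"):
                return t_mark, sentence

    with serial.Serial(PORT, BAUD, timeout=1) as ser:
        t_mark, sentence = wait_for_gga_mark(ser)
        print(t_mark, sentence.strip())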
From this I calculate the offset needed to adjust the system time, but when I compare the result against several NTP servers, I see a constant difference of 250 ms. If I correct for it manually, I stay within a 20 ms deviation, which is good enough for my application.
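The offset itself is then just the GGA time minus the marked system time, along the lines of this sketch (it assumes the GGA time field is UTC and ignores the midnight rollover case):

    from datetime import datetime, timezone

    def gga_offset(sentence: str, t_mark: float) -> float:
        """Offset in seconds to add to the system clock (positive = clock is behind)."""
        hhmmss = sentence.split(",")[1]           # e.g. "003539.000"
        now = datetime.fromtimestamp(t_mark, tz=timezone.utc)
        gga = now.replace(hour=int(hhmmss[0:2]),
                          minute=int(hhmmss[2:4]),
                          second=int(hhmmss[4:6]),
                          microsecond=int(round(float(hhmmss[6:] or 0) * 1e6)))
        return (gga - now).total_seconds()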
Of course, I don't know where this offset comes from, or whether it is specific to the GPS mouse I use or to my system. Am I using the wrong $ character, or does anyone know how exactly this should be handled? I know the question is vague, but any clue about what might cause this bias would be very helpful!
Here are some example sentences from my device, with the $ character I take as the time mark highlighted:
$GPGSA,A,3,17,12,22,18,09,30,14,,,,,,2.1,1.5,1.6*31
$GPRMC,003538.000,A,5046.8555,N,00606.2913,E,0.00,22.37,160209,,,A*58
-> $ <- GPGGA,003539.000,5046.8549,N,00606.2922,E,1,07,1.5,249.9,M,47.6,M,,0000*5C
$GPGSA,A,3,17,12,22,18,09,30,14,,,,,,2.1,1.5,1.6*31
$GPGSV,3,1,10,09,77,107,17,12,63,243,30,05,51,249,16,14,26,315,20*7E
$GPGSV,3,2,10,30,24,246,25,17,23,045,22,15,15,170,16,22,14,274,24*7E
$GPGSV,3,3,10,04,08,092,22,18,07,243,22*74
$GPRMC,003539.000,A,5046.8549,N,00606.2922,E,0.00,22.37,160209,,,A*56
-> $ <- GPGGA,003540.000,5046.8536,N,00606.2935,E,1,07,1.5,249.0,M,47.6,M,,0000*55
$GPGSA,A,3,17,12,22,18,09,30,14,,,,,,2.1,1.5,1.6*31
$GPRMC,003540.000,A,5046.8536,N,00606.2935,E,0.00,22.37,160209,,,A*56
-> $ <- GPGGA,003541.000,5046.8521,N,00606.2948,E,1,07,1.5,247.8,M,47.6,M,,0000*5E