How to prevent buffer overflow / array overflow?

I recently wrote code for a custom serial communication protocol. I use a field (8/16 bits) of the received data to indicate how large the frame is, and based on this value I expect that many bytes to follow. I use a CRC to accept or reject a frame, but I cannot include the frame-length field in the CRC, because on the receiving side I need to know how much data to expect before I can process the frame.
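
For concreteness, here is a minimal sketch of a frame of this kind; the field widths and the 256-byte cap are assumptions for illustration, not the actual protocol:

```c
#include <stdint.h>

/* Sketch of the frame layout described above (field sizes are assumptions):
 *
 *   [ length : 1 byte ][ payload : `length` bytes ][ crc : 2 bytes ]
 *
 * The CRC covers only the payload: the receiver has to consume the
 * length field first to know how many bytes follow, so the length
 * itself cannot be protected by the frame CRC. */
typedef struct {
    uint8_t  length;        /* number of payload bytes that follow    */
    uint8_t  payload[256];  /* worst-case payload in this sketch      */
    uint16_t crc;           /* CRC computed over payload[0..length-1] */
} frame_t;
```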

The problem I ran into is that the frame-length field sometimes gets corrupted, which tricks the receiver into reading that many bytes even though the receive buffer is much smaller. The overrun corrupts critical system variables that sit in the adjacent memory.
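
To make the failure concrete, here is a hedged sketch of the kind of naive receive loop that exhibits it; `uart_read_byte` and the variable layout are hypothetical:

```c
#include <stdint.h>

extern uint8_t uart_read_byte(void);  /* hypothetical blocking UART read */

uint8_t  rx_buf[64];          /* actual receive buffer                     */
uint32_t critical_counter;    /* hypothetical variable that may sit nearby */

/* If declared_len arrives corrupted (say 0x4100 instead of 0x0041),
 * this loop writes far past rx_buf and clobbers whatever follows it. */
void naive_receive(uint16_t declared_len)
{
    for (uint16_t i = 0; i < declared_len; ++i)  /* no bound check! */
        rx_buf[i] = uart_read_byte();
}
```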

How can I prevent this buffer overflow? My thoughts on this (a sketch combining the first two ideas follows below):

1) Reject the frame-length value if it exceeds a sane maximum.
2) Use a data type whose range limits the index, e.g. an 8-bit unsigned index can only address 256 locations, and make the buffer somewhat larger, say 280 bytes.
3) Allocate the buffer in a separate place, so that an overrun cannot reach critical system variables.
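
A sketch of ideas 1 and 2 combined; `get_byte`, `MAX_PAYLOAD`, and the slack size are assumptions, not part of the asker's code:

```c
#include <stdint.h>

#define MAX_PAYLOAD 256u                     /* assumed protocol maximum */

static uint8_t rx_buf[MAX_PAYLOAD + 24u];    /* 280-byte buffer with slack (idea 2) */

/* get_byte is a hypothetical driver hook: returns 1 and stores a byte,
 * or returns 0 on timeout (see the timeout note below). */
int receive_frame(uint16_t declared_len, int (*get_byte)(uint8_t *out))
{
    if (declared_len > MAX_PAYLOAD)          /* idea 1: reject absurd lengths */
        return -1;                           /* drop frame, resynchronize     */

    for (uint16_t i = 0; i < declared_len; ++i) {
        uint8_t b;
        if (!get_byte(&b))                   /* timeout: abort instead of spinning */
            return -1;
        rx_buf[i] = b;                       /* i < MAX_PAYLOAD, so never past end */
    }
    return (int)declared_len;                /* caller still verifies the CRC */
}
```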

One thing I already use to avoid getting stuck in the receive loop is a timeout, but I had not paid attention to this aspect of the problem. It will take me a long time to confirm and reproduce the issue, because the code is part of a much larger system and I am not an expert on it.

How can I deal with such problems safely?

Also: what general considerations or standard techniques should I apply when working with arrays to keep them from overflowing?

+4
1 answer

Your first two ideas are both sound, and they work best in combination: never let a value taken straight from the wire drive the receive loop unchecked.

The second idea is the most robust. If the buffer holds 256 bytes, make the receive index a uint8_t: its largest value is 0xFF, so incrementing past 255 wraps the index from 256 back to 0 and it can never leave the buffer. A corrupted length then at worst yields a garbage frame, which the CRC check rejects anyway.
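
A minimal illustration of that wrap-around property, assuming a buffer of exactly 256 bytes:

```c
#include <stdint.h>

uint8_t rx_buf[256];   /* exactly 2^8 bytes */

void store(uint8_t *idx, uint8_t byte)
{
    rx_buf[*idx] = byte;   /* *idx is 0..255 by construction: in bounds  */
    (*idx)++;              /* 255 wraps to 0; the index can never escape */
}
```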

Rejecting length values above a sane maximum (your first idea) is also cheap and worth doing: one comparison lets you abandon a bad frame immediately instead of waiting for the timeout. Moving the buffer away from critical variables (your third idea) only reduces the damage; it does not fix the bug, so treat it as a last line of defence.

Finally, you can protect the length field itself even though it cannot be covered by the frame CRC: transmit it twice, or together with its complement or a small CRC of its own, and discard the frame as soon as the copies disagree. A corrupted length then never reaches the receive loop at all.
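
One concrete way to do that, sketched under the same assumptions as above (`get_byte` is the hypothetical driver hook): send the length byte followed by its bitwise complement, and reject the frame the moment they disagree.

```c
#include <stdint.h>

/* Reads the length byte plus its one's complement. Returns 1 and stores
 * the validated length, or 0 on timeout or mismatch (corrupted length). */
int read_length(int (*get_byte)(uint8_t *out), uint8_t *len_out)
{
    uint8_t len, check;

    if (!get_byte(&len) || !get_byte(&check))
        return 0;                     /* timeout while reading the header */
    if ((uint8_t)(~len) != check)
        return 0;                     /* copies disagree: discard frame   */

    *len_out = len;
    return 1;
}
```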

+1

