Recently, we have been decoding a lot of XML in Go using encoding/xml. We noticed that after several files our boxes run out of memory, start swapping, and generally die an ugly death. So we wrote a test program. Here it is:
package main

import (
	"encoding/xml"
	"io/ioutil"
	"log"
	"time"
)

// this XML is for reading AWS SQS messages
type message struct {
	Body          []string `xml:"ReceiveMessageResult>Message>Body"`
	ReceiptHandle []string `xml:"ReceiveMessageResult>Message>ReceiptHandle"`
}

func main() {
	var m message
	readTicker := time.NewTicker(5 * time.Millisecond)
	body, err := ioutil.ReadFile("test.xml")
	if err != nil {
		log.Fatal(err)
	}
	for {
		select {
		case <-readTicker.C:
			err = xml.Unmarshal(body, &m)
			if err != nil {
				log.Println(err.Error())
			}
		}
	}
}
All it does is decode the same XML file over and over again. Our boxes show the same symptom: the binary's memory usage grows without bound until the box starts swapping.
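A minimal sketch that reproduces the growth we are seeing, assuming the culprit is that xml.Unmarshal appends to slice fields rather than replacing them when the same struct is reused (the `decodeTimes` helper and inline document are ours, not from the real program):

```go
package main

import (
	"encoding/xml"
	"fmt"
)

// message mirrors the struct from the question, minus ReceiptHandle.
type message struct {
	Body []string `xml:"ReceiveMessageResult>Message>Body"`
}

// decodeTimes unmarshals the same document n times into a single
// struct and reports how long the Body slice ends up.
func decodeTimes(n int) int {
	doc := []byte(`<ReceiveMessageResponse>
		<ReceiveMessageResult><Message><Body>hi</Body></Message></ReceiveMessageResult>
	</ReceiveMessageResponse>`)
	var m message
	for i := 0; i < n; i++ {
		if err := xml.Unmarshal(doc, &m); err != nil {
			panic(err)
		}
	}
	return len(m.Body)
}

func main() {
	// Each Unmarshal appends one Body entry instead of replacing the slice.
	fmt.Println(decodeTimes(3)) // prints 3, not 1
}
```

If this is the same thing our test program is hitting, the slices in `m` would keep growing on every tick.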
We also added some profiling code to the script above that runs after 20 seconds, and got the following from pprof's top100:
(pprof) top100
Total: 56.0 MB
    55.0  98.2%  98.2%     55.0  98.2% encoding/xml.copyValue
     1.0   1.8% 100.0%      1.0   1.8% cnew
     0.0   0.0% 100.0%      0.5   0.9% bytes.(*Buffer).WriteByte
     0.0   0.0% 100.0%      0.5   0.9% bytes.(*Buffer).grow
     0.0   0.0% 100.0%      0.5   0.9% bytes.makeSlice
     0.0   0.0% 100.0%     55.5  99.1% encoding/xml.(*Decoder).Decode
...
Running the profiler later, just before the box runs out of memory, we get a larger total but almost the same percentages. Can anybody help us? What are we missing?
Thanks in advance!
profiling xml memory-leaks go
Mike Dewar