"...">

Odd Logstash Date Parsing Error

I get the following error from Logstash:

    {:timestamp=>"2013-12-30T17:05:01.968000-0800", :message=>"Failed parsing date from field", :field=>"message", :value=>"2013-12-30 17:04:59,539.539 INFO 14282:140418951137024 [foo.lib.base.onResults:152] - /1.12/media - \"getMediaStoreUrl\":, 10.101.AA.BB, 10.101.19.254 accepted 0.170675992966, returning https://foo.s3.amazonaws.com/foo/customerMedia/1009238911/23883995/image?Signature=%2BfXqEdNWtWdhwzi%&*YEGJSDDdDFF%3D&Expires=1388455499&AWSAccessKeyId=NOIMNOTTHATSTUPID", :exception=>java.lang.IllegalArgumentException: Invalid format: "2013-12-30 17:04:59,539.539 INFO 14282:140418951137024..." is malformed at ".539 INFO 14282:140418951137024...", :level=>:warn}

Obviously the error concerns the date format, which in my logs looks like this:

2013-12-30 17:04:59,539.539 INFO 14282:140418951137024...

And my date filter is as follows:

    date {
        match => ["message", "yyyy-MM-dd HH:mm:ss,SSS"]
    }
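The failure is easy to reproduce outside Logstash. Joda-Time (which the date filter uses) rejects input that has unmatched text left over after the pattern, and Python's `strptime` behaves the same way, as this sketch shows (the message value is shortened for illustration):

```python
from datetime import datetime

# Shortened stand-in for the event's "message" field (illustrative only)
message = "2013-12-30 17:04:59,539.539 INFO 14282:140418951137024 [foo.lib.base.onResults:152] - ..."

# The date filter hands the *whole* field to the parser. Like Joda-Time,
# strptime refuses anything left over once the pattern is consumed.
try:
    datetime.strptime(message, "%Y-%m-%d %H:%M:%S,%f")
except ValueError as err:
    print(err)  # unconverted data remains: .539 INFO ...

# Parsing only the timestamp prefix succeeds
print(datetime.strptime(message[:23], "%Y-%m-%d %H:%M:%S,%f"))
```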

I've read up on the Joda-Time library and I believe my format is correct. What strikes me as strange is that the error message contains the milliseconds (SSS) twice: ",539.539" (our logs emit that for some reason). I intentionally left the second ".539" out of my pattern, because I want it to be ignored.

I also use the following pattern successfully in another filter:

(?<pylonsdate>%{DATESTAMP}\.[0-9]+)

I'm just not quite sure where this error came from. Any ideas what I need to do to fix this? Do I need to mutate @timestamp? Any help is appreciated!

+4

, "" api parsing . : INFO 14282: 140418951137024...

grok api , api/

grok {
     # Capture just the timestamp; the trailing \.[0-9]+ swallows the
     # duplicated ".539" fraction so it never reaches the date filter
     match => ["message","%{DATESTAMP:logtime}\.[0-9]+"]
}
date {
     # DATESTAMP matches a two-digit year here ("13-12-30 17:04:59,539"),
     # hence YY rather than yyyy in the date pattern
     match => ["logtime","YY-MM-dd HH:mm:ss,SSS"]
}
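The same two-stage idea can be sketched in plain Python. The regex below is a hand-rolled approximation of what `%{DATESTAMP:logtime}\.[0-9]+` captures (using a four-digit year for clarity), not the real grok pattern:

```python
import re
from datetime import datetime

message = "2013-12-30 17:04:59,539.539 INFO 14282:140418951137024 [foo.lib.base.onResults:152] - ..."

# Stage 1 (grok): isolate the timestamp, discarding the duplicated ".539"
m = re.search(r"(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})\.\d+", message)
logtime = m.group(1)  # "2013-12-30 17:04:59,539"

# Stage 2 (date): parse the clean field; Logstash would write the result
# into @timestamp for you
ts = datetime.strptime(logtime, "%Y-%m-%d %H:%M:%S,%f")
print(ts.isoformat())  # 2013-12-30T17:04:59.539000
```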

Once logtime parses, the date filter sets @timestamp for you, so there is no need to mutate it yourself.

+6

