What should be the logstash grok filter for this log4j log?

I was asked to merge our log4j log files (NOT using socket appenders for now) into a Logstash JSON file, which I will then ship to Elasticsearch. Our code uses RollingFileAppender. Here is an example log line.

2016-04-22 16:43:25,172 ERROR :SomeUser : 2 [com.mycompany.SomeClass]  AttributeSchema 'Customer |Customer |Individual|Individual|Quarter|Date' : 17.203 The Log Message.

Here is the ConversionPattern value from the log4j configuration file (XML format):

<param name="ConversionPattern" value="%d{ISO8601} %p %x %X{username}:%t [%c] %m %n" />
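(For reference, the `<param>` element above is log4j's XML configuration syntax; in a `log4j.properties` file the same layout would be expressed roughly as follows, where `FILE` is a placeholder appender name:)

```properties
log4j.appender.FILE=org.apache.log4j.RollingFileAppender
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=%d{ISO8601} %p %x %X{username}:%t [%c] %m %n
```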

Can someone please help me write a Logstash grok filter that will parse this line? I have the following:

filter {
  if [type] == "log4j" {
    grok {
        match => ["message", "%{TIMESTAMP_ISO8601:logdate} %{LOGLEVEL:loglevel} %{GREEDYDATA:message}"]
    }
    date {
        match => ["logdate", "yyyy-MM-dd HH:mm:ss,SSS", "ISO8601"]
    }
  }
}

But of course, that captures everything after the priority as the message. I want to further isolate AT LEAST the following fields (defined in the Log4j PatternLayout):

  • User (%X{username})
  • Class name (%c)
  • Thread (%t)
  • Nested Diagnostic Context (%x)
  • The message itself (%m)
filter {
    mutate {
      strip => "message"
    }
    grok {
      match => {
        "message" => "%{TIMESTAMP_ISO8601:logdate} %{LOGLEVEL:loglevel} :%{DATA:username} : %{NUMBER:thread} \[(?<classname>[^\]]+)\]%{SPACE}%{GREEDYDATA:msgbody}"
      }
    }
    date {
      match => ["logdate", "yyyy-MM-dd HH:mm:ss,SSS", "ISO8601"]
    }
}
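A quick way to sanity-check the capture groups outside Logstash is to approximate the grok pattern as a plain regular expression (grok's `%{DATA}` and `%{GREEDYDATA}` become `.*?` and `.*`) and run it against the sample line. The field names here are my own choice, mapped from the ConversionPattern: `%X{username}` → username, `%t` → thread, `%c` → classname, `%m` → msgbody.

```python
import re

# Approximate standalone equivalent of the grok pattern above.
LOG_RE = re.compile(
    r"^(?P<logdate>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) "  # %d{ISO8601}
    r"(?P<loglevel>[A-Z]+) "                                      # %p
    r":(?P<username>.*?) : (?P<thread>\d+) "                      # %X{username}, %t
    r"\[(?P<classname>[^\]]+)\]\s+"                               # [%c]
    r"(?P<msgbody>.*)$"                                           # %m
)

# The sample line from the question.
line = ("2016-04-22 16:43:25,172 ERROR :SomeUser : 2 "
        "[com.mycompany.SomeClass]  AttributeSchema 'Customer |Customer "
        "|Individual|Individual|Quarter|Date' : 17.203 The Log Message.")

m = LOG_RE.match(line)
print(m.groupdict())
```

This confirms the line splits cleanly into logdate, loglevel, username, thread, classname, and msgbody before you wire the pattern into the grok filter.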

What about the different lengths of the level field — ERROR is 5 characters and INFO is 4? Will the same pattern match both ERROR and INFO lines?
