
Logstash and JSON logs from log4net

Given the following configuration:

 input {
   udp {
     type => "genericJson2"
     port => 9994
   }
 }
 filter {
   if [type] == "genericJson2" {
     json {
       source => "message"
     }
   }
 }
 output {
   elasticsearch { }
 }

And the following log entry:

 {"date":"2018-02-27T13:21:41.3387552-05:00","level":"INFO","appname":"listenercore","logger":"Main","thread":"1","message":"test"} 

I get the following result:

 { "_index": "logstash-2018.02.27", "_type": "doc", "_id": "AWHYfh_qDl_9h030IXjC", "_score": 1, "_source": { "type": "genericJson2", "@timestamp": "2018-02-27T18:19:59.747Z", "host": "10.120.4.5", "@version": "1", "date": "{\"date\":\"2018-02-27T13:20:02.2113", "message": "{\"da", "logger": "{\"da", "thread": "{", "level": "{\"da", "appname": "{\"date\":\"201" }, "fields": { "@timestamp": [ "2018-02-27T18:19:59.747Z" ] } } 

What do I need to do to properly process my JSON logs?

EDIT

I dug a little deeper. Running this from the command line:

 sudo bin/logstash -e "input{stdin{type=>stdin}} filter{json {source=>message}} output{ stdout{ codec=>rubydebug } }" 

produces the desired output:

 { "@timestamp" => 2018-02-28T02:07:01.710Z, "host" => "Elastisearch01", "appname" => "listenercore", "logger" => "Main", "@version" => "1", "type" => "stdin", "date" => "2018-02-27T13:21:41.3387552-05:00", "level" => "INFO", "thread" => "1", "message" => "test" } 

So I wrote a quick Python UDP server to see what actually comes over the wire. Here is what I captured:

 { " date " : " 2 0 1 8 - 0 2 - 2 7 T 2 1 : 0 6 : 0 4 . 7 4 6 1 3 4 6 - 0 5 : 0 0 " , " level " : " INFO " , " appname " : " listenercore " , " logger " : " M ain " , " thread " : " 1 " , message " : " test " } 

There are additional spaces between each character, so I suspect a text-encoding issue, but I'm not sure yet.
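A quick way to confirm the suspicion is to dump the raw bytes of a datagram: ASCII text encoded as UTF-16LE shows up as each character followed by a 0x00 byte (and the terminal renders those nulls as the "spaces" above). A minimal sketch, with the bind address and port as placeholders:

 import socket

 # Minimal diagnostic: print one datagram as hex so interleaved
 # 0x00 bytes (the UTF-16LE signature for ASCII text) are visible.
 sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
 sock.bind(("0.0.0.0", 9994))

 data, addr = sock.recvfrom(4096)
 print(" ".join("%02x" % b for b in data))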

EDIT

The encoding problem has been pretty much confirmed. If I capture, decode, and relay the logs with this Python script, the problem goes away:

 import socket

 UDP_IP_ADDRESS = "10.254.18.166"
 UDP_PORT_NO = 9993

 # Receives the raw UTF-16 datagrams from log4net
 serverSock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
 serverSock.bind((UDP_IP_ADDRESS, UDP_PORT_NO))

 # Relays them to Logstash on the loopback interface
 clientSock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

 while True:
     data, addr = serverSock.recvfrom(1024)
     # Decode UTF-16, then re-encode as UTF-8 bytes before sending
     # (sendto needs bytes, not str, in Python 3)
     clientSock.sendto(data.decode('utf-16').encode('utf-8'), ("127.0.0.1", 9994))

How can I make Logstash accept UTF-16 input directly? I tried this and it does not work:

 bin/logstash -e "input{udp{port=>9994 type=>stdin codec=>plain{charset=>'UTF-16'}}} filter{json {source=>message}} output{ stdout{ codec=>rubydebug } }" 
2 answers

Can you check with UTF-16LE instead of UTF-16?
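A minimal sketch of what that would look like on the UDP input, using the plain codec's charset option (the port and type match the question; the rest is illustrative):

 input {
   udp {
     port => 9994
     type => "genericJson2"
     # UTF-16LE: little-endian UTF-16, no byte-order mark expected
     codec => plain { charset => "UTF-16LE" }
   }
 }
 filter {
   json { source => "message" }
 }
 output {
   stdout { codec => rubydebug }
 }

The reasoning: .NET's default Unicode encoding is UTF-16 little-endian, and a plain "UTF-16" charset may expect a byte-order mark that a UDP payload typically lacks, so the explicit "UTF-16LE" can succeed where "UTF-16" fails. If the appender side is under your control, log4net's UdpAppender also exposes an Encoding setting, so emitting UTF-8 at the source (an assumption about your appender config) would avoid the conversion entirely.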


{"_index": "logstash-2018.02.27", "_type": "doc", "_id": "AWHYfh_qDl_9h030IXjC", "_score": 1, "_source": {"type": "genericJson2", "@ timestamp ":" 2018-02-27T18: 19: 59.747Z "," host ":" 10.120.4.5 "," @version ":" 1 "," date ":" {\ "date \": \ "2018 -02-27T13: 20: 02.2113 "," message ":" {\ "da", "logger": "{\" da "," thread ":" {"," level ":" {\ "da" , "appname": "{\" date \ ": \" 201 "}," fields ": {" @timestamp ": [" 2018-02-27T18: 19: 59.747Z "]}}

In the result shown in the question, how can I change "_type": "doc" to "_type": "my_type"? I know I can change "_index": "logstash-2018.02.27" to an index like "my_index-YYYY.MM.dd" when I configure logstash.conf:

 output {
   elasticsearch {
     index => "my_index-%{+YYYY.MM.dd}"
   }
 }

but I don't know how to change the _type. Please help.
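For what it's worth, the elasticsearch output plugin has a document_type option that controls _type. A minimal sketch, assuming a Logstash/Elasticsearch version where custom types are still allowed (Elasticsearch 6+ restricts an index to a single type, and the option is deprecated in newer plugin versions):

 output {
   elasticsearch {
     index => "my_index-%{+YYYY.MM.dd}"
     # Assumption: your stack still supports custom document types
     document_type => "my_type"
   }
 }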

