Getting the IP address of a Logstash-forwarder machine

I have installed the Elasticsearch, Logstash, and Kibana (ELK) log-analysis tools on my systems. There are currently two machines in my setup (both Amazon EC2 instances):

  • 54.251.120.171 - Logstash server, on which the ELK stack is installed
  • 54.249.59.224 - logstash-forwarder, which ships /var/log/messages to the Logstash server

On the Logstash server, my configuration (split across several files) looks like this:

    input {
      lumberjack {
        port => 5000
        type => "logs"
        ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
        ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
      }
    }

    filter {
      if [type] == "syslog" {
        grok {
          match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
          add_field => [ "received_at", "%{@timestamp}" ]
          add_field => [ "received_from", "%{host}" ]
        }
        syslog_pri { }
        date {
          match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
        }
      }
    }

    output {
      elasticsearch { host => localhost }
      stdout { codec => rubydebug }
    }

On the logstash-forwarder machine, this is what my configuration file looks like; it forwards /var/log/messages and /var/log/secure to the Logstash server:

    {
      "network": {
        "servers": [ "54.251.120.171:5000" ],
        "timeout": 15,
        "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"
      },
      "files": [
        {
          "paths": [ "/var/log/messages", "/var/log/secure" ],
          "fields": { "type": "syslog" }
        }
      ]
    }
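(For reference, logstash-forwarder's "fields" object accepts arbitrary extra key/value pairs, so a per-forwarder tag could also be set right here; the forwarder_id name below is only an illustrative choice, not an official option:)

    {
      "network": {
        "servers": [ "54.251.120.171:5000" ],
        "timeout": 15,
        "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"
      },
      "files": [
        {
          "paths": [ "/var/log/messages", "/var/log/secure" ],
          "fields": { "type": "syslog", "forwarder_id": "54.249.59.224" }
        }
      ]
    }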

Here is what my Kibana interface looks like after it has retrieved the indexed logs from Elasticsearch.

So my question is: I need a way to get the IP address of the forwarding machine, i.e. 54.249.59.224, into each log event.

The reason I ask is that in a real scenario we may have many logstash-forwarders (say 10), all of them sending logs to our log server. I therefore need some way to tag every log event so that I can tell which logstash-forwarder sent it.

I will then need to use that IP address (and possibly other information) to filter log events in the Kibana interface.
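(For example, since the filter above already copies %{host} into received_from, a Kibana/Lucene query along these lines should select a single sender, assuming host actually carries the forwarder's address:)

    received_from:"54.249.59.224"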

Can someone please help me do this? :)

Or, if someone has a better idea of how to do this effectively in a different way, you are very welcome to share it!

2 answers

To do this, you need to modify both the lumberjack input plugin and the Lumberjack server. See the following diffs:

https://github.com/stanhu/logstash-input-lumberjack/commit/0861c9d95caa46370f17a82353710bc78ea2c82e

https://github.com/stanhu/logstash-forwarder/commit/b711d273a24ab3fe1b4a7b6005d2f26983cac859

Until these changes are merged into logstash-forwarder and logstash-input-lumberjack, the easiest way to use this right now is to modify the installed files directly. If you are using the current Debian packages, they live here:

    /opt/logstash/lib/logstash/inputs/lumberjack.rb
    /opt/logstash/vendor/bundle/jruby/1.9/gems/jls-lumberjack-0.0.20/lib/lumberjack/server.rb

In the input filter configuration file (e.g. /etc/logstash/conf.d/01-lumberjack-input.conf), add:

 client_address_field => "client_address" 

Your logstash configuration should look something like this:

    input {
      lumberjack {
        host => "localhost"
        port => 5000
        type => "logs"
        ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
        ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
        client_address_field => "client_address"
      }
    }

Restart logstash on that machine and check that a client_address field now appears in each message.


This is what the host field (which you also copy into received_from) is for.
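(As a minimal sketch of this approach: a mutate filter can copy host into a dedicated field without touching the existing grok block; the forwarder_ip field name here is hypothetical, and this assumes host holds the lumberjack client's address:)

    filter {
      mutate {
        add_field => [ "forwarder_ip", "%{host}" ]
      }
    }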

