Logstash: Limit message packet size to 10 MB

Problem

I have a server that uses Logstash as its logging mechanism. The Logstash instance is configured to send its logs to an AWS Elasticsearch instance.

A few hours ago the logs stopped showing up in Kibana on the ES cluster.


There are many similar errors in the logstash log file:

{:timestamp=>"2016-02-25T14:39:46.232000+0000", 
 :message=>"Got error to send bulk of actions: [413] 
 {\"Message\":\"Request size exceeded 10485760 bytes\"}", 
 :level=>:error}

I spoke with AWS support and they confirmed that their ES machines limit the request size to 10 MB.

What I tried

  • Set flush_size => 50 in the configuration

Question

How can I limit the size of a Logstash request to the limit of 10 MB set by ES?

2 answers

In my case, setting flush_size => 100 in the elasticsearch output section of the Logstash configuration fixed it.

With batches of 100 events, each bulk request stays under the 10 MB limit enforced by ES.

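For reference, here is a minimal sketch of what that output block can look like with the 2.x-era logstash-output-elasticsearch plugin (the hostname below is a placeholder; note that flush_size was removed from this output in later Logstash releases, where pipeline.batch.size controls batch size instead):

output {
  elasticsearch {
    # placeholder endpoint for the AWS-hosted ES domain
    hosts => ["my-es-endpoint.example.com"]
    index => "logstash-%{+YYYY.MM.dd}"
    # cap each bulk request at 100 events so the payload
    # stays well below the 10 MB request limit
    flush_size => 100
  }
}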
Edit:

After changing the configuration, restart Logstash so it picks up the new settings:

/opt/logstash/bin/logstash -f /etc/logstash/conf.d/

There is a related issue on GitHub: https://github.com/awslabs/logstash-output-amazon_es/issues/55

Even with flush_size => 10, bulk requests can still be rejected. To make the plugin fall back to flushing events one by one when that happens, I also set: retry_max_items => 1

You can use the Logstash monitoring API to check whether Logstash is falling behind, by comparing the in and out event counters of each plugin:

curl -XGET 'localhost:9600/_node/stats/pipeline?pretty' -s | jq . | awk 'BEGIN {FS="{|}|: |\," } /"in":/ { i=$2 ; next }; /"out":/ { o=$2 ; next } /"id":/ { id=$2 }; /name/ { print $0, i-o , id } '

The difference between in and out is the number of events stuck in a plugin, in this case the amazon_es output.
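If you have jq available anyway, a simpler variant of the same check is sketched below; the field paths assume the single-pipeline response shape returned by /_node/stats/pipeline on Logstash 5.x, so they may need adjusting for other versions:

curl -s 'localhost:9600/_node/stats/pipeline' \
  | jq '.pipeline.plugins.outputs[] | {id, name, backlog: (.events.in - .events.out)}'

A growing backlog for the amazon_es output means events are queuing up faster than they are being shipped to ES.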

Keep in mind that a single event larger than 10 MB will still be rejected no matter how small the flush size is. To prevent that, truncate the largest fields before they reach the output.

My configuration ended up looking like this:

filter {
(...)
# truncate big event fields to ~1 MB each
# requires the logstash-filter-truncate plugin
  truncate {
    length_bytes => 1048576
    fields => [ "message", "error", "error_orig" ]
  }
(...)
}
output {
(...)
      amazon_es {
        id => "producer"
        hosts => "${ES_HOSTNAME}"
        index => "logstash-%{+YYYY.MM.dd}-%{[type]}-%{[environment]}"
        document_type => "%{[type]}"
        flush_size => 10
        # if still fails, flush one by one until success
        retry_max_items => 1
      }
(...)
}

Hope this helps.
