Is the @timestamp field necessary when using Logstash for storage in Elasticsearch?

I have the following setup: I have a Java tool that sends JSON messages to RabbitMQ. They look like this:

{
    "a": 0,
    "b": 1,
    "c": 2
}

Now I use Logstash to read the RabbitMQ queue and store them in Elasticsearch, so I can analyze the data using Kibana. The JSON stored in Elasticsearch is as follows:

{
    "a": 0,
    "b": 1,
    "c": 2,
    "@version": "1",
    "@timestamp": "2014-01-22T19:05:19.136Z"
}

I do not think the @timestamp field will be useful for what I am doing. When I store the same JSON in Elasticsearch directly with cURL, the document is saved as-is, without these extra fields. Is there a way to configure Logstash to not store the @timestamp field?
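For context, a minimal Logstash pipeline for this setup could look like the following sketch (the host and queue names are placeholders, not my exact configuration):

input {
    rabbitmq {
        host => "localhost"
        queue => "my_queue"    # placeholder queue name
        codec => "json"
    }
}

output {
    elasticsearch_http {
        host => "127.0.0.1"
    }
}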


Yes, Logstash needs the @timestamp field; it is a core part of a Logstash Event. In fact, if you remove it, Logstash crashes as soon as it tries to write the event to Elasticsearch:

Exception in thread "LogStash::Runner" org.jruby.exceptions.RaiseException: (NoMethodError) undefined method `tv_sec' for nil:NilClass
    at RUBY.sprintf(file:/tmp/logstash-1.2.1-flatjar.jar!/logstash/event.rb:239)
    at org.jruby.RubyString.gsub(org/jruby/RubyString.java:3062)
    at RUBY.sprintf(file:/tmp/logstash-1.2.1-flatjar.jar!/logstash/event.rb:225)
    at RUBY.receive(file:/tmp/logstash-1.2.1-flatjar.jar!/logstash/outputs/elasticsearch.rb:153)

The fields with the @ prefix are used internally by Logstash, so @timestamp cannot simply be removed.
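If the issue is that the automatically generated @timestamp does not reflect your data, you can overwrite it instead of removing it. For example, if your messages carried a timestamp in a field such as event_time (a hypothetical field name), the date filter would parse it into @timestamp:

filter {
    date {
        # "event_time" is a hypothetical field in the incoming JSON
        match => ["event_time", "ISO8601"]
    }
}

That way the field still exists, but it holds a value that is meaningful for your analysis in Kibana.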


The @-prefixed fields are reserved for Logstash's internal use, and removing them is not supported.

For example, with Logstash 1.3.3 the following configuration:

input { 
    generator {
        type => "timestrip"
        message => "This is a test message."
        count => 1
    }
}

filter {
    mutate {
        remove_field => ["@timestamp"]
    }
}

output {
    elasticsearch_http {
        host => "127.0.0.1"
        flush_size => 1
    }
}

causes Logstash to crash instead of indexing the event into ES:

NoMethodError: undefined method `tv_sec' for nil:NilClass
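Note that the failing sprintf call is in the elasticsearch output, which by default interpolates @timestamp into a daily index name like logstash-%{+YYYY.MM.dd}. As an untested guess, giving the output a static index name might sidestep that particular call, though Logstash may still rely on @timestamp elsewhere:

output {
    elasticsearch_http {
        host => "127.0.0.1"
        index => "myindex"    # static name, so no date interpolation is needed
        flush_size => 1
    }
}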
