Data type conversion using logstash grok

Basic is a floating point field. The specified index does not already exist in Elasticsearch. When I run the configuration file with logstash -f I do not get an exception, but the data that ends up in Elasticsearch shows Basic as a string. How can I fix this? And how can I do this for multiple fields?

input {  
      file {
          path => "/home/sagnik/work/logstash-1.4.2/bin/promosms_dec15.csv"
          type => "promosms_dec15"
          start_position => "beginning"
          sincedb_path => "/dev/null"
      }
}
filter {
    grok{
        match => [
            "Basic", " %{NUMBER:Basic:float}"
        ]
    }

    csv {
        columns => ["Generation_Date","Basic"]
        separator => ","
    }  
    ruby {
          code => "event['Generation_Date'] = Date.parse(event['Generation_Date']);"
    }

}
output {  
    elasticsearch { 
        action => "index"
        host => "localhost"
        index => "promosms-%{+dd.MM.YYYY}"
        workers => 1
    }
}
1 answer

You have two problems. First, your grok filter is listed before the csv filter, and because filters are applied in order, there is no "Basic" field to convert when the grok filter is applied.
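
As a rough sketch (not part of the original answer), the filter section needs the csv filter first, reusing the column names from the question; the actual conversion step is discussed below:

filter {
    # csv runs first so that the "Basic" field exists before any conversion
    csv {
        columns => ["Generation_Date","Basic"]
        separator => ","
    }
    # type conversion (grok with overwrite, or mutate/convert) goes here
}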

Second, grok won't overwrite existing fields unless you explicitly allow it. In other words,

grok{
    match => [
        "Basic", " %{NUMBER:Basic:float}"
    ]
}

is a no-op. Adding overwrite => ["Basic"] would make it work, but you are better off using the mutate filter's convert feature:

mutate {
    convert => ["Basic", "float"]
}
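
To cover the "multiple fields" part of the question: convert accepts several field/type pairs, so a sketch along these lines should work (the second column, Advance, is a made-up example and not in the original config):

mutate {
    # Convert each listed field to the requested type;
    # "Advance" is a hypothetical extra numeric column used for illustration
    convert => [
        "Basic", "float",
        "Advance", "float"
    ]
}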
