Logstash CSV filter not working

I am trying to use the CSV filter in Logstash, but it cannot load the values from my file. I am using Ubuntu Server 14.04, Kibana 4, Logstash 1.4.2, and Elasticsearch 1.4.4. Below are the CSV file and the filter I wrote. Am I doing something wrong?


CSV file:

    Joao,21,555
    Miguel,24,1000
    Rodrigo,43,443
    Maria,54,2343
    Antonia,67,213

CSV Logstash Filter:

    # This is the filter that reads the file and loads the data into an Elasticsearch index
    input {
      file {
        path => ["/opt/logstash/bin/testeFile_lite.csv"]
        start_position => "beginning"
        # sincedb_path => "NIL"
      }
    }
    filter {
      csv {
        columns => ["nome", "idade", "salario"]
        separator => ","
      }
    }
    output {
      elasticsearch {
        action => "index"
        host => "localhost"
        index => "logstash-%{+YYYY.MM.dd}"
      }
      stdout { codec => rubydebug }
    }
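For reference, the parsing step that the csv filter performs can be approximated in Python (the column names come from the config above; this is only an illustration of the mapping from rows to named fields, not Logstash's actual implementation):

```python
import csv
import io

# Sample lines matching the question's CSV file
data = "Joao,21,555\nMiguel,24,1000\n"
columns = ["nome", "idade", "salario"]  # same columns as the csv filter config

events = []
for row in csv.reader(io.StringIO(data), delimiter=","):
    # Each parsed row becomes a dict of named fields, analogous to the
    # fields the csv filter adds to a Logstash event.
    events.append(dict(zip(columns, row)))

print(events[0])  # {'nome': 'Joao', 'idade': '21', 'salario': '555'}
```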

When I run Logstash, it only prints milestone warnings such as "Using milestone 2 input plugin 'file' ..." and "Using milestone 2 filter plugin 'csv' ...", and no events are ever output.

Can anybody help me?

csv logstash elk-stack
1 answer

I solved the problem by adding the sincedb_path setting to the file input.

Here's the CSV Logstash filter:

    input {
      file {
        path => "/opt/logstash/bin/testeFile_lite.csv"
        type => "testeFile_lite"
        start_position => "beginning"
        sincedb_path => "/opt/logstash/bin/dbteste"
      }
    }
    filter {
      csv {
        columns => ['nome', 'idade', 'salario']
        separator => ","
      }
    }
    output {
      elasticsearch {
        action => "index"
        host => "localhost"
        index => "xpto"
        cluster => "SIC_UTAD"
      }
      stdout { codec => rubydebug }
    }
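The fix works because the file input tracks how far it has already read each file in a sincedb file, and without a usable sincedb_path it may never (re)read the file from the start. The idea of sincedb-style offset tracking can be sketched in Python (the paths and function below are hypothetical illustrations, not Logstash code):

```python
import os

SINCEDB = "/tmp/dbteste_sketch"  # hypothetical sincedb-style offset file

def read_new_lines(path, sincedb=SINCEDB):
    """Return only the lines appended since the last recorded offset,
    mimicking (loosely) how a sincedb lets a reader resume where it left off."""
    offset = 0
    if os.path.exists(sincedb):
        with open(sincedb) as f:
            offset = int(f.read().strip() or 0)
    with open(path) as f:
        f.seek(offset)           # skip everything already processed
        lines = f.readlines()
        new_offset = f.tell()
    with open(sincedb, "w") as f:
        f.write(str(new_offset)) # remember how far we got
    return lines
```

On the first call this returns the whole file; on a second call with no new data it returns nothing, which is why a stale or unwritable sincedb can make a reader appear to "do nothing" on an existing file.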
