Hello, I am creating a Logstash config file to import data from a CSV file.
When I run the config file as it is, it works properly and the data shows up in Kibana.
But when I change the data type of a field in the Logstash config file, it stops working.
Below is the config file:
input {
  file {
    path => "C:/elk/user.csv"
    start_position => beginning
    sincedb_path => "Null"
  }
}

filter {
  csv {
    columns => [
      "customerid",
      "installationid",
      "stageid",
      "serviceid",
      "time",
      "active",
      "blocked",
      "created"
    ]
    separator => ","
    mutate { convert => ["active", "integer"] }
    mutate { convert => ["blocked", "integer"] }
    mutate { convert => ["created", "integer"] }
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    action => "index"
    hosts => ["127.0.0.1:9200"]
    index => "usertest"
  }
}
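For reference, here is a minimal sketch of how I think the type conversion might need to be written instead, with the mutate filters moved out of the csv block so they sit next to it as separate filters (I am only guessing that the nesting is the problem; the column names and types are the same as above):

filter {
  csv {
    separator => ","
    columns => [
      "customerid", "installationid", "stageid", "serviceid",
      "time", "active", "blocked", "created"
    ]
  }
  # mutate is its own filter plugin, not an option inside csv
  mutate {
    convert => {
      "active"  => "integer"
      "blocked" => "integer"
      "created" => "integer"
    }
  }
}

I am also not sure whether the existing usertest index would have to be deleted and re-created before the new field types show up in Kibana.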
Please tell me what is wrong with my original config.
Thanks in advance.