I'm facing a problem when creating an Elasticsearch index. I'm new to this, so please let me know what I can do to fix it.
I'm pasting the two config files I've used for reference. The one at the top is working; I used it just to check whether a simple load works, and it does.
However, I need to build a Kibana visualization based on certain values from the fields, so I'm using the second block below, and that's where the problem starts.
It looks to me like the error is around the else-if conditions and the way the add_field is set. Instead, you could use nested else conditions; try the block below:
if "data_one" in [data] {
  mutate {
    add_field => { "Description" => ["Snap"] }
  }
} # end if "data_one"
else {
  if "data_two" in [data] {
    mutate {
      add_field => { "Description" => ["Snaptwo"] }
    }
  } # end if "data_two"
  else {
    if "data_three" in [data] {
      mutate {
        add_field => { "Description" => ["Snapthree"] }
      }
    } # end if "data_three"
  } # end 2nd else
} # end 1st else
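For comparison, Logstash conditionals also accept a flattened else if form, so the same logic can be written without nesting. This is just a sketch of the equivalent, with the field name kept as Description throughout:
if "data_one" in [data] {
  mutate { add_field => { "Description" => "Snap" } }
} else if "data_two" in [data] {
  mutate { add_field => { "Description" => "Snaptwo" } }
} else if "data_three" in [data] {
  mutate { add_field => { "Description" => "Snapthree" } }
}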
@Badger No, that's not the problem here. As I stated above, I've used a similar path for the input file in both configs, and the index is generated for the first one but not for the second. So clearly something is wrong in the later part of it.
@pranayv66 - here is a conf that is working on my end; a sample index is created, with the if conditions and the column conversions executed.
input {
  file {
    path => "C:/Users/pranay/data.csv"
    start_position => "beginning"
    # sincedb keeps track of how far into the file Logstash has read
    sincedb_path => "C:/Users/pranay/.since.sample.log"
  }
}
filter {
  csv {
    columns => [ "Desc","time","util","data" ]
    convert => {
      "time" => "float"
      "util" => "float"
    }
  }
  if "data_one" in [data] {
    mutate {
      add_field => { "Description" => ["Snap"] }
    }
  } # end if "data_one"
  else {
    if "data_two" in [data] {
      mutate {
        add_field => { "Description" => ["Snaptwo"] }
      }
    } # end if "data_two"
    else {
      if "data_three" in [data] {
        mutate {
          add_field => { "Description" => ["Snapthree"] }
        }
      } # end if "data_three"
    } # end 2nd else
  } # end 1st else
  mutate {
    remove_field => [ "column1","column2","column3","column4" ]
  }
}
output {
  stdout { codec => json }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test"
  }
}
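If the index still doesn't show up, it can help to confirm on the console that the filter is adding Description before worrying about Elasticsearch. A minimal sketch of an output block I'd swap in while testing (same pipeline, just a more readable codec):
output {
  # prints each event in a readable form so you can check the fields being produced
  stdout { codec => rubydebug }
}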
Sample CSV - don't include column/header names in the file, just the data to ingest:
abc,20.8,19.75,data_three
xya,34.5,19.5,data_two
As a suggestion, don't forget to delete your sincedb file when you're re-running tests (to reload the data); this file tracks the last object/row processed.
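If you don't want to keep deleting it by hand while testing, you can also point sincedb at a throwaway location so every run re-reads the file. A minimal sketch, assuming Windows (use "/dev/null" instead of "NUL" on Linux/macOS):
input {
  file {
    path => "C:/Users/pranay/data.csv"
    start_position => "beginning"
    # "NUL" discards the sincedb state on Windows, so the file is re-ingested on each run
    sincedb_path => "NUL"
  }
}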