My application works: when I POST something through my URIs, Elasticsearch automatically creates a new index. But when I then try to put more data into that same index (same name) through Logstash, I get an indexing error, even though I am posting the same fields with the same data types.
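For context, the documents I index through the API have the same fields as the CSV below. A request would look roughly like this (the values are just illustrative, not my real data, and the type name doctor is the one that shows up in the mapping further down):

curl -XPOST "localhost:9200/doctor/doctor" -H "Content-Type: application/json" -d '
{
  "id": 1,
  "age": 45,
  "name": "John",
  "profession": "surgeon"
}'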
Here is my Logstash configuration:
input {
  file {
    path => "C:\Users\USERS\ELK\SpringBootElasticSearchAPP\DATABASE\CSV\DOCTOR\doctor.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["id", "age", "name", "profession"]
  }
  mutate { convert => ["id", "integer"] }
  mutate { convert => ["age", "integer"] }
  ruby { code => "event.remove('@version')" }
  ruby { code => "event.remove('message')" }
  ruby { code => "event.remove('path')" }
  ruby { code => "event.remove('@timestamp')" }
  ruby { code => "event.remove('host')" }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "doctor"
  }
  stdout { codec => rubydebug }
}
Well, I changed my Logstash index name to doctest (index => "doctest") and it worked fine, creating a new index. Just to see the difference, I compared the two mappings. Could it be something with the mapping order or the field names?
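For reference, the mappings shown below can be pulled with requests along these lines (the property bodies are omitted, since they look identical):

curl -XGET "localhost:9200/doctor/_mapping?pretty"
curl -XGET "localhost:9200/doctest/_mapping?pretty"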
Index created using the API request (index: doctor):
{
  "mapping": {
    "doctor": {
      "properties": { .... }
    }
  }
}
Index created using Logstash (index: doctest):
{
  "mapping": {
    "doc": {
      "properties": { .... }
    }
  }
}
The properties are the same and in the same order; the only difference I can see is the type name (doctor vs doc).
I just want to fix this so that I can load multiple documents from my CSV file into Elasticsearch with Logstash and keep posting through my API requests.
I'm really stuck here, any help is appreciated.