I can successfully get Logstash to write records to an Elasticsearch index if the index doesn't already exist. But I can't seem to get Logstash to write records to an index that does exist.
For example, this scenario works PERFECTLY:
SCENARIO 1
I have a file called /root/setup/data/csv/data.csv with the following contents:
order_id
1
2
And I have a file called /root/setup/logstash/csv.conf with the following contents:
input {
  file {
    path => "/root/setup/data/csv/data.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    mode => "read"
    exit_after_read => true
    file_completed_action => "log"
    file_completed_log_path => "/root/logs"
  }
}

filter {
  csv {
    autodetect_column_names => true
  }
  mutate {
    convert => {
      "order_id" => "integer"
    }
  }
}

output {
  elasticsearch {
    hosts => ["elastic.myserver.net:9200"]
    ssl => true
    user => "elastic"
    password => "PASS123456789"
    index => "csv"
  }
}
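As a sanity check, the config can also be validated without actually running the pipeline (Logstash's --config.test_and_exit flag only checks the syntax, it doesn't process the file):

/usr/share/logstash/bin/logstash -f /root/setup/logstash/csv.conf --config.test_and_exit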
I then have a brand new installation of Elasticsearch and Kibana v8.6, and I run this command:
/usr/share/logstash/bin/logstash -f /root/setup/logstash/csv.conf
Everything works perfectly: I see a new index called csv with two records, each containing the proper data.
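For reference, I confirm the two documents with a count query along these lines (any equivalent search shows the same thing):

curl -u elastic:PASS123456789 "https://elastic.myserver.net:9200/csv/_count?pretty"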
SCENARIO 2
I make a brand new installation of Elasticsearch and Kibana. Then I run this command:
curl -X PUT -u elastic:PASS123456789 "https://elastic.myserver.net:9200/csv"
This creates an empty index called csv. Then I go through the same steps as in SCENARIO 1, but no new records are written to the csv index.
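The index definitely exists at that point; for example, it shows up when I list it:

curl -u elastic:PASS123456789 "https://elastic.myserver.net:9200/_cat/indices/csv?v"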
Why can't Logstash write records to an existing index?
NOTE: I also tried adding a mapping to the csv index, but it made no difference: Logstash still can't write to an index that already exists.
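The mapping I tried was roughly along these lines (reconstructed, not verbatim; order_id is typed to match the mutate filter in the config):

curl -X PUT -u elastic:PASS123456789 "https://elastic.myserver.net:9200/csv/_mapping" -H 'Content-Type: application/json' -d'
{
  "properties": {
    "order_id": { "type": "integer" }
  }
}'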