Hi,
I want to export some data from old indices and write it to a text file. When I restart Logstash, it exports part of the data (a fraction of a single day, although the index covers a complete month) and then sits idle. I am using the latest version 7 on this machine.
Config:
input {
  elasticsearch {
    hosts   => [ "localhost:9200" ]
    query   => '{ "query": { "query_string": { "query": "*" } } }'
    index   => "home-2023.09*"
    docinfo => true
  }
}

filter {
  translate {
    field            => "edomi-KO-ID"
    destination      => "translated_data"
    dictionary_path  => "/etc/logstash/conf.d/lookup_edomi-KNXGA.csv"
    fallback         => "NOT_FOUND"
    refresh_interval => 3600
  }
  if [translated_data] == "NOT_FOUND" {
    drop {}
  }
  translate {
    field            => "edomi-KO-ID"
    destination      => "edomi-GA_KO"
    dictionary_path  => "/etc/logstash/conf.d/lookup_edomi-KNXGA.csv"
    refresh_interval => 3600
    fallback         => "null"
  }
}

output {
  csv {
    fields => [ "@timestamp", "Funktion", "Geschoss", "Gewerk", "KNX-Name", "KNX-Wert", "KNX-Wert-Float", "Name", "Raum", "edomi-GA_KO", "edomi-KO-ID", "edomi-PA", "edomi-Typ" ]
    path   => "/opt/export/knx2-%{+YYYY-MM-dd}.txt"
  }
}
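(Side note: since both translate blocks read the same dictionary, I believe they could be collapsed into one, checking the fallback marker directly on edomi-GA_KO; a sketch, with the same behavior as far as I can tell:)

filter {
  translate {
    field            => "edomi-KO-ID"
    destination      => "edomi-GA_KO"
    dictionary_path  => "/etc/logstash/conf.d/lookup_edomi-KNXGA.csv"
    fallback         => "NOT_FOUND"
    refresh_interval => 3600
  }
  # the key was not in the CSV: the fallback marker was set, so drop the event
  if [edomi-GA_KO] == "NOT_FOUND" {
    drop {}
  }
}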
Edit:
What I want to achieve:
- read the documents from the index configured in the input section
- check whether the value of edomi-KO-ID exists as a key in the first column of the lookup file
- if it does not, drop the event
- if it does, use the lookup file to add the translated value to the field edomi-GA_KO
- write the events to a file that contains the date in its name (as I understand it, %{+YYYY-MM-dd} is filled from each event's @timestamp, so old events should end up in files named after their original date)
The lookup table has roughly 70 lines; the exported file has 602 lines.
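For reference, the translate dictionary is a plain two-column CSV (lookup key in the first column, replacement value in the second); these entries are made up just to show the shape:

1234,1/2/3 Licht Wohnzimmer
5678,2/0/1 Jalousie Kueche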
Am I missing something? Could this be a caching issue or something?
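In case the paging is the problem, this is what I would try next: a sketch, assuming the input plugin's size and scroll options are the right knobs (the values are guesses):

input {
  elasticsearch {
    hosts   => [ "localhost:9200" ]
    index   => "home-2023.09*"
    query   => '{ "query": { "query_string": { "query": "*" } } }'
    docinfo => true
    size    => 1000   # documents per page; 1000 is already the default
    scroll  => "5m"   # keep the scroll context alive longer than the default "1m"
  }
}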
Thanks for your help!