Logstash output question


(FREDERIC) #1

Hi all,
I'd like to know if it's possible to write some custom text to a file while uploading a CSV into Elasticsearch.


(Magnus Bäck) #2

Logstash doesn't send files to Elasticsearch. It sends JSON documents, in your case created from CSV files. Do you want to create additional fields/properties in the JSON document created from each CSV line, or what do you want to accomplish?
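For reference, a minimal pipeline of the kind discussed here might look like the sketch below. The paths, column names, and index name are only placeholders, and the exact field carrying the source filename (path vs. [log][file][path]) depends on the Logstash version and ECS compatibility settings.

    input {
      file {
        path => "/data/incoming/*.csv"        # hypothetical location of the CSV files
        start_position => "beginning"
        sincedb_path => "/dev/null"           # reread files on every run; adjust for production
      }
    }
    filter {
      csv {
        separator => ","
        columns => ["id", "name", "value"]    # placeholder column names
      }
    }
    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "csv-data"                   # placeholder index name
      }
    }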


(FREDERIC) #3

Besides sending the JSON documents to ES, I'd like to write a text file containing the name of the processed CSV and the number of records correctly uploaded to ES. I need this to manage the CSV reports better.


(Magnus Bäck) #4

Didn't you ask the same question a few days ago? I don't think there's a simple way of doing it. But can't you extract that information from ES instead? If you save the CSV filename in a field you can aggregate on the filename field and get back the document count per filename.
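As a sketch of that suggestion: the filename recorded by the file input (in the path field on older Logstash versions, or [log][file][path] with ECS compatibility enabled) can be copied into a dedicated field in the filter section. The field name source_file is just an example.

    filter {
      mutate {
        # copy the originating CSV path into its own field so it can be aggregated on later
        add_field => { "source_file" => "%{path}" }
      }
    }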


(FREDERIC) #5

Okay, but do you have any idea how this could be achieved? If you could point me to somewhere to start, that would be great; otherwise I'm happy to hear other recommendations on how to do it from ES, as you just mentioned. Thank you.


(Magnus Bäck) #6

Use a terms aggregation to query ES and get back a table of how many documents there are for each CSV file.
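For example, assuming the filename was stored in a source_file field and dynamic mapping created the usual .keyword sub-field, a query along these lines (shown in Kibana console syntax) returns the document count per CSV file:

    GET csv-data/_search
    {
      "size": 0,
      "aggs": {
        "docs_per_file": {
          "terms": {
            "field": "source_file.keyword",
            "size": 1000
          }
        }
      }
    }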


(FREDERIC) #7

OK, thanks, but there's another issue: every time I load records into an index from a CSV, not ALL of them get added; many of them end up updating pre-existing records instead, so it's difficult to tell whether everything went as it should or whether there were any errors/exceptions.

