My subject:
Agents on each of my servers (about 100 servers, Windows and Linux) produce multiple log files every day (under 50 KB per file). I would like to load these logs into ES so I can consult them easily from a single centralized place.
Loading logs into ES:
Each log file name contains useful information, so the filename should become a field (or be split into 2 or 3 fields) in my Elasticsearch documents. Or should I instead encode the filename in the index name?
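To illustrate the "split the filename into fields" option, here is a minimal sketch. The naming convention `application_hostname_YYYY-MM-DD.log`, the regex, and the field names are all assumptions to adapt to your real filenames:

```python
import re

# Hypothetical filename pattern: <application>_<hostname>_<YYYY-MM-DD>.log
# Adjust the regex and group names to your actual naming convention.
FILENAME_RE = re.compile(
    r"^(?P<application>[^_]+)_(?P<hostname>[^_]+)_(?P<date>\d{4}-\d{2}-\d{2})\.log$"
)

def filename_to_fields(filename: str) -> dict:
    """Split a log filename into separate document fields."""
    match = FILENAME_RE.match(filename)
    if match is None:
        raise ValueError(f"unexpected filename: {filename}")
    return match.groupdict()

print(filename_to_fields("billing_srv042_2024-01-15.log"))
# {'application': 'billing', 'hostname': 'srv042', 'date': '2024-01-15'}
```

Storing these as fields (rather than baking them into the index name) keeps them queryable and aggregatable like any other field.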
Content of the files:
Elasticsearch is generally used to index data of types like string, number, date, etc. So I plan to load every line of each file into the database.
But I think it is also possible to load a whole file as an attachment? Is that possible, and what is the good practice (advantages/disadvantages) of loading all lines into ES? There is a lot of data to load, so what is the best approach?
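For the line-by-line option, each log line typically becomes one document, with the filename-derived fields repeated on every document so each line stays independently searchable. A sketch of that shaping step (function and field names are my own, not an official API):

```python
def lines_to_documents(filename: str, fields: dict, lines: list) -> list:
    """Build one document per log line. The fields parsed from the
    filename are copied onto every document so a single line can be
    found by application, hostname, date, etc."""
    docs = []
    for line_number, line in enumerate(lines, start=1):
        doc = dict(fields)              # copy the filename-derived fields
        doc["source_file"] = filename   # keep the original filename too
        doc["line_number"] = line_number
        doc["message"] = line.rstrip("\n")
        docs.append(doc)
    return docs

docs = lines_to_documents(
    "billing_srv042_2024-01-15.log",
    {"application": "billing", "hostname": "srv042"},
    ["service started\n", "payment failed: timeout\n"],
)
print(len(docs))  # 2
```

Documents shaped like this can then be sent in batches with the official Python client's `elasticsearch.helpers.bulk`, which is much faster than indexing one line at a time.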
If I load all my log files into ES, maybe I need to work on the architecture: for example, create one index per file? Is that a good practice?
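On the index layout: one index per file would mean thousands of tiny indices (each with its own shard overhead), whereas the common pattern for logs is one index per time period, typically per day, as Logstash does by default. A sketch of that naming scheme (the `logs` prefix is an assumption):

```python
from datetime import date

def daily_index_name(log_date: date, prefix: str = "logs") -> str:
    """Time-based index name, e.g. 'logs-2024.01.15'. One index per
    day keeps the shard count bounded, and old indices can simply be
    deleted when the retention period expires."""
    return f"{prefix}-{log_date.strftime('%Y.%m.%d')}"

print(daily_index_name(date(2024, 1, 15)))  # logs-2024.01.15
```

With daily indices, the filename stays a field inside the documents, and retention is handled by dropping whole indices instead of deleting individual documents.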
I would really appreciate some pointers to help me design this centralized-log architecture.