Centralizing logs: load file content into Elasticsearch, or as attachments?

Hello,

My situation:
Agents deliver several log files every day (< 50 KB per file) on each server (about 100 servers, Windows or Linux). I would like to load these logs into Elasticsearch so that I can consult them easily from a single, centralized place.

  1. Loading the logs into ES:
    Each log file name carries useful information, so the filename should become a field (or be split into 2 or 3 fields) in my Elasticsearch documents; or maybe I should put the filename into the index name instead? (See the first sketch after this list for what I have in mind.)

  2. File content:
    Elasticsearch is generally used to index typed data (strings, numbers, dates, etc.), so my plan is to load every line of every file into the database, as in the sketch after this list...
    But I believe it is also possible to load a whole file as an attachment? Is that possible, and what is the good practice (advantages/disadvantages) compared with indexing all the lines in ES? There is a lot of data to load, so what is the best way?

  3. If I want to load all my log files into ES, I probably need to think about the index architecture: for example, should I create one index per file? Is that good practice? (The second sketch below shows the two layouts I am weighing.)
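
To make points 1 and 2 more concrete, here is a minimal sketch of what I have in mind, using the Python Elasticsearch client. The filename pattern (`app_server_date.log`), the field names, the index name and the drop directory are just assumptions for illustration:

```python
import re
from pathlib import Path

from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

es = Elasticsearch("http://localhost:9200")

# Assumed filename pattern, e.g. "billing_srv042_2016-03-01.log"
FILENAME_RE = re.compile(r"(?P<app>[^_]+)_(?P<server>[^_]+)_(?P<date>\d{4}-\d{2}-\d{2})\.log")

def actions_for(path: Path):
    """Yield one bulk action per log line, carrying fields parsed from the filename."""
    fields = {"filename": path.name}
    match = FILENAME_RE.match(path.name)
    if match:
        fields.update(match.groupdict())  # adds app, server and date fields
    with path.open(encoding="utf-8", errors="replace") as f:
        for line_no, line in enumerate(f, start=1):
            yield {
                "_index": "logs-centralized",  # hypothetical single index
                "_source": {**fields, "line_no": line_no, "message": line.rstrip("\n")},
            }

# Index every log file found in a (hypothetical) drop directory.
for log_file in Path("/data/incoming_logs").glob("*.log"):
    bulk(es, actions_for(log_file))
```

Is indexing line by line like this reasonable for the volume I described, or would the attachment approach serve me better?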
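
And for point 3, these are the two index layouts I am hesitating between, again only a sketch with made-up names:

```python
from datetime import date
from pathlib import Path

def index_per_file(path: Path) -> str:
    """Option A: one index per log file, the name derived from the filename."""
    return "logs-" + path.stem.lower().replace("_", "-")

def index_per_day(day: date) -> str:
    """Option B: one daily index shared by all files, keeping the filename as a field."""
    return f"logs-{day:%Y.%m.%d}"

print(index_per_file(Path("billing_srv042_2016-03-01.log")))  # logs-billing-srv042-2016-03-01
print(index_per_day(date(2016, 3, 1)))                        # logs-2016.03.01
```

With 100 servers and several files per day, option A would create a large number of small indices very quickly, which is why I am asking whether it is good practice.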

I would really appreciate some clues to help me design this centralized-logging architecture...

Thanks
