Best way to add metadata stored on external server?

Hi,

On our nightly build systems we currently have multiple directories prepared for installation. It looks like this:

/nightly/test_1
/nightly/test_2
/nightly/test_3

After the automatic deployment I have additional metadata, created during the deployment process, which I would like to merge:

test_1 -> branch_3773
test_2 -> branch_8823
test_3 -> branch_8826

Log files are written to /nightly/test_x/logs/*.log.

Now I want to add this branch metadata to each log event.
What is the most efficient way to do this? The data only changes once a day.

We are running multiple Logstash instances, so if the metadata changes, all instances need to know about it.

For performance reasons I would like to have this metadata stored and cached locally inside Logstash. I assume the easiest way is to place this metadata in a file on the source host.
What possibilities do I have then?

  • using the translate filter plugin
    • con: Logstash would need to access a file on the source server. I don't know if this would be allowed in production.
    • what would be the way for Logstash to access this file in an external folder?
  • shipping the file with Filebeat to Logstash, having Logstash write it to Elasticsearch, and using the elasticsearch filter plugin to read the current metadata
    • con: I tried something similar in the past, and the performance impact was very big when running an Elasticsearch query for each log line. Is there some way of caching inside Logstash so that the query runs only once a minute or so?
  • setting up a database, writing the metadata into it, and using jdbc_static or jdbc_streaming to get the metadata cached in all Logstash instances
    • con: additional infrastructure needed (a database)
  • other approaches? What do you suggest?
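For the first option, here is a minimal sketch of a translate filter that maps the test directory onto a branch, assuming the deployment job publishes the mapping as a YAML dictionary file readable from each Logstash host (the file path and the `test_dir` field name are hypothetical):

```yaml
# /etc/logstash/branch_map.yml (hypothetical path),
# regenerated once a day by the deployment job
"test_1": "branch_3773"
"test_2": "branch_8823"
"test_3": "branch_8826"
```

```
filter {
  translate {
    source           => "[test_dir]"                  # field holding e.g. "test_1", extracted from the log path
    target           => "[branch]"
    dictionary_path  => "/etc/logstash/branch_map.yml"
    refresh_interval => 300                           # re-read the file every 5 minutes
    fallback         => "unknown"                     # value when no mapping is found
  }
}
```

Note that `dictionary_path` is read locally by each Logstash instance and cached in memory between refreshes, so the file would have to be distributed to (or mounted on, e.g. via NFS) every Logstash host, rather than fetched from the source server per event.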
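For the database option, jdbc_static already provides the caching behaviour you are after: it copies the remote table into a local in-memory database on a schedule, and all per-event lookups run against that local copy. A sketch, assuming a hypothetical PostgreSQL table `branch_map(test_dir, branch)` and connection details that are placeholders:

```
filter {
  jdbc_static {
    jdbc_driver_class      => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://dbhost:5432/meta"   # hypothetical
    jdbc_user              => "logstash"
    loader_schedule        => "0 2 * * *"   # refresh the local copy once a day

    # copy the remote table into the local in-memory database
    loaders => [
      {
        id          => "branches"
        query       => "SELECT test_dir, branch FROM branch_map"
        local_table => "branches"
      }
    ]
    local_db_objects => [
      {
        name          => "branches"
        index_columns => ["test_dir"]
        columns       => [
          ["test_dir", "varchar(64)"],
          ["branch", "varchar(64)"]
        ]
      }
    ]

    # per-event lookup against the local copy only
    local_lookups => [
      {
        query      => "SELECT branch FROM branches WHERE test_dir = :dir"
        parameters => { dir => "[test_dir]" }
        target     => "branch"
      }
    ]
  }
}
```

Each Logstash instance refreshes its own local copy on the `loader_schedule`, so the per-event lookup never touches the network.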
