Upload my logs from Node to Elasticsearch

Hi, I'm new to Elasticsearch.
I want my application to upload its logs to my Elasticsearch server while it's running.
My application uses Node (TypeScript), and I can't figure out how to upload a JSON log I created to the server.
Do I have to create my own log file and only then send it to the ES server?
Please help :slight_smile:

Writing to a local log file and then having e.g. Filebeat read it and forward it to Elasticsearch is generally a good practice, as it provides buffering on disk in case of issues with your Elasticsearch cluster. If you send data directly, any issues with Elasticsearch or connectivity to it could potentially affect your application.
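
For the application side, a minimal sketch of what writing newline-delimited JSON (NDJSON) log lines from TypeScript might look like; the file path, field names, and the writeLog helper are just assumptions for illustration:

```typescript
import { appendFile } from 'node:fs/promises';

// Hypothetical path; Filebeat would be pointed at this file.
const LOG_FILE = '/var/log/myapp/app.ndjson';

// Append one JSON object per line (NDJSON), which log shippers can decode line by line.
async function writeLog(
  level: string,
  message: string,
  extra: Record<string, unknown> = {}
): Promise<void> {
  const entry = { '@timestamp': new Date().toISOString(), level, message, ...extra };
  await appendFile(LOG_FILE, JSON.stringify(entry) + '\n', 'utf8');
}

writeLog('info', 'application started').catch(console.error);
```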


Could you point me to some code that does that, or to a guide?

As @Christian_Dahlqvist said, you could send your logs directly to Elasticsearch via the HTTP API, but using an intermediate log file can be useful as a persistent buffer and queue.
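
If you do decide to send directly, a rough sketch using the official @elastic/elasticsearch Node.js client (v8-style API) could look like the following; the node URL and the app-logs index name are assumptions, and error handling and retries are left out:

```typescript
import { Client } from '@elastic/elasticsearch';

// Assumption: a local, unsecured cluster; adjust node/auth for your setup.
const client = new Client({ node: 'http://localhost:9200' });

async function logToEs(level: string, message: string): Promise<void> {
  // Index a single log event as a document.
  await client.index({
    index: 'app-logs', // hypothetical index name
    document: {
      '@timestamp': new Date().toISOString(),
      level,
      message,
    },
  });
}

logToEs('info', 'application started').catch(console.error);
```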

If you choose to write a log file and ingest it via Filebeat, the Filebeat configuration documentation can get you started. In particular, the inputs documentation, the Elasticsearch output documentation, and the JSON decoding documentation could be of use.
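
As a rough starting point (not a complete configuration), a filebeat.yml along these lines would tail the JSON log file and ship it to Elasticsearch; the log path and host are assumptions:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.ndjson   # hypothetical path to the application's JSON logs
    json.keys_under_root: true    # lift decoded JSON fields to the top level of the event
    json.add_error_key: true      # record JSON decoding errors on the event

output.elasticsearch:
  hosts: ["http://localhost:9200"]  # assumption: local, unsecured cluster
```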

For the new fields added by decoding your JSON data, I would strongly recommend adding them to the index template using the setup.template.append_fields setting. This ensures Elasticsearch interprets your data types correctly.
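
For example (the field names here are made up; use the fields your own JSON log entries actually contain), the template additions could look like:

```yaml
setup.template.append_fields:
  - name: level             # hypothetical string field from the decoded JSON
    type: keyword
  - name: response_time_ms  # hypothetical numeric field
    type: long
```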

If you run into any roadblocks along the way, please don't hesitate to ask for specific advice.

