How to use Logs in Logstash format with Filebeat

Hello guys,

I'm at my wit's end and I need your advice.
We use Filebeat to send our logs to Elasticsearch (Elastic Cloud), and the Nginx and system logs are working fine so far. Now we would like to send the logs from our Node.js applications to Elasticsearch via Filebeat as well. These are written to files with the help of Winston in Logstash format. Unfortunately they are not visible in Kibana. What have we missed here?

Here is an example from the logs:

{"@message":"Lorem Ipsum: {\"Foo\":{\"_id\":\"351b3fb9-fdca-4df5-8737-65d75e65c1f5\",\"country\":\"DE\"}}","@timestamp":"2020-04-25T11:36:30.654Z","@fields":{"context":"lorem ipsum","level":"info"}}

PS: I didn't set up an ingest pipeline because I thought parsing would be handled by the Logstash format. Am I wrong? If so, what would the grok pattern have to look like?

I've checked the Winston Logstash format at https://github.com/getninjas/winston-logstash-format.
It looks like just a JSON logger with two legacy fields that existed in Logstash 2.x.

I would suggest an ECS-based logger instead, such as Elastic's ecs-winston-format.
It is part of a broader project by Elastic (ecs-logging) to provide loggers that generate logs ready to be sent to Elasticsearch.
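
For comparison, an ECS-formatted log line would look roughly like this (a sketch based on the ecs-logging spec, reusing your message and timestamp; the exact fields vary by library version):

{"@timestamp":"2020-04-25T11:36:30.654Z","log.level":"info","message":"Lorem Ipsum","ecs":{"version":"1.5.0"}}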

In both cases (whether you use the Logstash Winston format or the ECS Winston format), this is the Filebeat input configuration to decode the JSON files:

filebeat.inputs:
- type: log
  paths:
    - /path/to/nodejs/app/*.log
  # Decode each line as JSON and place the fields at the top level of the event
  json.keys_under_root: true
  # Add an error key to the event if JSON decoding fails
  json.add_error_key: true
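
One thing to watch out for: in your sample, @message itself contains free text followed by an escaped JSON document, and the json.* options above only decode the outer line. If you also want the embedded fields (Foo._id, country) as separate fields, a processor chain like the following could work. This is only a sketch assuming every line has the shape "text: {json}"; the prefix and payload field names are placeholders I made up:

processors:
  # Split the free-text prefix from the embedded JSON ("Lorem Ipsum: {...}")
  - dissect:
      tokenizer: "%{prefix}: %{payload}"
      field: "@message"
      target_prefix: ""
  # Decode the extracted JSON string into structured fields under payload_decoded
  - decode_json_fields:
      fields: ["payload"]
      target: "payload_decoded"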

Could you share your filebeat.yml file?

Do you specify a destination index, or do you send everything to the default index (filebeat-...)?
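
If you do use a custom index, for reference it would look roughly like this in filebeat.yml (the nodejs-logs name is just an example; overriding the index also requires setting a matching template name and pattern, and depending on your version ILM may need to be disabled for the setting to take effect):

output.elasticsearch:
  # Example custom index name instead of the default filebeat-* index
  index: "nodejs-logs-%{+yyyy.MM.dd}"
# A custom index requires a matching template name and pattern
setup.template.name: "nodejs-logs"
setup.template.pattern: "nodejs-logs-*"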

Do you see any mapping errors in the Filebeat or Elasticsearch logs?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.