Logstash logs are read but not pushed to Elasticsearch

I have the following Logstash configuration:

logstash-configration.conf
input {
  file {
    codec => "json_lines"
    path => ["/etc/logstash/input.log"]
    sincedb_path => "/etc/logstash/dbfile"
    start_position => "beginning"
    ignore_older => "0"
  }
}
output {
  elasticsearch {
    hosts => ["192.168.169.46:9200"]
  }
  stdout {
    codec => rubydebug
  }
}
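
For reference, the configuration can be syntax-checked before starting the pipeline (a quick sanity check, assuming Logstash is run from its installation directory and the file is saved as /etc/logstash/logstash-configration.conf; adjust the path as needed):

# Validate the pipeline configuration and exit without starting the pipeline
bin/logstash -f /etc/logstash/logstash-configration.conf --config.test_and_exit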

The /etc/logstash/input.log file is populated with logs from a running Java application. The logs are in the following JSON format (shown pretty-printed here; in the file each entry is written on a single line, separated by the \n character):

log-input-formatted
{
  "exception": {
    "exception_class": "java.lang.RuntimeException",
    "exception_message": "Test runtime exception stack: 0",
    "stacktrace": "java.lang.RuntimeException: Test runtime exception stack: 0"
  },
  "@version": 1,
  "source_host": "WS-169-046",
  "message": "Test runtime exception stack: 0",
  "thread_name": "parallel-1",
  "@timestamp": "2019-12-02T16:30:14.084+02:00",
  "level": "ERROR",
  "logger_name": "nl.hnf.logs.aggregator.demo.LoggingTest",
  "aplication-name": "demo-log-aggregation"
}
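
Each line of the input file can be checked to be valid standalone JSON (another quick sanity check, assuming jq is installed on the host):

# Parse every line of the file; jq stops with an error at the first invalid line
jq -c . /etc/logstash/input.log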

I also updated the Logstash default template using the Elasticsearch API:

PUT request body sent to http://192.168.169.46:9200/_template/logstash?pretty:
{
"index_patterns": "logstash-*",
"version": 60002,
"settings": {
    "index.refresh_interval": "5s",
    "number_of_shards": 1
},
"mappings": {
    "dynamic_templates": [
        {
            "message_field": {
                "path_match": "message",
                "match_mapping_type": "string",
                "mapping": {
                    "type": "text",
                    "norms": false
                }
            }
        },
        {
            "string_fields": {
                "match": "*",
                "match_mapping_type": "string",
                "mapping": {
                    "type": "text",
                    "norms": false,
                    "fields": {
                        "keyword": {
                            "type": "keyword",
                            "ignore_above": 256
                        }
                    }
                }
            }
        }
    ],
    "properties": {
        "@timestamp": {
            "type": "date"
        },
        "@version": {
            "type": "keyword"
        },
        "source_host": {
            "type": "keyword"
        },
        "message": {
            "type": "text"
        },
        "thread_name": {
            "type": "text"
        },
        "level": {
            "type": "keyword"
        },
        "logger_name": {
            "type": "keyword"
        },
        "aplication_name": {
            "type": "keyword"
        },
        "exception": {
            "dynamic": true,
            "properties": {
                "exception_class": {
                    "type": "text"
                },
                "exception_message": {
                    "type": "text"
                },
                "stacktrace": {
                    "type": "text"
                }
            }
        }
    }
}
}

Elasticsearch responds with "acknowledged": true, and I can see the template being updated via the API.
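
The stored template can be verified with a GET on the same endpoint (same host as above):

# Fetch the stored template to confirm the mappings were applied
curl -s 'http://192.168.169.46:9200/_template/logstash?pretty'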

Now, starting Logstash with the debug log level, I can see the input lines being read but not sent to Elasticsearch; the index is created but stays empty:

[2019-12-03T09:30:51,655][DEBUG][logstash.inputs.file     ][custom] Received line {:path=>"/etc/logstash/input.log", :text=>"{\"@version\":1,\"source_host\":\"ubuntu\",\"message\":\"Generating some logs: 65778 - 2019-12-03T09:30:50.775\",\"thread_name\":\"parallel-1\",\"@timestamp\":\"2019-12-03T09:30:50.775+00:00\",\"level\":\"INFO\",\"logger_name\":\"nl.hnf.logs.aggregator.demo.LoggingTest\",\"aplication-name\":\"demo-log-aggregation\"}"}
[2019-12-03T09:30:51,656][DEBUG][filewatch.sincedbcollection][custom] writing sincedb (delta since last write = 1575365451)
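
The Logstash node stats API also shows whether events make it from the input stage to the output stage (assuming the API is listening on its default port 9600 on the Logstash host):

# Per-pipeline event counters: compare events.in with events.out, plus per-plugin stats
curl -s 'http://localhost:9600/_node/stats/pipelines?pretty'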

Output of the http://192.168.169.46:9200/_cat/indices?v call:

health | status | index                      | uuid                   | pri | rep | docs.count | docs.deleted | store.size | pri.store.size
green  | open   | logstash-2019.12.03-000001 | ADb37pLARoWJQah5RFFuuA | 1   | 0   | 0          | 0            | 283b       | 283b

As you can see, the index is created, but docs.count stays at 0 and no data is pushed to Elasticsearch.
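
The document count can also be checked directly against the index pattern (same cluster as above):

# Count documents across all logstash-* indices
curl -s 'http://192.168.169.46:9200/logstash-*/_count?pretty'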

Also, the Elasticsearch logs are set to the debug level as well, but I don't see any errors there or anything that could hint at the source of the problem.

Does anyone have an idea or suggestion as to why the logs are not being pushed to Elasticsearch?

Can you share your Logstash configuration for further analysis?

Check the second line of the post.

It seems that the json_lines codec is not compatible with the file input (the documentation is not very explicit about this): the file input already splits the stream into lines, so line-delimited JSON should be parsed with the json codec instead.
I fixed the issue by using:

input {
  file {
    codec => "json"
    path => ["/etc/logstash/input.log"]
  }
}
output {
  elasticsearch {
    hosts => ["192.168.169.46:9200"]
  }
  stdout {
    codec => rubydebug
  }
}
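
As a quick way to confirm that the json codec parses these events (a minimal test, assuming Logstash is run from its installation directory), a throwaway pipeline can be fed a single log line on stdin:

# Paste one JSON log line on stdin; the rubydebug output should show the parsed fields
bin/logstash -e 'input { stdin { codec => json } } output { stdout { codec => rubydebug } }'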
