I am unable to get Logstash to read data and send it to an Elasticsearch index. My Elastic Stack is 8.11.2, running under Docker for Windows. I have two JSONL-formatted files that I need to index into Elasticsearch by a unique key. My logstash.conf is:
Logstash initializes with no errors, and I can see the input data in the container. However, there is no indication of any attempt to send data to Elasticsearch. No messages appear in the Logstash log after the message:
This codec will decode streamed JSON that is newline delimited. Encoding will emit a single JSON string ending in a @delimiter. NOTE: Do not use this codec if your source input is line-oriented JSON, for example, redis or file inputs. Rather, use the json codec. More info: this codec expects to receive a stream (string) of newline-terminated lines. The file input will produce a line string without a newline. Therefore this codec cannot work with line-oriented inputs.
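In other words, for file inputs the `json` codec should be used instead of `json_lines`. A minimal pipeline along those lines might look like this (the paths, host, index name, and key field are placeholders, not taken from the original post):

```
input {
  file {
    path => "/usr/share/logstash/data/*.jsonl"  # placeholder path
    start_position => "beginning"
    sincedb_path => "/dev/null"                 # testing only: re-read files on every run
    codec => json                               # one JSON object per line; NOT json_lines
  }
}

output {
  elasticsearch {
    hosts => ["https://elasticsearch:9200"]     # placeholder host
    index => "my-index"                         # placeholder index name
    document_id => "%{my_unique_key}"           # placeholder unique-key field
  }
}
```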
You were absolutely right. Solved my problem immediately in this case.
Is there some nuance about how `sincedb_path => "/dev/null"` works? After completing my first test, I changed sincedb_path to point to my custom sincedb file, then back to "/dev/null" again. Now I'm not getting any data to read with either configuration, even with new data introduced.
Two things come to mind. First, you should probably read in detail how sincedb works.

Second, I see you're creating your own document ID. You would probably want to look closely at that. We often see folks just overwriting the same documents over and over again. Unless you have a very specific reason to use your own document ID, the auto-generated document IDs are very foolproof.
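If you do have a reason to set your own document ID (e.g. deduplication on a business key), one common pattern is to derive the ID deterministically with the fingerprint filter, so that re-ingesting the same event updates the same document rather than creating duplicates. A sketch, with a placeholder key field not from this thread:

```
filter {
  fingerprint {
    source => ["my_unique_key"]       # placeholder field(s) that identify the event
    target => "[@metadata][doc_id]"   # @metadata is not indexed into the document
    method => "SHA256"
  }
}

output {
  elasticsearch {
    document_id => "%{[@metadata][doc_id]}"
  }
}
```

If every event's key fields are identical, this will keep overwriting one document, which is exactly the failure mode described above.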