Logstash not sending data to Elasticsearch

I am unable to get Logstash to read data and send it to an Elasticsearch index. My Elastic Stack is 8.11.2, running under Docker for Windows. I have two JSONL-formatted files that I need to index into Elasticsearch by a unique key. My logstash.conf is:

```
input {
  file {
    path => "/usr/share/logstash/companies.jsonl"
    mode => "read"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => json_lines
    file_chunk_size => 524288
  }
}

output {
  elasticsearch {
    hosts => "${ELASTIC_HOSTS}"
    user => "${ELASTIC_USER}"
    password => "${ELASTIC_PASSWORD}"
    ssl_certificate_authorities => ["/usr/share/logstash/certs/ca/ca.crt"]
    index => "my-elasticsearch-index"
    document_id => "%{field-from-my-input-file}"
    action => "update"
    doc_as_upsert => true
    id => "Bulk-Filings-Index-Upload"
  }
}
```

Logstash initializes with no errors, and I can see the input data in the container. However, there is no indication of any attempt to send data to Elasticsearch. No messages appear in the Logstash log after this line:

```
2023-12-20 08:42:11 [2023-12-20T13:42:11,425][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
```

No relevant messages appear in the Elasticsearch log.

I have turned on debug logging and can see my data in the log; it appears properly formatted, and the record key is as expected.

Adding `stdout { codec => rubydebug }` to my Logstash output produces no output.
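For reference, the debug sink sits inside the existing output section, roughly like this (a sketch; the elasticsearch block is unchanged from above):

```
output {
  # Print each event to the container's stdout for debugging
  stdout { codec => rubydebug }

  elasticsearch {
    # ... same settings as shown above ...
  }
}
```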

This is a development environment, hence the `"/dev/null"` for the sincedb_path.

What am I missing?

Hi @gtartjr, welcome to the community!

What do a few sample lines of the source look like?

Pretty sure you need to take out the `json_lines` codec; it's made for a streaming case, not for reading from a file.

From the docs:

> This codec will decode streamed JSON that is newline delimited. Encoding will emit a single JSON string ending in a @delimiter. NOTE: Do not use this codec if your source input is line-oriented JSON, for example, redis or file inputs. Rather, use the json codec. More info: this codec is expecting to receive a stream (string) of newline terminated lines. The file input will produce a line string without a newline. Therefore this codec cannot work with line oriented inputs.

Here is a sample record. But you're right, I failed to see that part of the docs, and I see why this may not work as expected. Each record does terminate with a `"\n"` newline (ASCII 10).

```
{"cik": "0000876684", "entityType": "other", "sic": "", "sicDescription": "", "insiderTransactionForOwnerExists": 0, "insiderTransactionForIssuerExists": 0, "name": "ALLIED FINANCIAL CORP II", "tickers": [], "exchanges": [], "ein": "521689359", "description": "", "website": "", "investorWebsite": "", "category": "", "fiscalYearEnd": null, "stateOfIncorporation": "", "stateOfIncorporationDescription": "", "addresses": {"mailing": {"street1": "1919 PENNSYLVANIA AVE", "street2": null, "city": "WASHINGTON", "stateOrCountry": "DC", "zipCode": "20006", "stateOrCountryDescription": "DC"}, "business": {"street1": "1919 PENNSYLVANIA AVE", "street2": null, "city": "WASHINGTON", "stateOrCountry": "DC", "zipCode": "20006", "stateOrCountryDescription": "DC"}}, "phone": "2023311112", "flags": "", "formerNames": []}
```

So that is precisely not a case for `json_lines`. Try:

```
codec => "json"
```
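That is, the input block becomes something like this (a sketch using the same paths and settings as the original, with only the codec changed):

```
input {
  file {
    path => "/usr/share/logstash/companies.jsonl"
    mode => "read"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    # json, not json_lines: the file input already splits on newlines
    codec => json
    file_chunk_size => 524288
  }
}
```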

You were absolutely right. Solved my problem immediately in this case.

Is there some nuance about how `sincedb_path => "/dev/null"` works? After completing my first test, I changed the sincedb_path to point to my custom sincedb path, then back to `"/dev/null"` again. Now I'm not getting any data to read from either configuration, even with new data introduced.

Two things come to mind. One: you should probably read, in detail, how sincedb works; there's a sketch below.
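Loosely: the file input records how far it has read in the sincedb file. Pointing it at `/dev/null` throws that bookmark away on shutdown, while a real path persists it, so a file already recorded as fully read will not be picked up again. Something like this keeps the position across restarts (the path here is just an example):

```
input {
  file {
    path => "/usr/share/logstash/companies.jsonl"
    mode => "read"
    start_position => "beginning"
    codec => json
    # Persist the read position across restarts (example path);
    # delete this file to force the input to be read again from the start
    sincedb_path => "/usr/share/logstash/data/companies.sincedb"
  }
}
```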

Second, I see you're creating your own document ID. You would probably want to look closely at that. We often see folks somehow just overwriting the same documents over and over again... Unless you have a very specific reason to use your own document ID, the auto-generated document IDs are very foolproof.

So make sure you know what you're doing there; see the sketch below for one safer pattern.
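If you genuinely need a deterministic ID (for example, to make re-runs idempotent), one common pattern is to hash the business key with the fingerprint filter. A sketch, assuming the unique key is the `cik` field from the sample record above:

```
filter {
  # Derive a stable document ID from the record's unique key
  # ("cik" is an assumption based on the sample record)
  fingerprint {
    source => "cik"
    target => "[@metadata][doc_id]"
    method => "SHA256"
  }
}

output {
  elasticsearch {
    hosts => "${ELASTIC_HOSTS}"
    user => "${ELASTIC_USER}"
    password => "${ELASTIC_PASSWORD}"
    ssl_certificate_authorities => ["/usr/share/logstash/certs/ca/ca.crt"]
    index => "my-elasticsearch-index"
    # @metadata fields are not written into the indexed document itself
    document_id => "%{[@metadata][doc_id]}"
    action => "update"
    doc_as_upsert => true
  }
}
```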
