Logger.logstash.outputs.elasticsearch not printing enough logs

Hi

I set "logger.logstash.outputs.elasticsearch" : "DEBUG", but it only prints debug-level logs during Logstash start/stop.

Logstash does not print any logs when it parses an event and stores it in Elasticsearch.

My output config looks like this:

output {
  if [eventID] == "002E000A" {
    if [update_event] == "true" {
      elasticsearch {
        hosts       => ["localhost:9200"]
        document_id => "%{authsid}"
        index       => "dashboard_write"
        script      => "ctx._source.loginCount = params.event.get('loginCount');
                        ctx._source.contractName = params.event.get('contractName');
                        ctx._source.userName = params.event.get('userName');
                        ctx._source.sessionID = params.event.get('sessionID');
                        ctx._source.eventID = params.event.get('eventID');"
        doc_as_upsert => "true"
        action        => "update"
      }

      # stdout { codec => line { format => "Inserted Event Successfully" } }
      # stdout { codec => rubydebug }
    }
    else {
      elasticsearch {
        hosts       => ["localhost:9200"]
        document_id => "%{sessionID}"
        index       => "dashboard_write"
      }
    }
  }
}

I would like some log message once an event is inserted into the index.
Am I missing something here?

I suspect the document_id might be causing the problem. Try without it, add a stdout output with a codec, and see whether you can print the events to the console. (Start without any if conditions and go from there.)
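A minimal debugging output along those lines might look like this (a sketch only — it keeps the hosts and index name from your config and drops document_id, script, and all conditionals):

output {
  # Print every event to the console so you can confirm events reach the output stage
  stdout { codec => rubydebug }

  # Plain index, no document_id / script / conditionals
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "dashboard_write"
  }
}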

No, it didn't help. I have currently changed my output to the below:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "dashboard_write"
  }
}

The logs contain only the following:

[2020-08-03T08:06:25,649][DEBUG][logstash.filters.grok ][events] Event now: {:event=>#&lt;LogStash::Event:0x433e783e&gt;}
[2020-08-03T08:06:25,761][DEBUG][logstash.filters.grok ][events] Running grok filter {:event=>#&lt;LogStash::Event:0x433e783e&gt;}
[2020-08-03T08:06:25,769][DEBUG][logstash.filters.grok ][events] filters/LogStash::Filters::Grok: removing field {:field=>"accessManager"}
[2020-08-03T08:06:25,770][DEBUG][logstash.filters.grok ][events] Event now: {:event=>#&lt;LogStash::Event:0x433e783e&gt;}
[2020-08-03T08:06:25,784][DEBUG][logstash.filters.json ][events] Running json filter {:event=>#&lt;LogStash::Event:0x433e783e&gt;}
[2020-08-03T08:06:25,787][DEBUG][logstash.filters.json ][events] Event after json filter {:event=>#&lt;LogStash::Event:0x433e783e&gt;}
[2020-08-03T08:06:25,818][DEBUG][logstash.filters.grok ][events] Running grok filter {:event=>#&lt;LogStash::Event:0x433e783e&gt;}
[2020-08-03T08:06:25,822][DEBUG][logstash.filters.grok ][events] Event now: {:event=>#&lt;LogStash::Event:0x433e783e&gt;}
[2020-08-03T08:06:25,826][DEBUG][logstash.util.decorators ][events] filters/LogStash::Filters::GeoIP: adding value to field {"field"=>"countryCode", "value"=>["%{[geoIP][country_code2]}"]}
[2020-08-03T08:06:25,830][DEBUG][logstash.util.decorators ][events] filters/LogStash::Filters::Mutate: adding value to field {"field"=>"loginCount", "value"=>[1]}
[2020-08-03T08:06:25,831][DEBUG][logstash.util.decorators ][events] filters/LogStash::Filters::Mutate: adding value to field {"field"=>"deviceName", "value"=>["%{[os_name][0]}"]}
[2020-08-03T08:06:25,832][DEBUG][logstash.util.decorators ][events] filters/LogStash::Filters::Mutate: adding value to field {"field"=>"deviceID", "value"=>["idp%{B}"]}
[2020-08-03T08:06:25,833][DEBUG][logstash.util.decorators ][events] filters/LogStash::Filters::Mutate: adding value to field {"field"=>"sessionID", "value"=>["%{[Y][0]}"]}
[2020-08-03T08:06:25,834][DEBUG][logstash.util.decorators ][events] filters/LogStash::Filters::Mutate: adding value to field {"field"=>"authsid", "value"=>["%{[Y][1]}"]}
[2020-08-03T08:06:25,835][DEBUG][logstash.util.decorators ][events] filters/LogStash::Filters::Mutate: adding value to field {"field"=>"risklevel", "value"=>["null"]}
[2020-08-03T08:06:25,836][DEBUG][logstash.util.decorators ][events] filters/LogStash::Filters::Mutate: adding value to field {"field"=>"failedCount", "value"=>["null"]}
[2020-08-03T08:06:25,837][DEBUG][logstash.util.decorators ][events] filters/LogStash::Filters::Mutate: adding value to field {"field"=>"eventType", "value"=>["null"]}

I don't see anything related to the elasticsearch output.

I have my input/filter/output config in separate files, with pipelines.yml pointing to that directory. My pipelines.yml:

- pipeline.id: events
  path.config: "/etc/logstash/conf.d/events/"

Will that be an issue? I'm using ELK 7.4.2.

Can someone reply please?

Can someone help here please?

We are blocked. It would be really helpful if someone could shed some light here.

If you enable

stdout { codec => rubydebug }

are you seeing events on stdout?

Are you running with the single unconditional elasticsearch output now? If so, are there any documents in the index?

Yes, if I enable stdout I'm able to see the events in the Logstash log. Right now I have removed all conditional statements.

DELETE dashboard_write

I deleted the index first and then generated the events. I'm able to see them in the index, as below:

{
  "_index" : "dashboard_write",
  "_type" : "_doc",
  "_id" : "gL9j23MBZwpRwXjSGiON",
  "_score" : 1.0,
  "_source" : {
    "deviceName" : "Windows",
    "eventID" : "***",
    "browserName" : "Chrome",
    "sourceIP" : "",
    "authsid" : "89999",
    "contractName" : "kkkk",
    "eventType" : "null",
    "userName" : "admin",
    "createDate" : "2020-08-11T02:39:31.944Z",
    "deviceID" : "vyn",
    "sessionID" : "
",
    "countryCode" : "US",
    "loginCount" : 1,
    "failedCount" : 0
  }
}

OK, so documents are being written to the index. Is your issue just that there are not any DEBUG logs? If so, well, that is true. The manticore_adapter is passed the logger, but never uses it. pool.rb only logs when the pool of connections changes. The output itself does not log anything when processing events normally.

Thanks for the update. My requirement is to have logs even in the case of successful indexing; we need that for troubleshooting. I thought the
"logger.logstash.outputs.elasticsearch" : "DEBUG" setting would help here. Is there any other way?

There is no per-event logging in the elasticsearch output. It does maintain some metrics around the number of responses to bulk requests, but I cannot tell you anything about them or how to access them.

The elasticsearch output is built on top of cheald/manticore, which in turn is built on the Apache HttpClient. I believe HttpClient has logging, but it is low level. (I do not know which version of HttpClient the output is built upon.)

OK. My overall requirement is that I somehow need to know that Logstash has received/processed events. I would be happy even to print the event once it is received in the input or filter stage, if not in the output. The whole exercise is to make sure events are reaching Logstash.
I tried setting logstash.inputs.syslog to DEBUG, thinking it would print the event/log once received, but it didn't.

Is there any other way?

Take a look at the pipeline stats API. That will tell you whether events are flowing through the pipelines and outputs.
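For example, a small sketch that reads the per-pipeline event counters from the node stats API (this assumes the Logstash monitoring API on its default port 9600, and the 7.x response shape with an "events" object containing "in"/"filtered"/"out" counters per pipeline):

```python
import json
from urllib.request import urlopen


def pipeline_event_counts(stats: dict) -> dict:
    """Extract the per-pipeline event counters ("in", "filtered", "out")
    from a _node/stats/pipelines response. Response shape assumed from
    Logstash 7.x; adjust the keys if your version differs."""
    return {
        name: pipe.get("events", {})
        for name, pipe in stats.get("pipelines", {}).items()
    }


if __name__ == "__main__":
    # Assumes Logstash's monitoring API is reachable on localhost:9600
    stats = json.load(urlopen("http://localhost:9600/_node/stats/pipelines"))
    for name, counts in pipeline_event_counts(stats).items():
        print(name, counts)
```

If the "in" counter for your `events` pipeline grows while "out" stays flat, events are reaching Logstash but not leaving the outputs; if "in" never moves, they are not arriving at all.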

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.