Capture a log for Logstash for every successful pipeline execution

Hello,

I want to capture a log entry whenever Logstash successfully fetches API data through the http_poller plugin and writes it to my Elasticsearch index.

My configuration is:

input {
  http_poller {
    id => "test-plugin"
    urls => {
      test_api => {
        method => "POST"
        url => "api url"
        headers => {
          "Content-Type" => "application/json"
        }
      }
    }
    request_timeout => 120
    schedule => { cron => "8 5 * * * UTC" }
    codec => "json"
    tags => ["test-api"]
  }
}

filter {
  mutate {
    remove_field => [ "[message]" ]
    remove_field => [ "[@version]" ]
    remove_field => [ "[event]" ]
  }
}

output {
  if "test-api" in [tags] {
    elasticsearch {
      id => "test-output"
      hosts => ["host1"]
      user => "user"
      password => "password"
      index => "my-index"
      document_id => "%{ID}"
      doc_as_upsert => true
      action => "update"
    }
  }
}

After every successful run, I want a log entry to be inserted into a CSV file. Is there a way to do that?

Hi @Rakhshunda_Noorein_J

You should be able to specify a second output and use the CSV output plugin.
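As a starting point, here is a minimal sketch of such a second output; the path and field names are placeholders, and the path must be writable by the Logstash user:

output {
  csv {
    # placeholder path; adjust to a location the Logstash user can write to
    path => "/tmp/logstash-run-log.csv"
    # placeholder fields; pick whichever event fields you want in the line
    fields => ["@timestamp", "tags", "message"]
  }
}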

It is logging all the events from the input.

I want to log an event only after all the events from my input are complete.

You can read the response code returned by the http_poller and then only allow events with a 200 code to reach the CSV output.

Something like:

output {
  if "test-api" in [tags] {
    elasticsearch {
      id => "test-output"
      hosts => ["host1"]
      user => "user"
      password => "password"
      index => "my-index"
      document_id => "%{ID}"
      doc_as_upsert => true
      action => "update"
    }
  }
  if [@metadata][code] == 200 {
    csv {
      path => "/mypath/output.csv"
      fields => ["field1", "[nested][field]"]
      ...
    }
  }
}
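For the [@metadata][code] conditional to work, the http_poller input has to expose the request/response metadata on the event. A minimal sketch, assuming the plugin's metadata_target option (which, as far as I recall, defaults to @metadata) is left at, or set to, that default:

input {
  http_poller {
    id => "test-plugin"
    urls => {
      test_api => {
        method => "POST"
        url => "api url"
      }
    }
    schedule => { cron => "8 5 * * * UTC" }
    codec => "json"
    tags => ["test-api"]
    # request/response metadata (including the HTTP response code) is stored under this field
    metadata_target => "@metadata"
  }
}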

Keep in mind that Logstash will send the same event to both the Elasticsearch and CSV outputs simultaneously if both conditions are met.

You need to provide more context about what is being logged and what you want to log.

Your input is an http_poller input plugin; if its response contains multiple lines, every single line will be logged as a separate event.


Yes, but my requirement is that once all the events from the input plugin have run successfully, a single event should be inserted into the CSV file.

For example, suppose I am calling a web API from the http_poller input plugin and sending the input to my Elasticsearch index. When this is successfully done, insert only one line into the CSV with the date, tag, and message for logging purposes, so that I can know when the input ran.

Unless you have some message that identifies that the request is finished, Logstash cannot do that.

If you have some message on which you can apply a conditional filter, you would be able to write something to a CSV only after that message.
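As an illustration, here is a sketch of that idea, assuming a hypothetical [status] field whose value "finished" marks the final message of the run; the field name, its value, the path, and the run_* field names are all made up for the example:

filter {
  if [status] == "finished" {
    mutate {
      # build the single summary line; run_date/run_tag/run_message are illustrative names
      add_field => {
        "run_date"    => "%{@timestamp}"
        "run_tag"     => "test-api"
        "run_message" => "http_poller run completed"
      }
    }
  }
}

output {
  if [status] == "finished" {
    csv {
      path => "/mypath/run_log.csv"
      fields => ["run_date", "run_tag", "run_message"]
    }
  }
}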

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.