Nested Logstash Config files

Hi,

I would like to pass the payload to another Logstash config file and post the payload fields to ES; once that succeeds, it should post the other fields to ES.

filter {
  if "START [[" in [message] {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:logtime} %{LOGLEVEL:loglevel} %{DATA:audittype} %{DATA:app} - %{GREEDYDATA:jsondata}" }
    }
    json { source => "jsondata" }
    mutate {
      add_field => {
        "RequestTimeStamp" => "%{[event][timestamp]}"
        "LogLevel" => "%{[event][logLevel]}"
        "Tracepoint" => "%{[event][tracepoint]}"
        "ApplicationName" => "%{[event][appName]}"
        "AppVersion" => "%{[event][appVersion]}"
        "Environment" => "%{[event][environment]}"
        "BusinessGroup" => "%{[event][businessGroup]}"
        "InterfaceName" => "%{[event][interfaceName]}"
        "CID" => "%{[event][correlationId]}"
        "TID" => "%{[event][traceId]}"
        "FlowName" => "%{[event][locationInfo][flowName]}"
        "TracePointDesc" => "%{[event][tracepointDesc]}"
        "Payload" => "%{[event][content][payload]}"
      }
    }
    # Need to pass payload to another logstash config and parse and send to ES
    dissect { mapping => { "sourcetype" => "%{SourceSystem}" } }
    mutate { remove_field => ["sourcetype","app","jsondata","payload","message","event","Payload"] }
  }
}

output {
  if [audittype] == "START" or [audittype] == "END" {
    elasticsearch {
      id => "estraveldemoid"
      index => "traveldemoindex"
      hosts => ["localhost:9200"]
    }
    stdout {}
  }
}

Sample Log:

2020-04-27 00:44:30,445 INFO START [[Thread1]]custom.utils.logger -
{
  "sourcetype": "Travel Agency Application",
  "event": {
    "timestamp": "2020-04-29T00:00:00.000",
    "logLevel": "INFO",
    "tracepoint": "START",
    "appName": "Travel Booking website",
    "appVersion": "1.0.0",
    "environment": "DEV",
    "businessGroup": "TravelBookingGroup",
    "interfaceName": "TourBookingInterface",
    "correlationId": "TBT017",
    "traceId": "MMT003",
    "threadName": "",
    "locationInfo": { "flowName": "bookingTravelTickets" },
    "tracepointDesc": "START Transaction - Travel Booking Website",
    "content": {
      "payload": {
        "Date": "28.04.2020",
        "Name": "KARTHIK",
        "Origin": "MAA",
        "Destination": "FRK",
        "AirlineBooking": "Yes",
        "HotelBooking": "Yes",
        "TaxiBooking": "YES"
      }
    }
  }
}

Will this work? How can we call another Logstash config and get control back to the first Logstash file?

Regards,
Karthikeyan S

Hi,
You can't call logstash configuration files like subroutines.
However, you can use the clone filter to create a copy of the document (payload) you are currently processing; this copy will have a different type value.

Then you can add conditional blocks to perform different filtering or ES outputs depending on this field.

You can read about it in https://www.elastic.co/blog/using-logstash-to-split-data-and-send-it-to-multiple-outputs
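To make this concrete, here is a minimal sketch of the clone approach adapted to this thread's config. The `payload-index` name and the field lists are illustrative assumptions, not part of the original config; the clone filter sets the `type` field of each copy to the value listed in `clones`:

```
filter {
  clone {
    clones => ["payload_copy"]
  }
  if [type] == "payload_copy" {
    # This branch runs only on the cloned event: keep/parse the payload fields here
    mutate { remove_field => ["RequestTimeStamp", "LogLevel", "Tracepoint"] }
  }
}

output {
  if [type] == "payload_copy" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "payload-index"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "traveldemoindex"
    }
  }
}
```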

Edit: @Badger is spot on :smiley: this was the default way of accomplishing it in past versions.
It may be simpler and is still usable, but pipeline configuration is a more modern and flexible solution :wink:
https://www.elastic.co/guide/en/logstash/master/pipeline-to-pipeline.html#forked-path-pattern
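For reference, a minimal forked-path sketch in pipelines.yml. The pipeline ids, the beats input, and the index names are illustrative assumptions; the point is that one intake pipeline forks events to two downstream pipelines, each with its own filters and output:

```
# pipelines.yml
- pipeline.id: intake
  config.string: |
    input { beats { port => 5044 } }
    output { pipeline { send_to => ["payload-pipeline", "main-pipeline"] } }
- pipeline.id: payload-pipeline
  config.string: |
    input { pipeline { address => "payload-pipeline" } }
    # filters that parse only the payload fields go here
    output { elasticsearch { hosts => ["localhost:9200"] index => "payload-index" } }
- pipeline.id: main-pipeline
  config.string: |
    input { pipeline { address => "main-pipeline" } }
    # filters for the remaining fields go here
    output { elasticsearch { hosts => ["localhost:9200"] index => "traveldemoindex" } }
```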

Old school! For any recent version I would use pipeline-to-pipeline communication (which that blog post links to).


Hi Andres,

My intention is to call one method from another, where the two methods are in different files, like in Java.

Example: I pass my payload to a script; the script parses the payload and posts those details to ES, then control comes back to the Logstash config, which posts the remaining details to ES.
If the script is Ruby, it can be run inside the Logstash file. Is my understanding correct?

Regards,
Karthikeyan S

It doesn't work like that. You can use a ruby filter as one extra step to allow data processing that would not be possible or practical with the existing Logstash filters. But Ruby code (which is just one more filter) won't "call" other configuration files, other Logstash filters, or different ES outputs by itself.

There is an example here: https://www.elastic.co/guide/en/logstash/current/plugins-filters-ruby.html
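For instance, a small inline ruby filter can derive a new field from the payload. The field paths follow the sample log above; the derived `PassengerName` field is purely illustrative:

```
filter {
  ruby {
    code => '
      # Illustrative only: read one payload field and add a derived field
      name = event.get("[event][content][payload][Name]")
      event.set("PassengerName", name.capitalize) if name
    '
  }
}
```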

You can review the links from my previous response. I think those are the two straightforward alternatives to accomplish what you want: either clone a document and do conditional processing, or set up a configuration with pipelines where you can "chain" different Logstash configuration files.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.