JSON split into multiple events in Logstash pipeline

Hi All,

I am trying to ingest the output of a Python script using the exec input plugin, but I am facing an issue when performing a split operation on the message. I am not sure how to split it into fields. Below is a sample of the output received under the message field:
{"required_role":"OPERATOR","user_role":"DESIGNER","accepted":true,"request_url":"/catalog_service_settings","rest_params":{},"original_time":637841834485391604,"severity":"INFO","audit_time":637841834485391604}
{"required_role":"OPERATOR","user_role":"DESIGNER","accepted":true,"request_url":"/catalog_service_settings","rest_params":{},"original_time":637841839892273074,"severity":"INFO","audit_time":637841839892273074}
{"required_role":"OPERATOR","user_role":"DESIGNER","accepted":true,"request_url":"/catalog_service_settings","rest_params":{},"original_time":637841840485632613,"severity":"INFO","audit_time":637841840485632613}
{"required_role":"OPERATOR","user_role":"DESIGNER","accepted":true,"request_url":"/catalog_service_settings","rest_params":{},"original_time":637841845977496183,"severity":"INFO","audit_time":637841845977496183}
{"required_role":"OPERATOR","user_role":"DESIGNER","accepted":true,"request_url":"/catalog_service_settings","rest_params":{},"original_time":637841846485656787,"severity":"INFO","audit_time":637841846485656787}

Current Logstash pipeline:

input {
  exec {
    command => "python /usr/share/logstash/scripts/qlik_2.py"
    interval => 30
  }
}

filter {
  json {
    source => "message"
    skip_on_invalid_json => true
    tag_on_failure => ["failed_json"]
  }
  split {
    field => "message"
  }
}

Could you please help me with this?
@Badger @elastic_team Kindly assist with how I can ingest the data into the respective fields of the index.

Hello,

Please do not ping people that are not part of the thread.

It is not clear what you want to do and what the issue is. What is your output?

If I understand correctly, you want to create a new event for each of the JSON documents in the output of the exec input plugin?

You should put the split filter before the json filter; try that and see if it works.
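
For reference, a minimal reordered filter block might look like this (a sketch based on your config above; the split filter's default terminator is a newline, so each line of the script output becomes its own event before the json filter parses it):

filter {
  split {
    field => "message"
  }
  json {
    source => "message"
    skip_on_invalid_json => true
    tag_on_failure => ["failed_json"]
  }
}

With this order, each resulting event's message field holds a single JSON document, which the json filter then expands into top-level fields.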

Hello,

Now the output is getting split into new documents, but the json filter is not working: I am not able to see the log in key-value format in Kibana.

Can you please help me resolve this?

You need to share the output you are getting, and also share your updated pipeline config.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.