Alternative to grok with an API

Hello,
I have a pipeline that uses grok to parse logs and apply some modifications.
Since I collect input data from an Elastic index, apply some modifications, and send the transformed data directly to an Elastic index, is it possible to use an API request instead of grok?

Example

I have this one

[START] Formatting syslog_md_text

grok {
  match => { "[raw_syslog_result][syslog_message]" => [
    "^%{WORD:door_status}$"
  ]}
  #overwrite => [
  #]
}

Hi @sam1975 ,
you can try using an ingest pipeline; with its processors you can do this kind of operation very quickly. You can also simulate the pipeline in the Dev Console using example data.
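As a minimal sketch of what that could look like, here is an ingest pipeline with a grok processor reusing the pattern from the snippet above, created from the Kibana Dev Console (the pipeline name `syslog-door-status` is just an example):

```json
PUT _ingest/pipeline/syslog-door-status
{
  "description": "Parse door status from the syslog message (example)",
  "processors": [
    {
      "grok": {
        "field": "raw_syslog_result.syslog_message",
        "patterns": ["^%{WORD:door_status}$"]
      }
    }
  ]
}
```

The grok processor uses the same pattern syntax as the Logstash grok filter, so existing patterns can usually be copied over as-is.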

You can read more about it here

Let me know if you need more help.


Hello,
yes, with pleasure.
I'll start with this extract of a pipeline:
grok {
  match => { "[raw_syslog_result][syslog_message]" => [

    #sensor connected for 'Smoke' at 'POC'. 0x2E35
    #Sensor Disconnected for 'Smoke' at 'POC'. 0x2E1E
    "(Sensor|sensor) %{WORD:status} for \'%{WORD:physical_location_id}\'",

    #Message: A USB device is installed in the slot Front Port 0.
    #Message: A USB device is removed from the slot Front Port 0.
    #Message: Drive 4 is installed in disk drive bay 1.
    #Message: Drive 4 is removed from disk drive bay 1.
    "Message: (A )?%{GREEDYDATA:physical_location_id_A} is %{WORD:status} (in|from) (the slot )?%{GREEDYDATA:physical_location_id_B}."
  ]}
}
Do you think I could use an ingest pipeline here (and how)? Alternatively, I have also thought about dissect/kv, but either way I need something simpler and with some performance gain.

Yes, you can use an ingest pipeline. You need to add this in the Logstash output section:

pipeline => "PIPELINE_NAME" 
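In context, that option goes inside the elasticsearch output plugin, roughly like this (hosts and index name below are placeholders to adapt to your setup):

```json
output {
  elasticsearch {
    hosts    => ["https://localhost:9200"]  # assumption: your cluster address
    index    => "my-transformed-index"      # assumption: your target index
    pipeline => "PIPELINE_NAME"             # the ingest pipeline to run on indexing
  }
}
```

With this, every event sent by Logstash is processed by the named ingest pipeline on the Elasticsearch side before being indexed.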

and then, going to Kibana -> Stack Management -> Ingest Pipelines, you can create the pipeline.

You can simulate the pipeline with the data in Elasticsearch using the simulate API.
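For example, a simulate call could test one of the grok patterns from the extract above against one of its sample messages (field name `message` is just an example; use whatever field holds the text in your documents):

```json
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": ["(Sensor|sensor) %{WORD:status} for '%{WORD:physical_location_id}'"]
        }
      }
    ]
  },
  "docs": [
    { "_source": { "message": "sensor connected for 'Smoke' at 'POC'. 0x2E35" } }
  ]
}
```

The response shows the document as it would look after the pipeline runs, so you can check the extracted fields before wiring the pipeline into Logstash.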

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.