Hello,
I have a pipeline that uses grok to parse logs and apply some modifications.
Since I read the input data from an Elastic index, transform it, and write the transformed data directly back to an Elastic index, could I use the request API instead of grok?
Hi @sam1975,
you can try an ingest pipeline: with its processors you can do this kind of operation very quickly. You can also simulate the pipeline in the dev console using example data.
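As a minimal sketch of what that looks like (the pipeline id `my_pipeline`, the `set` processor, and the sample document are just placeholders, not taken from your setup), you create the pipeline with a PUT request and then feed it example documents with the `_simulate` API:

```console
PUT _ingest/pipeline/my_pipeline
{
  "description": "example pipeline (placeholder)",
  "processors": [
    { "set": { "field": "status", "value": "ok" } }
  ]
}

POST _ingest/pipeline/my_pipeline/_simulate
{
  "docs": [
    { "_source": { "message": "some example log line" } }
  ]
}
```

The `_simulate` response shows each document as it would look after the processors run, so you can iterate on the pipeline without indexing anything.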
Hello,
yes, with pleasure.
I'll start with this extract of a pipeline:
grok {
  match => { "[raw_syslog_result][syslog_message]" => [
    # sensor connected for 'Smoke' at 'POC'. 0x2E35
    # Sensor Disconnected for 'Smoke' at 'POC'. 0x2E1E
    "(Sensor|sensor) %{WORD:status} for \'%{WORD:physical_location_id}\'",
    # Message: A USB device is installed in the slot Front Port 0.
    # Message: A USB device is removed from the slot Front Port 0.
    # Message: Drive 4 is installed in disk drive bay 1.
    # Message: Drive 4 is removed from disk drive bay 1.
    "Message: (A )?%{GREEDYDATA:physical_location_id_A} is %{WORD:status} (in|from) (the slot )?%{GREEDYDATA:physical_location_id_B}."
  ] }
}
Do you think (and how?) I could use an ingest pipeline for this? Alternatively, I have also thought about dissect/kv, but either way I need something simpler that also gains some performance.
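For illustration, here is a rough, untested translation of the grok filter above into an ingest pipeline (the pipeline id `syslog_parse` is made up; the field and pattern names are taken from your snippet — note that the ingest grok processor uses dotted field syntax instead of Logstash's `[a][b]` notation, and the literal dot needs a double backslash inside JSON strings):

```console
PUT _ingest/pipeline/syslog_parse
{
  "processors": [
    {
      "grok": {
        "field": "raw_syslog_result.syslog_message",
        "patterns": [
          "(Sensor|sensor) %{WORD:status} for '%{WORD:physical_location_id}'",
          "Message: (A )?%{GREEDYDATA:physical_location_id_A} is %{WORD:status} (in|from) (the slot )?%{GREEDYDATA:physical_location_id_B}\\."
        ]
      }
    }
  ]
}
```

Once the simulated output looks right, the pipeline can be applied at index time with the `pipeline` query parameter on bulk/index requests, or set as the index's `default_pipeline` setting.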