Unable to Parse Syslog messages into Elasticsearch

Hello,

I am sending syslog messages from a Barracuda mail appliance directly into Elasticsearch.
The index gets populated with data, but the data is not parsed correctly: all of the log attributes end up in the message field. Please see the attached sample of the data.

My question: is it possible to solve the parsing issue from the index mapping? If yes, how would I do it?

To parse the message you can use the Grok ingest processor.

The short answer is that you can solve this from the index mapping by using runtime fields, which will parse the message and create the fields you want. But that is not the best approach, since those fields are re-parsed at query time, every time the data is read.
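As a rough sketch of the runtime-field approach (the index name, field name, and grok pattern below are placeholders, not taken from your setup; this requires the message field to be indexed as keyword or wildcard, and Elasticsearch 7.11+):

PUT barracuda-index/_mapping
{
  "runtime": {
    "src_host": {
      "type": "keyword",
      "script": {
        "source": "String h = grok('%{SYSLOGHOST:host}').extract(doc[\"message\"].value)?.host; if (h != null) emit(h);"
      }
    }
  }
}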

The best way is to have an ingest pipeline do the parsing, so the data is indexed already parsed and never needs to be touched again.
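For example, a minimal grok ingest pipeline might look like this (the pipeline name and pattern are illustrative; you would replace the pattern with one matching the actual Barracuda log format):

PUT _ingest/pipeline/barracuda-syslog
{
  "description": "Parse Barracuda syslog messages (placeholder pattern)",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{SYSLOGTIMESTAMP:timestamp} %{SYSLOGHOST:host} %{GREEDYDATA:event}"]
      }
    }
  ]
}

You can test it against a sample document with the _simulate API before wiring it up.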

Hello Aaron,

Thanks for your reply.

Where do I put the pipeline configuration? There are no Beats between the source and the destination.

I'd check out this page on ingest pipelines; it explains it very well. Here is how the flow would work.

Beats -> Elasticsearch -> Index Template -> Ingest Processor -> Index
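As a sketch, an index template can attach the pipeline to every index matching a pattern, so each new index picks it up automatically (the template name, index pattern, and pipeline name below are hypothetical):

PUT _index_template/barracuda-template
{
  "index_patterns": ["barracuda-*"],
  "template": {
    "settings": {
      "index.default_pipeline": "barracuda-syslog"
    }
  }
}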

Hello aaron,

I am not able to parse the Barracuda email log correctly. If you already have a working pipeline, can you please share it with me? Another question: how will the ingest pipeline know which index to run on? Is this process executed automatically, or are there additional configuration steps?

Thanks,
E

I am assuming you are using Beats to send the data. There is a setting in the Elasticsearch output where you can set the pipeline ID. That tells Beats to send the data to Elasticsearch and run that pipeline, which will parse the data.
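For reference, in Filebeat that setting lives in the Elasticsearch output section of filebeat.yml (the host and pipeline name below are placeholders):

output.elasticsearch:
  hosts: ["https://my-es-host:9200"]
  # Run this ingest pipeline on every event sent to Elasticsearch
  pipeline: "barracuda-syslog"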

Hello Aaron,

The syslog messages are sent directly from Barracuda mail appliance to Elasticsearch, no beats in between.

Br,
e

In that case you can set the pipeline ID in the Index Settings.

PUT index-name 
{
  "settings": {
    "index.final_pipeline" : "pipeline-name"
  }
}

Hello Aaron,

The above request requires a username and password to be able to write to a specific index. In addition, the index name changes every 24 hours, meaning I would have to run the command manually on a daily basis.

Br,

Hello aaron,

I am receiving this error:

{
  "error" : {
    "root_cause" : [
      {
        "type" : "security_exception",
        "reason" : "action [indices:admin/create] is unauthorized for user [anonymous_user]"
      }
    ],
    "type" : "security_exception",
    "reason" : "action [indices:admin/create] is unauthorized for user [anonymous_user]",
    "caused_by" : {
      "type" : "illegal_state_exception",
      "reason" : "There are no external requests known to support wildcards that don't support replacing their indices"
    }
  },
  "status" : 403
}

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.