Parse logs with grok

I'm researching the ELK stack for my company's monitoring work. For log parsing, it looks like grok patterns can only be added to the grok-patterns config file and can't be added via Kibana. I found that QRadar has a DSM Editor that lets you add the fields you want very easily. Can someone explain this to me?
Thank you

Can you provide more context about your issue? It is not clear what you want explained.

QRadar is a completely different tool that works in a different way.

The Elastic Stack is composed of different tools: Elasticsearch for storing your data, Kibana to make it easy to search your data, Logstash to process and send your data, the collectors in the Beats family, etc.

These days you may not even need to use Logstash; it all depends on your use case.

leandrojmp, I think CodeRed is asking why there is no interface in Kibana to add a grok pattern that Logstash (LS) would recognize, save, and apply.
In LS, patterns are added manually to a config file on the LS server, and that's all; Kibana talks only to Elasticsearch.
As leandrojmp said, there is no need for LS unless your case is very specific. Most of the time you use Elastic Agent or Filebeat and then use ingest pipelines in Kibana.
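To illustrate the ingest pipeline route, here is a minimal sketch of a pipeline with a grok processor, created from Kibana Dev Tools. The pipeline name and the log format are hypothetical, just to show the shape:

```
PUT _ingest/pipeline/my-grok-pipeline
{
  "description": "Hypothetical example: parse an app log line with grok",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log.level} %{GREEDYDATA:message_detail}"]
      }
    }
  ]
}
```

Once the pipeline exists, you can point an index or data stream at it and the parsing happens inside Elasticsearch, no Logstash involved.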

I am learning how to parse many types of logs.
I researched and found that with ELK, I have to use grok and rely on grok patterns to create the fields I want.
For some kinds of logs, the built-in grok patterns are not enough and I want to add new ones. However, I have only found that patterns can be added via the config file. Is there any way to add patterns through Kibana's interface or something else?
Sorry, my English is not good; I use Google Translate, so it may be confusing.
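For reference, this is roughly what the config-file approach you describe looks like in Logstash. The grok filter also accepts inline pattern_definitions, so a custom pattern does not have to live in a separate patterns file; MYAPP_ID and the log format here are made-up examples:

```
filter {
  grok {
    # Custom pattern defined inline; it could also be a line like
    # "MYAPP_ID app-%{INT}" in a file referenced via patterns_dir.
    pattern_definitions => { "MYAPP_ID" => "app-%{INT}" }
    match => { "message" => "%{MYAPP_ID:app_id} %{GREEDYDATA:rest}" }
  }
}
```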

Yep, that's what I mean
Thank you for replying to me

Check this intro to LS.

There is not. Like I said, they are different tools: with QRadar you have basically everything in one place, while with the Elastic Stack you have multiple tools, each with its own function.

Elastic chose not to have an administration interface for Logstash.

But as I said, you may not even need Logstash; these days it is more of an advanced use case. You should first see if you can get the data you need into Elasticsearch using the Elastic Agent, which has many built-in integrations that will help you parse your logs.
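If you want to test your parsing quickly without setting anything up permanently, the _simulate API in Kibana Dev Tools runs a pipeline against sample documents. The log line and patterns below are hypothetical:

```
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": ["%{IP:client.ip} %{WORD:http.method} %{URIPATH:url.path}"]
        }
      }
    ]
  },
  "docs": [
    { "_source": { "message": "10.0.0.1 GET /index.html" } }
  ]
}
```

The response shows the parsed document, so you can iterate on the patterns before wiring anything into production.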


Yep, I will try
Thank you for replying to me

However, I still want to learn how to create the fields I want.
Will using Elastic Agent allow me to do that?
For example, Windows logs have a processname field and I want to rename it to process_name_xxx. Or, for a log fragment like "cusername":"www-data","cwd":"/var/www/nextcloud", I want to create a cwd field.
Is this possible?

Elastic Agent will parse most of the fields for you with an integration, but you can still change them.

Also, not everything has an integration; for some kinds of logs you will need to create your own parsers.

Do you have a working example? Have you installed Elasticsearch yet and played with it?

You need to do at least a proof of concept to know whether the Elastic Stack will work the way you want.

But basically you can change the data any way you want; how you do that depends entirely on how you are collecting it and what your needs are.
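As a concrete sketch of both changes asked about above: a rename processor can rename processname, and a grok processor can pull cwd out of the raw message. The pipeline name is hypothetical, and if the whole message were valid JSON, a json processor would be the cleaner choice:

```
PUT _ingest/pipeline/rename-and-extract
{
  "description": "Hypothetical example: rename a field and extract cwd from the message",
  "processors": [
    {
      "rename": {
        "field": "processname",
        "target_field": "process_name_xxx",
        "ignore_missing": true
      }
    },
    {
      "grok": {
        "field": "message",
        "patterns": ["\"cwd\":\"%{DATA:cwd}\""],
        "ignore_failure": true
      }
    }
  ]
}
```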
