How can I parse log data into individual key=value pairs dynamically in Logstash?

My question is how I can separate log data into individual key=value pairs in Logstash. When I get log data from Beats or file sources it arrives as a single string, so I need that string divided into separate fields as key=value pairs. One more note: I don't know anything about the log data in advance; it will change dynamically. Can anyone guide me on how to parse the data according to the above requirements?

My Logstash conf file is shown below:

    input {
      beats {
        port => 5044
      }
    }

    filter {
      ###
    }

    output {
      if [beat][hostname] == "slave1" {
        elasticsearch {
          hosts => ["localhost:9200"]
          manage_template => true
          index => "slave1-%{+YYYY.MM.dd}"
        }
      }
      else if [beat][hostname] == "slave2" {
        elasticsearch {
          hosts => ["localhost:9200"]
          manage_template => true
          index => "slave2-%{+YYYY.MM.dd}"
        }
      }
    }

Can anyone guide me on how to achieve this?

Thanks in advance!!!

Hi

Could you post a couple (or more) samples of your input data so we get a feel for what you mean by not knowing the log data in advance? Is it the names of the keys that change? Is it the separator?

Using your current config, could you also post a sample of the output from stdout{} for each of those samples?

Take a look at the kv{} filter here: https://www.elastic.co/guide/en/logstash/current/plugins-filters-kv.html It might be what you are looking for.
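For messages that really are in `key1=val1 key2=val2` form, a minimal kv setup could look something like this (the `field_split` and `value_split` values here are just the defaults spelled out; adjust them to whatever separators your data actually uses):

    filter {
      kv {
        source      => "message"   # parse the raw log line
        field_split => " "         # pairs are separated by spaces
        value_split => "="         # key and value are separated by "="
      }
    }

The nice part is that kv does not need to know the key names up front; it creates a field for every pair it finds.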

Hope this helps.

Hi, thanks for the response.

I am getting data from multiple VMs through Filebeat into Logstash. The data arrives as a string, and I need to divide it into multiple individual fields (key and value pairs). I went through the kv filter article, but I could not find what I am looking for there. Below are the sample logs I am getting from the different servers (VMs):

Slave1 log data:-

    150.129.60.43 - - [12/Mar/2020:11:26:06 +0000] "GET / HTTP/1.1" 304 0 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/79.0.3945.117 Safari/537.36"
    150.129.60.43 - - [12/Mar/2020:11:26:06 +0000] "GET / HTTP/1.1" 304 0 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/79.0.3945.117 Safari/537.36"
    150.129.60.43 - - [12/Mar/2020:11:26:07 +0000] "GET / HTTP/1.1" 304 0 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/79.0.3945.117 Safari/537.36"
    150.129.60.43 - - [12/Mar/2020:11:26:07 +0000] "GET / HTTP/1.1" 304 0 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/79.0.3945.117 Safari/537.36"
    150.129.60.43 - - [12/Mar/2020:11:26:07 +0000] "GET / HTTP/1.1" 304 0 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/79.0.3945.117 Safari/537.36"
    150.129.60.43 - - [12/Mar/2020:11:26:07 +0000] "GET / HTTP/1.1" 304 0 "-" "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/79.0.3945.117 Safari/537.36"

I need to divide the strings above (the log data) into individual fields. Can you help with how I can do this?

Hi

Looking at your log, I think the grok{} filter would be more appropriate. It will put each portion of the log you define into a field with a name of your choice. Info here: https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html
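Your samples look like the standard Apache combined access-log format, so the built-in COMBINEDAPACHELOG pattern may well cover them without you writing any patterns yourself (worth verifying against your real data):

    filter {
      grok {
        match => { "message" => "%{COMBINEDAPACHELOG}" }
      }
    }

If it matches, you should get fields such as clientip, timestamp, verb, request, response, and agent out of each line.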

Hope this helps

Hi
As you said, for the logs above the grok filter would be perfect. But let me explain my project scenario: I have a Laravel project, and I don't know what type of data I will get as a log message; it changes dynamically. So I have to create a filter that reads the log message and divides the log data into individual fields as key/value pairs, so that I can easily create dashboards. Can you help with how to do this? I have not been able to find a way. Below is a sample of the Laravel log data.
Note: this message will change dynamically.

Laravel log data:

    [2020-03-12 08:53:46] local.ERROR: SQLSTATE[HY000] [1045] Access denied for user 'pmauser'@'localhost' (using password: YES) (SQL: select count() as aggregate from users where email = gopipathshala@gmail.com) {"exception":"[object] (Illuminate\Database\QueryException(code: 1045): SQLSTATE[HY000] [1045] Access denied for user 'pmauser'@'localhost' (using password: YES) (SQL: select count() as aggregate from users where email = gopipathshala@gmail.com) at /var/www/user/vendor/laravel/framework/src/Illuminate/Database/Connection.php:664, PDOException(code: 1045): SQLSTATE[HY000] [1045] Access denied for user 'pmauser'@'localhost' (using password: YES) at /var/www/user/vendor/laravel/framework/src/Illuminate/Database/Connectors/Connector.php:70)

Hey,

Can we separate the log message based on a space, a colon, or any other character in the message?
Once the data is divided, my job will be easy. Is there any way to do that?

Hello, can anyone help with this?

Once the data is divided everyone's job is easy. After all, "dividing the data" is the hard part. I have been doing Elastic Stack projects for nearly 4 years now, and I always explain it this same way...

3% of my time is spent on Elasticsearch, 7% on analytics (dashboards, alerting, ML) and 90% is on getting the data and getting it into a form where it is useful... and doing so at scale. This includes collection, parsing, formatting, transforming and enriching.

As was written once in a Forbes article...

Data scientists spend 60% of their time on cleaning and organizing data. Collecting data sets comes second at 19% of their time, meaning data scientists spend around 80% of their time on preparing and managing data for analysis.

The good news is that between Beats, Logstash and Elasticsearch Ingest Pipelines, you have a lot of tools to get this job done. Throw Kafka and KSQL into the architecture and you have even more options. However there is no getting around the work of creating the necessary data processing logic for your use-cases.

Rob


Hi Rob,

Thanks for your reply.

I achieved this task by using the kv plugin and it is working fine. But can you help with how to install the ELK 7.x versions and the configuration needed in Elasticsearch, Kibana, and Logstash? I am currently on 6.x, which I can install and use, but I am not able to install ELK 7.x; I get an error when I try. Can you please help me with this error?
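For reference, the kv setup was roughly along these lines (a simplified sketch; the separators below depend on the log format and would need adjusting):

    filter {
      kv {
        source      => "message"   # the raw log line from Filebeat
        value_split => "="         # character between a key and its value
        field_split => " "         # character between the pairs
      }
    }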