How do I extract a part of a string in the message field in Kibana and then plot it on a graph?

Here, I want to extract the values of each lineNumber string (i.e. 26, 121, 18, ...) in the message field and then plot them on a graph. I am only able to plot the number of occurrences of the string "lineNumber", but I need to plot the n lineNumber values themselves on a graph. How do I do that? Please help me out.

The most efficient and scalable way to do this is to parse the message field at ingest time and extract the fields you want to run analysis on. You can do this using a grok processor within an ingest pipeline.
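For example, a minimal sketch of such a pipeline (assuming the message contains a fragment like "lineNumber: 26"; adjust the pattern to your real format) could look like this:

PUT _ingest/pipeline/extract-line-number
{
  "description": "Extract the numeric value that follows 'lineNumber' in the message field",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["lineNumber[:=]?%{SPACE}%{NUMBER:lineNumber}"],
        "ignore_failure": true
      }
    },
    {
      "convert": {
        "field": "lineNumber",
        "type": "integer",
        "ignore_missing": true
      }
    }
  ]
}

With lineNumber indexed as a numeric field, you can aggregate on it directly in a Kibana visualization.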

Hello @Christian_Dahlqvist, I hadn't set up a grok filter when I was setting up my ELK server. But then I realized that to get lineNumber as a new field, I have to set up a grok filter. Can you please explain in more detail how to set up the grok filter in the Logstash configuration file, how to extract the lineNumber values, and then how to plot the graph?
I had used this link to set up the ELK server: https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-elk-stack-on-ubuntu-14-04#configure-logstash

But the grok config didn't work, so please explain this in more detail.

If all your log lines have a similar structure, it may be easier to use the dissect filter instead of grok. This blog post provides a good introduction.
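As a rough example (I am guessing at the layout here; assume each line looks something like "2018-03-01 12:00:05 INFO lineNumber: 26 some message text"), a dissect filter could be:

filter {
  dissect {
    # assumed layout: "<date> <time> <level> lineNumber: <value> <rest of message>"
    mapping => {
      "message" => "%{date} %{time} %{level} lineNumber: %{lineNumber} %{rest}"
    }
  }
  # convert the extracted string to a number so it can be aggregated on in Kibana
  mutate {
    convert => { "lineNumber" => "integer" }
  }
}

Dissect is faster than grok for fixed layouts because it splits on literal delimiters instead of running a regular expression.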

Yes, the log lines have a similar structure, as you can see in the screenshot attached above. Can you give an example filter to extract the value of the "lineNumber" string? I included a filter in /etc/logstash/conf.d/10-syslog-filter.conf, but then the logs stopped flowing through Filebeat. So please give an example that I can deploy directly.

I currently do not have time to write it for you, but if you show us what you have tried so far, we might be able to help you find what is wrong.

filter {
  if [type] == "syslog" {
    # parse standard syslog lines shipped by Filebeat
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

This was one of the filters given in https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-elk-stack-on-ubuntu-14-04#configure-logstash

I replaced syslog with log, since that is how I had configured it in Filebeat, and then replaced SYSLOGTIMESTAMP:syslog_timestamp with lineNumber:number. But then the logs stopped flowing from Filebeat to Logstash. So how can I modify this to expose lineNumber as a field, and then extract the lineNumber value across all lines to plot a graph? Thanks in advance.

@Christian_Dahlqvist, any inputs?

That pattern does not match your log format at all. Have you looked through the blog post and the documentation on the dissect filter that I linked to? Have you tried using it?
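If dissect does not work out, another option is a grok filter that only looks for the lineNumber fragment instead of trying to describe the whole line. A rough sketch (assuming your Filebeat events have type "log" and the message contains something like "lineNumber: 26"):

filter {
  if [type] == "log" {
    grok {
      # capture only the number that follows 'lineNumber'; the rest of the line is ignored
      match => { "message" => "lineNumber[:=]?%{SPACE}%{NUMBER:lineNumber:int}" }
      tag_on_failure => ["_no_linenumber"]
    }
  }
}

Once lineNumber is indexed as its own numeric field, you can plot it in Kibana, for example with a date histogram on the X-axis and an average or max of lineNumber on the Y-axis.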

That was just an example; I had modified it according to my requirements. I'll try the dissect filter.

Let us know if you have any problems with it and we'll try to help.
