Logstash Filebeat integration issue with custom logs

Hi,

I have a custom log generated by a shell script, as shown below:

06:48:18 PM CPU %user %nice %system %iowait %steal %idle
06:48:19 PM all 3.05 0.00 0.54 0.46 0.00 95.95
06:48:20 PM all 3.19 0.00 0.50 0.21 0.00 96.10
06:48:21 PM all 3.67 0.00 0.56 0.46 0.00 95.31
06:48:22 PM all 2.92 0.00 0.52 0.92 0.00 95.64
06:48:23 PM all 2.97 0.00 0.44 0.38 0.00 96.22
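
The columns match the per-second CPU report from sysstat's sar. A hypothetical sketch of a script that could produce such a file (the exact script is an assumption, not something shown in the thread):

# Hypothetical sketch: append one CPU usage line per second to the file
# that Filebeat watches; "sar -u 1" prints exactly these columns.
sar -u 1 | grep --line-buffered -v -e Average -e Linux >> /root/sar_logs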

I used Filebeat to transfer the log contents to Logstash. Logstash produces the JSON document below:

{
  "_index": "logstash-2017.01.21",
  "_type": "my_log",
  "_id": "AVnBHC-fDiezc7ujylT8",
  "_score": null,
  "_source": {
    "@timestamp": "2017-01-21T12:59:27.397Z",
    "offset": 213953,
    "@version": "1",
    "beat": {
      "hostname": "hypervisor.airframe.cbis.eirmnp.nsn-rdnet.net",
      "name": "hypervisor.airframe.cbis.eirmnp.nsn-rdnet.net",
      "version": "5.1.2"
    },
    "input_type": "log",
    "host": "hypervisor.airframe.cbis.eirmnp.nsn-rdnet.net",
    "source": "/root/sar_logs",
    "message": "06:29:26 PM all 3.02 0.00 0.46 0.77 0.00 95.74",
    "type": "my_log",
    "tags": [
      "beats_input_codec_plain_applied"
    ]
  },
  "fields": {
    "@timestamp": [
      1485003567397
    ]
  },
  "sort": [
    1485003567397
  ]
}

The message field in the JSON is a single string. How can I parse this message field into separate fields?
Is there any template available?

Regards,
Ramakrishna

There are multiple filters that can help you with this. I suspect the grok filter will be the best fit for splitting the string, but you'll also want a date filter to get the @timestamp field right.
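
A minimal sketch of such a grok filter, assuming the space-separated format shown above (the field names are illustrative, not something the thread settled on):

filter {
  grok {
    # Parse lines like "06:48:19 PM all 3.05 0.00 0.54 0.46 0.00 95.95";
    # capture the clock time plus AM/PM in one field for a later date filter.
    match => {
      "message" => "(?<logtime>%{TIME} %{WORD}) %{WORD:cpu} %{NUMBER:user:float} %{NUMBER:nice:float} %{NUMBER:system:float} %{NUMBER:iowait:float} %{NUMBER:steal:float} %{NUMBER:idle:float}"
    }
  }
}

The header line won't match this pattern, so those events will be tagged _grokparsefailure; you can drop them with a conditional if needed.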

Hi,

Thanks for the response. After reading various posts, I used the Logstash configuration below:

[root@elk-stack ~]# cd /etc/logstash/conf.d/
[root@elk-stack conf.d]# cat 02-input.conf
input {
  beats {
    type => beats
    port => 5044
  }
}
[root@elk-stack conf.d]# cat 10-filter.conf
filter {
  csv {
    columns => [ "time", "CPU", "user", "nice", "system", "iowait", "steal", "idle" ]
    separator => ","
    remove_field => ["message"]
  }
  mutate {
    convert => {
      "user"   => "float"
      "nice"   => "float"
      "system" => "float"
      "iowait" => "float"
      "steal"  => "float"
      "idle"   => "float"
    }
  }
}
[root@elk-stack conf.d]# cat 30-output.conf
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
[root@elk-stack conf.d]#

Output on Kibana Discover:

{
  "_index": "logstash-2017.01.22",
  "_type": "my_log",
  "_id": "AVnDsp-IDiezc7uj67Ui",
  "_score": null,
  "_source": {
    "offset": 616138,
    "steal": 0,
    "idle": 95,
    "input_type": "log",
    "CPU": "all",
    "source": "/root/sar_logs",
    "type": "my_log",
    "nice": 0,
    "tags": [
      "beats_input_codec_plain_applied"
    ],
    "@timestamp": "2017-01-22T01:02:56.853Z",
    "system": 0.54,
    "@version": "1",
    "beat": {
      "hostname": "hypervisor.airframe.cbis.eirmnp.nsn-rdnet.net",
      "name": "hypervisor.airframe.cbis.eirmnp.nsn-rdnet.net",
      "version": "5.1.2"
    },
    "host": "hypervisor.airframe.cbis.eirmnp.nsn-rdnet.net",
    "time": "06:32:56",
    "iowait": 0.92,
    "user": 3.54
  },
  "fields": {
    "@timestamp": [
      1485046976853
    ]
  },
  "sort": [
    1485046976853
  ]
}

Now I am able to split the CSV data into separate fields. I am facing two issues:

  1. Kibana Visualization has issues displaying the stats (I somehow managed to get a graph, but I'm not clear on what I did).
  2. The timestamp available in the log is not used; only the @timestamp assigned at indexing time is available for visualization.

I will keep working on this in the coming week; my aim is to display all my servers' data using the ELK stack.
Thanks for the suggestion. I will read up on grok filter usage as well.

Regards,
Ramakrishna

Kibana Visualization has issues displaying the stats (I somehow managed to get a graph, but I'm not clear on what I did).

What do you want to achieve?

The timestamp available in the log is not used; only the @timestamp assigned at indexing time is available for visualization.

You need a date filter for that. It's very unfortunate that the log doesn't include the date along with the time, but I think Logstash will default to the current date.
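
A minimal sketch, assuming the csv filter above already yields a time field in 24-hour format (the pattern is an assumption; adjust it to whatever the field actually contains):

filter {
  date {
    # Hypothetical: parse the "time" field produced by the csv filter.
    # The log carries no date component, so the missing date parts depend
    # on the filter's defaults; including a full date in the log is safer.
    match => ["time", "HH:mm:ss"]
  }
}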

Hi,

I apologize for not providing clear information.

I want to display server performance statistics live from different servers using the Kibana Dashboard.
For this I have done the following:

  1. Created a server for the ELK stack and installed the following components:
    Elasticsearch 5.1
    Kibana 5.1
    Logstash 5.1

  2. On a client Linux server whose data is to be displayed, I have installed Filebeat (a sample configuration sketch appears after this list).
    I now generate the file /root/sar_logs on this server every second, containing the data below:
    06:48:18 PM CPU
    06:48:19 PM all 3.05 0.00 0.54 0.46 0.00 95.95
    06:48:20 PM all 3.19 0.00 0.50 0.21 0.00 96.10
    06:48:21 PM all 3.67 0.00 0.56 0.46 0.00 95.31
    06:48:22 PM all 2.92 0.00 0.52 0.92 0.00 95.64
    06:48:23 PM all 2.97 0.00 0.44 0.38 0.00 96.22

  3. Now I want to visualize this data live on the Kibana Dashboard:
    on X-Axis: time
    on Y-Axis: %user
    %nice
    %system
    %iowait
    %steal
    %idle
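
For step 2, a hypothetical filebeat.yml sketch along the lines of what is described above (the path and the Logstash host are assumptions based on the thread; Filebeat 5.x syntax):

filebeat.prospectors:
- input_type: log
  paths:
    - /root/sar_logs
  document_type: my_log

output.logstash:
  hosts: ["elk-stack:5044"]

Here document_type is what sets the type field (my_log) visible in the events above.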

But when I go to Kibana Visualize and select New -> Line Chart, I am not able to work out which parameters will produce this graph.

The graph only appears if I select "Max" on the Y-Axis, add the metrics, and choose "Date Histogram" on the X-Axis.

I do not understand why Max has to be selected; I am reading the Kibana documentation on this.
As suggested, I will include the date and 24-hour time in my log file.

Hmm, I'm not sure how to add multiple metrics from the same documents to the same chart. I suggest you ask about that in the Kibana category.
