Help with JSON values received in Kibana

I am trying out the ELK stack for academic purposes. I have reached the point where Kibana is receiving logs from another VM's mod_security audit log, but I cannot categorize the entries by IP because there is no IP field. Can anyone help with this problem?
Here is a sample log

{
  "_index": "filebeat-2018.08.30",
  "_type": "doc",
  "_id": "H2vKiWUB-x2104jz0DvB",
  "_version": 1,
  "_score": null,
  "_source": {
    "@version": "1",
    "host": {
      "name": "debian"
    },
    "type": "syslog",
    "@timestamp": "2018-08-30T07:44:53.759Z",
    "message": "[30/Aug/2018:03:44:46 --0400] W4egbn8AAQEAAAkbUBYAAABO 50110 80",
    "input": {
      "type": "log"
    },
    "tags": [ … ],
    "beat": {
      "version": "6.3.2",
      "name": "debian",
      "hostname": "debian"
    },
    "prospector": {
      "type": "log"
    },
    "source": "/var/log/apache2/modsec_audit.log",
    "offset": 64192
  },
  "fields": {
    "@timestamp": [ … ]
  },
  "sort": [ … ]
}

As you can see, the message field contains the IP. Can I have a separate field for the IP?

Hey @aravind2579,

To clarify, are you looking for a way to extract the IP Address from the message field?

"message": "[30/Aug/2018:03:44:46 --0400] W4egbn8AAQEAAAkbUBYAAABO 50110 80"

If so, you will need to split that out into its own field before sending the document to Elasticsearch.

How are you sending your logs to Elasticsearch now? Both Beats and Logstash have mechanisms for parsing log files to extract this type of information. I'd recommend asking for help in their discussion boards if you need help getting that set up.

Filebeat is collecting the log and passing it through to Logstash. Logstash is passing the data to Elasticsearch.

Is there documentation on how to split the log?

Logstash has extensive documentation, including examples of splitting log entries into fields.

You will likely use a grok pattern to split the entries. If so, you can take advantage of the Grok Debugger to test your pattern.
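As a starting point, a minimal Logstash filter sketch might look like the following. This assumes the standard mod_security audit-log "A"-section header, which normally includes the client and server IPs between the unique ID and the port numbers; the field names (client_ip, server_port, etc.) are illustrative choices, not anything standard.

```
filter {
  grok {
    # Hypothetical pattern for a mod_security audit-log "A" header of the form:
    #   [timestamp] unique_id client_ip client_port server_ip server_port
    # DATA is used for the timestamp because mod_security writes the UTC
    # offset as "--0400", which the stricter date patterns may not match.
    match => {
      "message" => "\[%{DATA:modsec_timestamp}\] %{NOTSPACE:unique_id} %{IP:client_ip} %{NUMBER:client_port} %{IP:server_ip} %{NUMBER:server_port}"
    }
  }
}
```

Note that the sample message in your document appears to contain only the ports, with no IPs, so check what your raw log lines actually look like and adjust the pattern (ideally in the Grok Debugger) before relying on it.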

Thank you.
I will look into it.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.