How to parse mixed JSON logs

Hi,
I am very new to ELK, so I do not know how to parse mixed JSON logs. Here is a sample log that I want to parse; I am getting these logs from Filebeat.

2019-02-03 23:51:54,263 | {" MACID":"00009934","ID":"1","SS":"26","FW":"V5.1.14","TSRC":"R", "STATUS":"SOFT RESET","SN":"25925","PCK":{"M26":"AQPAQF5GUJAERk93BUZPwnhGTy0eRrRuhUazKspGtLcXOVFJUjmdqII4+8z3OhLFrLcnzPe5kgAAAAC3F7lRqIK4+6iCOPsAAD+AAAA/gAAAP4AAAD+AAAAAAJumu0QAAAAAm6a7RBJvuwMSb7qDEm86gxJvuwMSbzsDEm87gxJvOoMSbzuDsTBC8PefQu5aAELwWh1CSJqZOnwAAEIQCj0/V7hSP14KPT9XAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAx3QBAxDPfEuB5/9LhswgSoH0AEgcXOgBAxDXBwAOmVUAAr1iAA4ZpAAA2h4BAwwgAAAABAAAAHW+Ons7l36HXFduJw=="},"RTC":"19/02/03,23:51:35"}

Use cases:

1. I want to calculate the total number of logs based on a particular field from the logs.
   For example: I want to find the total number of logs that match "MACID":"00009934".

2. How do I filter the logs based on one of the fields?
   Let's say I want to search the logs for "STATUS":"SOFT RESET"; it should return all the logs where "STATUS":"SOFT RESET" is found.

If anyone has any clue, please help me.
Thank you

You can parse a message like that using

    dissect { mapping => { "message" => "%{ts} %{+ts} | %{restOfLine}" } }
    json { source => "restOfLine" }
    date { match => [ "ts", "ISO8601" ] }

How to query the number of documents that contain a given field is an elasticsearch (or kibana) question, not a logstash question.
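
As a pointer for the first use case, once the documents contain a MACID field the count can be fetched directly from Elasticsearch, for example with a _count request. This is only a sketch: the index name todayslogs-* and the MACID.keyword sub-field are assumptions based on default dynamic mapping.

    GET todayslogs-*/_count
    {
        "query": {
            "term": { "MACID.keyword": "00009934" }
        }
    }

For the second use case, once STATUS is a field on the event, a Kibana search such as STATUS:"SOFT RESET" in Discover will return all matching logs.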

Thanks Badger for your valuable reply.
How do I add a field from the JSON here so that I can filter records in Kibana based on that field? I also have a doubt: why don't we use grok here, and when should one use grok versus dissect? Any guidance would be appreciated.

Thank you

The filter I showed will result in an event that looks like this once you mutate+remove fields like ts, restOfLine, message, etc.

    {
        "STATUS" => "SOFT RESET",
        "@timestamp" => 2019-02-04T04:51:54.263Z,
        "RTC" => "19/02/03,23:51:35",
        "PCK" => {
            "M26" => "AQPAQF5GUJAERk93BUZPwnhGTy0eRrRuhUazKspGtLcXOVFJUjmdqII4+8z3OhLFrLcnzPe5kgAAAAC3F7lRqIK4+6iCOPsAAD+AAAA/gAAAP4AAAD+AAAAAAJumu0QAAAAAm6a7RBJvuwMSb7qDEm86gxJvuwMSbzsDEm87gxJvOoMSbzuDsTBC8PefQu5aAELwWh1CSJqZOnwAAEIQCj0/V7hSP14KPT9XAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAx3QBAxDPfEuB5/9LhswgSoH0AEgcXOgBAxDXBwAOmVUAAr1iAA4ZpAAA2h4BAwwgAAAABAAAAHW+Ons7l36HXFduJw=="
        },
        " MACID" => "00009934",
        "FW" => "V5.1.14",
        "SS" => "26",
        "SN" => "25925",
        "ID" => "1",
        "TSRC" => "R"
    }

If you ingest that into elasticsearch you will be able to search on a field like STATUS. Note that " MACID" has a leading space in the field name, so you may want to mutate+rename that.

When to use grok and when to use dissect is a matter of taste. I prefer to use dissect on well structured parts of logs, but grok can do the same job.
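
For comparison, a rough grok equivalent of that dissect line might look like this (a sketch; it assumes the built-in TIMESTAMP_ISO8601 and GREEDYDATA patterns, which do match the timestamp prefix in the sample line):

    grok { match => { "message" => "%{TIMESTAMP_ISO8601:ts} \| %{GREEDYDATA:restOfLine}" } }

dissect splits on literal delimiters instead of running regular expressions, so it is usually cheaper than grok for a fixed prefix like this; grok earns its keep when the layout varies.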

Thanks for the reply. Here is my configuration file; it's not working.

    #INPUT

    input {
        beats {
            port => 5044
        }
    }

    #FILTER

    filter {
        if [type] == "log" {

            dissect { mapping => { "message" => "%{ts} %{+ts} | %{restOfLine}" } }

            json {
                source => "restOfLine"
                target => "message"
                add_field => {
                    "mac_addr" => "%{MACID}"
                }
            }

            #this is another way that i tried still not working
            mutate {
                add_field => ["status", "i want to add value from json filed here"]
            }

            date { match => [ "ts", "ISO8601" ] }

        }
    }

    #OUTPUT

    output {
        elasticsearch {
            hosts => ["localhost:9200"]
            index => "todayslogs-%{+YYYY.MM.dd}"
        }
        stdout {
            codec => rubydebug
        }
    }

I am really very confused with the flow of execution. Please help me.

Thank you

You are making changes to the solutions I am suggesting that do not make a lot of sense. Start off with

    dissect { mapping => { "message" => "%{ts} %{+ts} | %{restOfLine}" } }
    json { source => "restOfLine" }
    mutate {
        remove_field => [ "message", "restOfLine" ]
        rename => { " MACID" => "MACID" }
    }
    date { match => [ "ts", "ISO8601" ] }

If you change the json filter to include the option 'target => "message"' then all of the fields are output inside an object called message. I don't see how that can possibly help. You need to approach this systematically. Don't even bother putting the data into elasticsearch at this point. Just look at the output you get from the rubydebug codec on stdout. When you make a change to the filter, see if the change to the output is an improvement. If it is not, then undo the change.
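
One convenient way to iterate like that is a throwaway pipeline that reads a sample line from stdin and prints the parsed event, for example (a sketch reusing the filter above; the file name test.conf is just an example):

    input { stdin { } }
    filter {
        dissect { mapping => { "message" => "%{ts} %{+ts} | %{restOfLine}" } }
        json { source => "restOfLine" }
        mutate {
            remove_field => [ "message", "restOfLine" ]
            rename => { " MACID" => "MACID" }
        }
        date { match => [ "ts", "ISO8601" ] }
    }
    output { stdout { codec => rubydebug } }

Run it with bin/logstash -f test.conf, paste a sample log line, and check whether each filter change improves the printed event before wiring elasticsearch back in.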

I followed the same script you provided, but I am still getting the same output as before. Below is a sample of the output that I am getting:

"_index": "todayslogs-2019.02.12",
"_type": "doc",
"_id": "4nQD42gBCBNxC0o5rrkh",
"_version": 1,
"_score": null,
"_source": {
"@version": "1",
"@timestamp": "2019-02-12T18:41:34.668Z",
"source": "/var/log/test_logs.log",
"input": {
"type": "log"
},
"offset": 54500584,
"tags": [
"beats_input_codec_plain_applied"
],
"beat": {
"hostname": "amolpc-HP-Laptop-14-bs0xx",
"version": "6.6.0",
"name": "amolpc-HP-Laptop-14-bs0xx"
},
"host": {
"architecture": "x86_64",
"containerized": false,
"os": {
"platform": "ubuntu",
"version": "16.04.2 LTS (Xenial Xerus)",
"family": "debian",
"name": "Ubuntu",
"codename": "xenial"
},
"id": "810d928d8f414904937f2c900154e8ee",
"name": "amolpc-HP-Laptop-14-bs0xx"
},
"message": "2019-02-03 23:51:54,263 | {"MACID":"0418003f","ID":"1","SS":"26","FW":"V5.1.14","TSRC":"R","SN":"25925","PCK":{"M26":"AQPAQF5GUJAERk93BUZPwnhGTy0eRrRuhUazKspGtLcXOVFJUjmdqII4+8z3OhLFrLcnzPe5kgAAAAC3F7lRqIK4+6iCOPsAAD+AAAA/gAAAP4AAAD+AAAAAAJumu0QAAAAAm6a7RBJvuwMSb7qDEm86gxJvuwMSbzsDEm87gxJvOoMSbzuDsTBC8PefQu5aAELwWh1CSJqZOnwAAEIQCj0/V7hSP14KPT9XAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAx3QBAxDPfEuB5/9LhswgSoH0AEgcXOgBAxDXBwAOmVUAAr1iAA4ZpAAA2h4BAwwgAAAABAAAAHW+Ons7l36HXFduJw=="},"RTC":"19/02/03,23:51:35"}",
"prospector": {
"type": "log"
},
"log": {
"file": {
"path": "/var/log/test_logs.log"
}
}
},
"fields": {
"@timestamp": [
"2019-02-12T18:41:34.668Z"
]
},
"sort": [
1549996894668
]
}

There is no change in the output. If I remove the filter part from the configuration, I still get the same result. My only intention here is to map the fields from the JSON so that I can use those fields in Kibana for filtering the data.

Thank you

You made the filter conditional upon '[type] == "log"'

You do not have a field called type. You have _type, [input][type], or [prospector][type]
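
If a conditional is wanted at all, it has to test a field that actually exists on the event. A sketch using [input][type], which the Filebeat output above shows being set to "log":

    filter {
        if [input][type] == "log" {
            dissect { mapping => { "message" => "%{ts} %{+ts} | %{restOfLine}" } }
            json { source => "restOfLine" }
            date { match => [ "ts", "ISO8601" ] }
        }
    }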

Now I have removed the if condition from the configuration; see below:

    filter {
        dissect { mapping => { "message" => "%{ts} %{+ts} | %{restOfLine}" } }
        json { source => "restOfLine" }
        mutate {
            # add_field => ["macid" : "dynamic field from json"]  it should visible in kibana for filtering the data
            remove_field => [ "message", "restOfLine" ]
            rename => { " MACID" => "MACID" }
        }
        date { match => [ "ts", "ISO8601" ] }
    }

And I have written the output to a file; it shows the following output:

{"FW":"V5.1.14","prospector":{"type":"log"},"RTC":"19/02/03,23:51:35","input":{"type":"log"},"PCK":{"M26":"AQPAQF5GUJAERk93BUZPwnhGTy0eRrRuhUazKspGtLcXOVFJUjmdqII4+8z3OhLFrLcnzPe5kgAAAAC3F7lRqIK4+6iCOPsAAD+AAAA/gAAAP4AAAD+AAAAAAJumu0QAAAAAm6a7RBJvuwMSb7qDEm86gxJvuwMSbzsDEm87gxJvOoMSbzuDsTBC8PefQu5aAELwWh1CSJqZOnwAAEIQCj0/V7hSP14KPT9XAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAx3QBAxDPfEuB5/9LhswgSoH0AEgcXOgBAxDXBwAOmVUAAr1iAA4ZpAAA2h4BAwwgAAAABAAAAHW+Ons7l36HXFduJw=="},"@version":"1","MACID":"0418003f","host":{"containerized":false,"architecture":"x86_64","name":"amolpc-HP-Laptop-14-bs0xx","os":{"codename":"xenial","platform":"ubuntu","name":"Ubuntu","version":"16.04.2 LTS (Xenial Xerus)","family":"debian"},"id":"810d928d8f414904937f2c900154e8ee"},"beat":{"name":"amolpc-HP-Laptop-14-bs0xx","version":"6.6.0","hostname":"amolpc-HP-Laptop-14-bs0xx"},"TSRC":"R","@timestamp":"2019-02-03T18:21:54.263Z","offset":55569056,"SN":"25925","macid":"MACID","source":"/var/log/test_logs.log","ID":"1","SS":"26","log":{"file":{"path":"/var/log/test_logs.log"}},"ts":"2019-02-03 23:51:54,263","tags":["beats_input_codec_plain_applied"]}

I have then pointed the output to Elasticsearch; it is able to create the index and the data is also available there, but it is not showing in Kibana.
Even if I get the data in Kibana it is not useful for me, because what I want is all of the log data along with the fields extracted from the JSON, like SN and MACID, so that I can filter the records in Kibana based on the extracted fields.

All log data should remain in the message field, and at the same time I need the extracted fields from the JSON, like SN and MACID, to filter the data in Kibana, so that I can create dashboards, charts, and so on.

Please help me get this done; I will really appreciate your support.

Note: there is no need to remove the space from MACID, because that was a mistake I made while copying/pasting.
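
If the requirement is to keep the whole original line in message while still getting the extracted fields, one option is simply to leave message out of remove_field. A sketch based on the filter above (the rename is dropped, per the note about MACID):

    filter {
        dissect { mapping => { "message" => "%{ts} %{+ts} | %{restOfLine}" } }
        json { source => "restOfLine" }
        mutate { remove_field => [ "restOfLine" ] }
        date { match => [ "ts", "ISO8601" ] }
    }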

So the data is in elasticsearch but you cannot see it in Kibana? Did you update the index pattern? Did you set the time picker to include 2019-02-03T18:21:54.263Z? (Use something like Last Month.)

I have deleted the previous index from Elasticsearch and created a new one; the problem is still the same.
When I remove the code you provided from the filter, it is able to send the logs to Elasticsearch and the logs show up in Kibana too, like this:

    filter {}

How can I see the extracted fields from the JSON in Kibana for filtering, while the original log data stays available in the message field? That is my actual requirement.

Comment out the mutate filter, leaving the dissect and json filters. What do you then get when you use

    output { stdout { codec => rubydebug } }

Where do I see the output of this?
I tried with a file; I have written the output to the file.

You would see it on stdout. But no matter, we can see from that entry you wrote to the file that the individual fields of the message were parsed out. For example in

"TSRC":"R","@timestamp":"2019-02-03T18:21:54.263Z","offset":55569056,"SN":"25925"

both TSRC and SN are fields that were in the log message and are now fields on the event. So, once again... Did you update the index pattern in Kibana? Did you set the time picker to include 2019-02-03T18:21:54.263Z?

The output written to the file is as follows:

{"message":"2019-02-03 23:51:54,263 | {"MACID":"0418003f","ID":"1","SS":"26","FW":"V5.1.14","TSRC":"R","SN":"25925","PCK":{"M26":"AQPAQF5GUJAERk93BUZPwnhGTy0eRrRuhUazKspGtLcXOVFJUjmdqII4+8z3OhLFrLcnzPe5kgAAAAC3F7lRqIK4+6iCOPsAAD+AAAA/gAAAP4AAAD+AAAAAAJumu0QAAAAAm6a7RBJvuwMSb7qDEm86gxJvuwMSbzsDEm87gxJvOoMSbzuDsTBC8PefQu5aAELwWh1CSJqZOnwAAEIQCj0/V7hSP14KPT9XAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAx3QBAxDPfEuB5/9LhswgSoH0AEgcXOgBAxDXBwAOmVUAAr1iAA4ZpAAA2h4BAwwgAAAABAAAAHW+Ons7l36HXFduJw=="},"RTC":"19/02/03,23:51:35"}","offset":61875000,"@timestamp":"2019-02-03T18:21:54.263Z","restOfLine":"{"MACID":"0418003f","ID":"1","SS":"26","FW":"V5.1.14","TSRC":"R","SN":"25925","PCK":{"M26":"AQPAQF5GUJAERk93BUZPwnhGTy0eRrRuhUazKspGtLcXOVFJUjmdqII4+8z3OhLFrLcnzPe5kgAAAAC3F7lRqIK4+6iCOPsAAD+AAAA/gAAAP4AAAD+AAAAAAJumu0QAAAAAm6a7RBJvuwMSb7qDEm86gxJvuwMSbzsDEm87gxJvOoMSbzuDsTBC8PefQu5aAELwWh1CSJqZOnwAAEIQCj0/V7hSP14KPT9XAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAx3QBAxDPfEuB5/9LhswgSoH0AEgcXOgBAxDXBwAOmVUAAr1iAA4ZpAAA2h4BAwwgAAAABAAAAHW+Ons7l36HXFduJw=="},"RTC":"19/02/03,23:51:35"}","SS":"26","PCK":{"M26":"AQPAQF5GUJAERk93BUZPwnhGTy0eRrRuhUazKspGtLcXOVFJUjmdqII4+8z3OhLFrLcnzPe5kgAAAAC3F7lRqIK4+6iCOPsAAD+AAAA/gAAAP4AAAD+AAAAAAJumu0QAAAAAm6a7RBJvuwMSb7qDEm86gxJvuwMSbzsDEm87gxJvOoMSbzuDsTBC8PefQu5aAELwWh1CSJqZOnwAAEIQCj0/V7hSP14KPT9XAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAx3QBAxDPfEuB5/9LhswgSoH0AEgcXOgBAxDXBwAOmVUAAr1iAA4ZpAAA2h4BAwwgAAAABAAAAHW+Ons7l36HXFduJw=="},"tags":["beats_input_codec_plain_applied"],"input":{"type":"log"},"MACID":"0418003f","ID":"1","TSRC":"R","host":{"architecture":"x86_64","name":"amolpc-HP-Laptop-14-bs0xx","id":"810d928d8f414904937f2c900154e8ee","os":{"name":"Ubuntu","version":"16.04.2 LTS (Xenial Xerus)","platform":"ubuntu","family":"debian","codename":"xenial"},"containerized":false},"ts":"2019-02-03 23:51:54,263","prospector":{"type":"log"},"FW":"V5.1.14","SN":"25925","@version":"1","beat":{"hostname":"amolpc-HP-Laptop-14-bs0xx","name":"amolpc-HP-Laptop-14-bs0xx","version":"6.6.0"},"log":{"file":{"path":"/var/log/test_logs.log"}},"RTC":"19/02/03,23:51:35","source":"/var/log/test_logs.log"}

My final attempt... the timestamp on the document is

"@timestamp":"2019-02-03T18:21:54.263Z"

That's Feb 3rd. Today is Feb 13th where I am. You have Kibana set to display documents from "Today". That's not going to include documents from the 3rd. Adjust the time picker.

Apologies if I troubled you, but the logs you are referring to above are all sample logs for testing purposes; that is why they show older dates.

I have commented out the date filter in the filter code and now it's showing the logs.

Thank you very much for your kind support.

Everything is working fine for the first type of log, but I am getting different types of logs from the server. How do I map those logs too? Here are some sample logs.

Working fine:
2019-02-03 23:51:54,263 | {"MACID":"00009934","ID":"1","SS":"26","FW":"V5.1.14","TSRC":"R","SN":"25925","PCK":{"M26":"AQPAQF5GUJAERk93BUZPwnhGTy0eRrRuhUazKspGtLcXOVFJUjmdqII4+8z3OhLFrLcnzPe5kgAAAAC3F7lRqIK4+6iCOPsAAD+AAAA/gAAAP4AAAD+AAAAAAJumu0QAAAAAm6a7RBJvuwMSb7qDEm86gxJvuwMSbzsDEm87gxJvOoMSbzuDsTBC8PefQu5aAELwWh1CSJqZOnwAAEIQCj0/V7hSP14KPT9XAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAx3QBAxDPfEuB5/9LhswgSoH0AEgcXOgBAxDXBwAOmVUAAr1iAA4ZpAAA2h4BAwwgAAAABAAAAHW+Ons7l36HXFduJw=="},"RTC":"19/02/03,23:51:35"}


Not working, getting an error for this:

2019-02-03 23:52:11,940 | [V4,0833364F,533.330,0,0,533.330,0,0,0,0,-0.849,0,0,-0.849,628.064,0,0,628.064,432.013,
431.847,433.645,430.547,249.423,247.824,251.089,249.355,1.055,0,0,3.164,49.975,38982176.000,0
,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,19/02/03,23:52:12]

How do I map both types of logs? It would be appreciated if you could help me with this.
Thank you
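
One possible way to handle both formats is to branch on the shape of the text after the timestamp, so the json filter only runs on lines that really contain JSON. This is only a sketch: the column names for the bracketed, comma-separated format are hypothetical placeholders, since the meaning of those values is not given, and if that record really spans several physical lines it would first need to be joined into one event (for example with Filebeat's multiline settings).

    filter {
        dissect { mapping => { "message" => "%{ts} %{+ts} | %{restOfLine}" } }
        if [restOfLine] =~ /^\{/ {
            # JSON payload, handled as before
            json { source => "restOfLine" }
        } else if [restOfLine] =~ /^\[/ {
            # bracketed comma-separated payload: strip the brackets, then split on commas
            dissect { mapping => { "restOfLine" => "[%{csvdata}]" } }
            csv {
                source => "csvdata"
                # placeholder column names; replace them with the real meaning of each value
                columns => [ "format", "device_id", "reading1", "reading2" ]
            }
        }
        date { match => [ "ts", "ISO8601" ] }
    }

Values beyond the named columns get auto-generated names such as column5, which can be renamed once their meaning is known.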