How to use Grok for JSON parsing


I want to parse this JSON object into ELK:

"Format": "IDEA0",
"ID": "2b03eb1f-fc4c-4f67-94e5-31c9fb32dccc",
"DetectTime": "2022-01-31T08:16:12.600470+07:00",
"EventTime": "2022-01-31T01:23:01.637438+00:00",
"Category": ['Intrusion.Botnet'],
"Confidence": 0.03,
"Note": "C&C channel, destination IP: port: 8007/tcp score: 0.9324",
"Source": [{'IP4': [''], 'Type': ['CC']}]
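Note that the snippet above is not strictly valid JSON: the single-quoted arrays (e.g. `['Intrusion.Botnet']`) look like Python `repr` output rather than JSON, which would explain why both jq and Logstash's json filter reject it. As a minimal sketch (assuming each alert is a single-line Python-literal dict), one way to convert such a line into strict JSON is:

```python
import ast
import json

# A line in the style of the snippet above: Python repr, not valid JSON
line = "{'Category': ['Intrusion.Botnet'], 'Confidence': 0.03, 'Source': [{'IP4': [''], 'Type': ['CC']}]}"

# ast.literal_eval safely parses Python literals (dicts, lists, strings, numbers)
record = ast.literal_eval(line)

# json.dumps re-serializes the record as strict JSON with double quotes
print(json.dumps(record))
```

This is only an illustration of the quoting problem; the field names are taken from the snippet above, and the preprocessing step would have to run before the file reaches Logstash or jq.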

I've already used this grok config to parse it, but I think the fields won't separate, and I get an error:

input {
        file {
                path => "/home/ubuntu/Downloads/StratosphereLinuxIPS/output/*.json"
                start_position => "beginning"
                sincedb_path => "/dev/null"
        }
}

filter {
        json {
                source => "message"
        }
}

output {
        elasticsearch {
                hosts => ["localhost:9200"]
                index => "test-test"
                user => "***"
                password => "***"
        }
}

Perhaps take a look at this thread.

Hi Stephen,

Thanks for the answer.

Is there any other way to separate it in the Logstash pipeline, so I don't need to run it through the jq command?

It's because I'm encountering this error while using the jq command:

Parse error: Invalid numeric literal at line 6, column 31

The real problem here is this:

I want to separate the fields (see picture) in Kibana, but I don't know how.
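One possible approach inside the Logstash pipeline itself (a sketch, not tested against your data): rewrite the single quotes into double quotes with a mutate/gsub step before handing the event to the json filter. This assumes each event is one whole alert on a single line, and that no field values themselves contain literal single quotes:

```
filter {
        mutate {
                # rewrite Python-style single quotes to JSON double quotes
                # (assumes no literal single quotes inside field values)
                gsub => [ "message", "'", '"' ]
        }
        json {
                source => "message"
        }
}
```

If that assumption doesn't hold for your data, fixing the producer so it emits valid JSON in the first place is the more robust option.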


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.