What is wrong with my JSON file? Why is it not being parsed?

Hi team,

I am writing my logs to a JSON file and the input filter is json directly, but the events are not being parsed correctly in Elasticsearch. Any clue what I am missing here?

Here is my JSON file:

{"org_name": "google.com", "policy_spf": "pass", "org_email": "noreply-dmarc-support@google.com", "policy_dkim": "pass", "policy_pct": "100", "auth_spf_result": "pass","auth_dkim_domain": "xxx,ccc", "auth_dkim_result": "pass", "identifier_header_from": "xxx,ccc", "date_end": "2019-01-04T05:29:59", "date_start": "2019-01-03T05:30:00", "source_ip": "1,.2.3.4", "count": 1, "auth_spf_domain": "xxx,ccc", "policy_p": "none", "submitter": "unknown", "policy_disposition": "none", "policy_domain": "xxx,ccc", "id": "15325652754200102860"}

{"org_name": "google.com", "policy_spf": "fail", "org_email": "noreply-dmarc-support@google.com", "policy_dkim": "fail", "policy_pct": "100", "auth_spf_result": "pass", "identifier_header_from": "mail.xxx,ccc", "date_end": "2019-01-04T05:29:59", "date_start": "2019-01-03T05:30:00", "source_ip": "2.3.4.5", "count": 1, "auth_spf_domain": "apc01-hk2-obe.outbound.protection.outlook.com", "policy_p": "none", "submitter": "unknown", "policy_disposition": "none", "policy_domain": "xxx,ccc", "id": "15325652754200102860"}
{"org_name": "google.com", "policy_spf": "pass", "org_email": "noreply-dmarc-support@google.com", "policy_dkim": "pass", "policy_pct": "100", "auth_spf_result": "pass", "auth_dkim_domain": "xxx,ccc", "auth_dkim_result": "pass", "identifier_header_from": "xxx,ccc", "date_end": "2019-01-04T05:29:59", "date_start": "2019-01-03T05:30:00", "source_ip": "2.2.2.2", "count": 1, "auth_spf_domain": "xxx,ccc", "policy_p": "none", "submitter": "unknown", "policy_disposition": "none", "policy_domain": "xxx,ccc", "id": "15325652754200102860"}
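One thing worth ruling out first: the file has blank lines between events, and a line-oriented input may hand those to the json filter as empty messages, which fail to parse and get tagged _jsonparsefailure. A quick Python sketch (using a hypothetical two-event subset of the fields above) shows the payloads themselves parse fine once blank lines are skipped:

```python
import json

# Hypothetical subset of the events above, separated by a blank line
# exactly as in the original log file.
raw = '''{"org_name": "google.com", "policy_spf": "pass", "count": 1}

{"org_name": "google.com", "policy_spf": "fail", "count": 1}'''

for line in raw.splitlines():
    if not line.strip():
        continue  # a blank line is not valid JSON on its own
    event = json.loads(line)  # raises json.JSONDecodeError if malformed
    print(event["org_name"], event["policy_spf"])
# prints:
# google.com pass
# google.com fail
```

If json.loads succeeds on every non-blank line, the problem is in the Logstash pipeline, not the data.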

And here is the config file:

input {
  file {
    type => "json"
    path => "/log/*.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  if [source_type] == "json-logs" {
    json {
      source => "message"
      tag_on_failure => ["_jsonparsefailure"]
    }
  }
}

Do I need to write a grok pattern for my JSON logs? Since they are JSON, they should get parsed automatically, right?

As far as I can see you do not have a field called ‘source_type’ so your JSON filter will never run.
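A minimal sketch of a conditional that would actually match, assuming you keep the type => "json" setting from the file input above (rather than a source_type field that nothing sets):

filter {
  if [type] == "json" {
    json {
      source => "message"
      tag_on_failure => ["_jsonparsefailure"]
    }
  }
}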

Hmm, so the below should work directly, right?

filter {
  json {
    source => "message"
    tag_on_failure => ["_jsonparsefailure"]
  }
}

You can also use the JSON codec.

As in the configuration, or are you suggesting the JSON codec plugin?

I think either should work.


I believe logstash-codec-json is there by default.

Any alternative?

You need to use the JSON filter (with correct conditionals) or the JSON codec. Either should work.

Nah man, neither worked! I would highly appreciate it if you could pinpoint the mistake, or tell me which stanza I got wrong.

I mean I tried all the combinations and none of them worked. :frowning:

This seems to work fine for me so I am not sure what you are doing wrong:

input {
  generator {
    lines => ['{"org_name": "google.com", "policy_spf": "fail", "org_email": "noreply-dmarc-support@google.com", "policy_dkim": "fail", "policy_pct": "100", "auth_spf_result": "pass", "identifier_header_from": "mail.xxx,ccc", "date_end": "2019-01-04T05:29:59", "date_start": "2019-01-03T05:30:00", "source_ip": "2.3.4.5", "count": 1, "auth_spf_domain": "apc01-hk2-obe.outbound.protection.outlook.com", "policy_p": "none", "submitter": "unknown", "policy_disposition": "none", "policy_domain": "xxx,ccc", "id": "15325652754200102860"}
{"org_name": "google.com", "policy_spf": "pass", "org_email": "noreply-dmarc-support@google.com", "policy_dkim": "pass", "policy_pct": "100", "auth_spf_result": "pass", "auth_dkim_domain": "xxx,ccc", "auth_dkim_result": "pass", "identifier_header_from": "xxx,ccc", "date_end": "2019-01-04T05:29:59", "date_start": "2019-01-03T05:30:00", "source_ip": "2.2.2.2", "count": 1, "auth_spf_domain": "xxx,ccc", "policy_p": "none", "submitter": "unknown", "policy_disposition": "none", "policy_domain": "xxx,ccc", "id": "15325652754200102860"}']
    count => 1
  } 
} 

filter {
  json {
    source => "message"
  }
}

output {
  stdout { codec => rubydebug }
}

Replace the generator with your input and inspect what is written to stdout.

My Input is just simple

input {
  file {
    type => "json"
    codec => json_lines
    path => "/log/*.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

And just

filter {
  json {
    source => "message"
  }
}

Worked?

Try using a JSON codec, not json_lines, and remove the filter.
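For reference, a minimal sketch of the codec approach: the file input reads one line at a time, the json codec decodes each line at the input stage, and the decoded fields land on the event directly, so no json filter is needed afterwards.

input {
  file {
    type => "json"
    codec => "json"
    path => "/log/*.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}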

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.