Logstash can't process all JSON logs


(Ferdous Shibly) #1

Hi,

We are sending logs directly from a Django-based Python application to Logstash over TCP port 9111. We found that some logs are missing, for example:

{"@timestamp": "2018-07-13T14:38:23.741Z", "remote_addr": "10.0.2.2", "host": "ubuntu-xenial", "message": "Successful multi-factor authentication step", "path": "/home/ubuntu/themis_airflow/profiles/views.py", "@version": "1", "stack_info": null, "logger_name": "profiles.views", "auth": {"user": 152390415851211, "step": 1, "mfa": "REQ"}, "type": "flow-django-local", "tags": ["mfa-success"], "level": "INFO"}

{"@timestamp": "2018-07-13T14:39:28.506Z", "remote_addr": "10.0.2.2", "host": "ubuntu-xenial", "message": "Successful multi-factor authentication step", "path": "/home/ubuntu/themis_airflow/profiles/views.py", "@version": "1", "stack_info": null, "logger_name": "profiles.views", "auth": {"user": 152390415851211, "step": 1, "mfa": "REQ"}, "type": "flow-django-local", "tags": ["mfa-success"], "level": "INFO"}

We found the first log in Kibana, but the second one never appeared.

Here is our Logstash configuration:

input {
  tcp {
    host  => "0.0.0.0"
    port  => 9111
    codec => "json"
    tags  => ["django"]
  }
}

filter {
  if "django" in [tags] {
      json {
        source => "message"
      }
  }
}

output {

  if "django" in [tags] {
    amazon_es {
      hosts  => ["<AWS-ES>"]
      region => "<region>"
      index  => "django-%{+YYYY.MM.dd}"
    }
  }

  else {
    null {}
  }

}

output {
  if "django" in [tags] {
    s3 {
      region => "<region>"
      bucket => "dotdash-qa-application-logs"
      prefix => "app/django/%{+YYYY.MM.dd}"
      size_file => 2146304
      time_file => 5
      codec => json
      canned_acl => "private"
    }
  }
  else {
    null {}
  }
}

We found the following warning in the Logstash log:

[2018-07-13T15:07:04,097][WARN ][logstash.filters.json ] Error parsing json {:source=>"message", :raw=>"Login failed.", :exception=>#<LogStash::Json::ParserError: Unrecognized token 'Login': was expecting ('true', 'false' or 'null')
at [Source: (byte[])"Login failed."; line: 1, column: 7]>}

Any help would be appreciated.

Ferdous Shibly


#2

If you are using a json codec on the input, the "message" field will be something like "Successful multi-factor authentication step", and that is not valid JSON. You do not need a json filter if you use the codec.
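For illustration, that means keeping the tcp input exactly as it is and simply dropping the filter block; a minimal sketch, reusing the values from the config above:

input {
  tcp {
    host  => "0.0.0.0"
    port  => 9111
    codec => "json"       # the codec already parses each incoming JSON line into event fields
    tags  => ["django"]
  }
}

# No filter block: "message" now only contains the human-readable text
# (e.g. "Successful multi-factor authentication step"), which is not JSON,
# so running it through a json filter can only produce parse failures.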


(Ferdous Shibly) #3

After removing the filter block, we found that some tags are missing. We only get the tags that are added in the input stage, and some logs are still missing.


#4

If that is your complete configuration, it is hard to see how that could be happening. But I would suggest replacing those null outputs with something like a file output to see whether any events are going through those branches.
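For example, something along these lines in each output block (the path is just an illustrative placeholder):

  else {
    file {
      # anything that lands here never received the "django" tag
      path => "/tmp/logstash-untagged-events.log"
    }
  }

Anything written to that file is an event that never got the django tag and therefore never reaches the amazon_es or s3 outputs.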


(Ferdous Shibly) #5

The only message I found is:

[2018-07-15T23:16:31,025][WARN ][logstash.filters.json ] Error parsing json {:source=>"message", :raw=>"Login failed.", :exception=>#<LogStash::Json::ParserError: Unrecognized token 'Login': was expecting ('true', 'false' or 'null')
at [Source: (byte[])"Login failed."; line: 1, column: 7]>}


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.