Grok WARN without using it

Filebeat config:

filebeat.prospectors:
  - paths:
      - /*.log
    tags: ["internal", "log"]
    json.message_key: msg
    json.add_error_key: true
    harvester_limit: 500
    close_inactive: 5m

  - paths:
      - /other/*.log
    tags: ["one", "two"]
    json.message_key: msg
    json.add_error_key: true
    harvester_limit: 900
    close_inactive: 1m
    close_eof: true
    clean_removed: true

output:
  logstash:
    hosts: ["logstashhost:5000"]
    loadbalance: true
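
Because json.message_key is set without json.keys_under_root, Filebeat decodes each line as JSON and puts the decoded object under a json field on the event, which is why the template below nests everything under json. A line these prospectors would accept looks like this (a hypothetical sample, shaped like the real one further down):

{"step":"personGroup","tags":["exists"],"msg":"w:1","keyValues":{"srcRow":1128}}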

Logstash pipeline:

input {
	beats {
		port => 5000
	}
}

filter {
	# Parse the nested startTime into @timestamp.
	# Sample value: 2018-01-05T15:48:34+01:00
	date {
		match => [ "[json][keyValues][startTime]", "yyyy-MM-dd'T'HH:mm:ssZZ" ]
		add_tag => ["dated"]
	}

	# Drop Beats metadata we don't need in Elasticsearch.
	mutate {
		remove_field => ["offset", "input_type", "beat", "host", "type"]
	}
}

output {
	elasticsearch {
		id => "log-meta"
		hosts => ["http://ingestion:9200"]
		index => "datasync-meta-%{+YYYY.MM.dd}"
		template => "/usr/share/logstash/config/log-template.json"
		template_name => "log"
	}

	elasticsearch {
		id => "log-all"
		hosts => ["http://ingestion:9200"]
		index => "log-all-%{+YYYY.MM.dd}"
		template => "/usr/share/logstash/config/log-template.json"
		template_name => "log"
	}

	# stdout { codec => rubydebug }
}
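
If you want to rule out config mistakes before deploying, Logstash can validate the pipeline file without starting up (the path here is an assumption based on the docker image defaults):

bin/logstash --config.test_and_exit -f /usr/share/logstash/pipeline/logstash.conf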

And the template file:

{
  "index_patterns": [
    "log-*"
  ],
  "version": 1,
  "settings": {
    "index.refresh_interval": "5s"
  },
  "mappings": {
    "doc": {
      "dynamic": false,
      "properties": {
        "@timestamp": {"type": "date"},
        "@version": {"type": "keyword"},
        "json": {
          "properties": {
            "keyValues": {
              "properties": {
                "ewdId": {"type": "long"},
                "srcRow": {"type": "long"},
                "runId": {"type": "text"},
                "startTime": {"type": "date"},
                "cmd": {"type": "text"},
                "class": {"type": "keyword"},
                "tenant": {"type": "keyword"},
                "action": {"type": "keyword"},
                "file": {"type": "text"},
                "pid": {"type": "long"},
                "jobName": {"type": "keyword"}
              }
            },
            "msg": {"type": "text"},
            "step": {
              "type": "keyword"
            },
            "tags": {
              "type": "text",
              "norms": false,
              "fields": {
                "keyword": {
                  "type": "keyword",
                  "ignore_above": 256
                }
              }
            }
          }
        },
        "source": {
          "type": "text",
          "norms": false,
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        },
        "tags": {
          "type": "text",
          "norms": false,
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        },
        "type": {
          "type": "text",
          "norms": false,
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        }
      }
    }
  }
}
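
Once Logstash has pushed this template, you can check what Elasticsearch actually stored with a plain GET (assuming the ingestion host is reachable from your shell):

curl 'http://ingestion:9200/_template/log?pretty'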

This leads to the warning:

logstash_1 | [2018-01-27T21:04:50,555][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"log-all-2018.01.27", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x4ce400e5>], :response=>{"index"=>{"_index"=>"log-all-2018.01.27", "_type"=>"doc", "_id"=>"sT5vOWEBsEwqrBiS5-5U", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [json.keyValues.srcRow]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"For input string: \"38075 by rule 0\""}}}}}

I can't see why, because:

  1. I'm not using the grok plugin.
  2. The type of json.keyValues.srcRow is long, not string/text.

It is only a warning, but it fills my logs :-/ and I would like to figure out how to avoid it.

I'm not using the grok plugin.

No, and the error message isn't complaining about grok either.

The type of json.keyValues.srcRow is long, not string/text.

The error message indicates that you're trying to send a document where that field contains "38075 by rule 0", which clearly can't be parsed as a long.
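
You can reproduce the same 400 without Logstash by indexing such a document directly (the index name is just an example; it matches the log-* template pattern):

curl -XPOST 'http://ingestion:9200/log-test-2018.01.27/doc' \
  -H 'Content-Type: application/json' \
  -d '{"json":{"keyValues":{"srcRow":"38075 by rule 0"}}}'

Note that a clean numeric string like "38075" would still index fine, because Elasticsearch coerces such strings into long fields by default; it is the trailing " by rule 0" that makes coercion impossible.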

The error message indicates that you're trying to send a document where that field contains "38075 by rule 0", which clearly can't be parsed as a long.

Right, but the content isn't "38075 by rule 0". It is 38075, and I don't know where the rest is coming from.

If I use the stdout output plugin, I get:

logstash_1   | {
logstash_1   |           "tags" => [
logstash_1   |         [0] "xxx",
logstash_1   |         [1] "log",
logstash_1   |         [2] "beats_input_raw_event"
logstash_1   |     ],
logstash_1   |         "source" => "/1.log",
logstash_1   |     "@timestamp" => 2018-01-29T08:46:31.216Z,
logstash_1   |           "json" => {
logstash_1   |               "msg" => "w:1",
logstash_1   |              "step" => "personGroup",
logstash_1   |         "keyValues" => {
logstash_1   |                   "srcRow" => 1128,
logstash_1   |                       "id" => "1"
logstash_1   |         },
logstash_1   |              "tags" => [
logstash_1   |             [0] "exists"
logstash_1   |         ]
logstash_1   |     },
logstash_1   |       "@version" => "1"
logstash_1   | }

There you can see that the type long should fit srcRow.

Robert

That stdout example is the result of a different input line. What if you try with the 38075 line?

Yes, it is. It is sent by Filebeat, but it has the same syntax as this line:

{"step":"pppp","tags":["exists"],"msg":"id:20","keyValues":{"runId":"bb27bc7f78aed86a7bbf8b0790bb65a214e1f976","identifiedBy":"modelAttrKey '20' for attrRule 'id'","pid":"20","srcRow":0}}

Here you can see that the type of "srcRow" is also numeric.

I declared the field "srcRow" as a long, like "pid", which comes encoded as a JSON string?

You were right: there was a row number containing text. Thanks. I changed the type to "text" and that fixed the problem.
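
For completeness, the change in the template was along these lines (only the srcRow mapping differs):

"srcRow": {"type": "text"},

Keep in mind that templates only apply when an index is created, so the new mapping takes effect with the next daily index, not the existing ones. If you would rather keep the field numeric and silently drop unparsable values instead, the ignore_malformed mapping parameter is an alternative worth looking at.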

Thank you!
