Grok pattern matches in the debugger but not in Kibana

config in pipeline/logstash.conf:

filter {
	grok { 
		match => {
			"message" => [
				"\[%{TIMESTAMP_ISO8601:timestamp}\] \[%{LOGLEVEL:level}\] \[%{PATH:path}:%{POSINT:line}\] \[%{WORD:node}\] \[%{GREEDYDATA:message}\]",
				"%{COMBINEDAPACHELOG}"
				]
		}
		overwrite => [ "message" ]
	}

	urldecode {
		field => "request"
	}
}

Example log lines:

[2018-10-19 15:22:30.709] [INFO] [/build/app.js:42] [master] [POST:/app/games/2/posts/25571/relationships 400 11.250 ms 221.213.146.164 {"openid":"oAl2A0i_3kpAodDyAR53CZFVgX-g"}]
140.207.54.75 - - [19/Oct/2018:15:38:14 +0800] "POST /wx56b01188481383cb/callback?signature=2688cea83553f2cb6f9c2b76eb3381108cca0f21&timestamp=1539934693&nonce=1210898241&openid=oAl2A0pnnLJEtF1fb1rNr69Zsa6s&encrypt_type=aes&msg_signature=54a361c14e8f73a81ca8189189b42b78c0ea4c5f HTTP/1.1" 200 7 "-" "Mozilla/4.0" "-"

Result in the Grok debugger:

{
  "path": "/build/app.js",
  "node": "master",
  "level": "INFO",
  "line": "42",
  "message": "POST:/app/games/2/posts/25571/relationships 400 11.250 ms 221.213.146.164 {\"openid\":\"oAl2A0i_3kpAodDyAR53CZFVgX-g\"}",
  "timestamp": "2018-10-19 15:22:30.709"
}
{
  "request": "/wx56b01188481383cb/callback?signature=2688cea83553f2cb6f9c2b76eb3381108cca0f21&timestamp=1539934693&nonce=1210898241&openid=oAl2A0pnnLJEtF1fb1rNr69Zsa6s&encrypt_type=aes&msg_signature=54a361c14e8f73a81ca8189189b42b78c0ea4c5f",
  "agent": "\"Mozilla/4.0\"",
  "auth": "-",
  "ident": "-",
  "verb": "POST",
  "referrer": "\"-\"",
  "response": "200",
  "bytes": "7",
  "clientip": "140.207.54.75",
  "httpversion": "1.1",
  "timestamp": "19/Oct/2018:15:38:14 +0800"
}

Data in Kibana is NOT CHANGED; the documents still only contain the raw message field, none of the grok fields:

{
  "_index": "filebeat-6.4.2-2018.10.19",
  "_type": "doc",
  "_id": "Ey5Fi2YB_U2Lw1whTETO",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2018-10-19T07:41:04.517Z",
    "source": "/var/lib/docker/containers/64e370d4ad3278ca5a3c7f790b81d17aa509bd1b5b54c31c942ed74e6b51f855/64e370d4ad3278ca5a3c7f790b81d17aa509bd1b5b54c31c942ed74e6b51f855-json.log",
    "offset": 26716807,
    "stream": "stdout",
    "prospector": {
      "type": "docker"
    },
    "input": {
      "type": "docker"
    },
    "beat": {
      "name": "iZbp13wzrvhq9vui56o0jqZ",
      "hostname": "iZbp13wzrvhq9vui56o0jqZ",
      "version": "6.4.2"
    },
    "host": {
      "name": "iZbp13wzrvhq9vui56o0jqZ"
    },
    "message": "[2018-10-19 15:41:04.515] [INFO] [/build/common/schedule-job-manager.js:49] [master] [定时job:CHECK_DIST 完成]",
    "docker": {
      "container": {
        "image": "parsec-tech/game-0008",
        "name": "game0008_game-0008_1",
        "id": "64e370d4ad3278ca5a3c7f790b81d17aa509bd1b5b54c31c942ed74e6b51f855",
        "labels": {
          "com": {
            "docker": {
              "compose": {
                "project": "game0008",
                "service": "game-0008",
                "version": "1.20.0-rc2",
                "config-hash": "f2001599317358200942f3a5511a92106cba9b45b9ce811c282c6316b755c37d",
                "container-number": "1",
                "oneoff": "False"
              }
            }
          }
        }
      }
    }
  },
  "fields": {
    "@timestamp": [
      "2018-10-19T07:41:04.517Z"
    ]
  },
  "highlight": {
    "docker.container.image": [
      "@kibana-highlighted-field@parsec-tech/game-0008@/kibana-highlighted-field@"
    ]
  },
  "sort": [
    1539934864517
  ]
}
{
  "_index": "filebeat-6.4.2-2018.10.19",
  "_type": "doc",
  "_id": "Hy5Fi2YB_U2Lw1whYkQ2",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2018-10-19T07:41:10.524Z",
    "prospector": {
      "type": "docker"
    },
    "host": {
      "name": "iZbp13wzrvhq9vui56o0jqZ"
    },
    "beat": {
      "version": "6.4.2",
      "name": "iZbp13wzrvhq9vui56o0jqZ",
      "hostname": "iZbp13wzrvhq9vui56o0jqZ"
    },
    "source": "/var/lib/docker/containers/3cdc915d9e8cd82a2bac66523c5902214048c8821f5b7d47834cec74a3b0af0b/3cdc915d9e8cd82a2bac66523c5902214048c8821f5b7d47834cec74a3b0af0b-json.log",
    "offset": 17352916,
    "message": "58.251.80.52 - - [19/Oct/2018:15:41:10 +0800] \"POST /wxa040c2edf31a93d0/callback?signature=ba57cc48819d635a8e1a6c34183e198541c16b57&timestamp=1539934870&nonce=1100722667&openid=o0SUiwsUpGtuvZlAy9aJtnnbH_ac&encrypt_type=aes&msg_signature=49f9eef49a3bb36a4da75b3c7c40b8364ad54493 HTTP/1.1\" 200 7 \"-\" \"Mozilla/4.0\" \"-\"",
    "input": {
      "type": "docker"
    },
    "stream": "stdout",
    "docker": {
      "container": {
        "id": "3cdc915d9e8cd82a2bac66523c5902214048c8821f5b7d47834cec74a3b0af0b",
        "image": "nginx",
        "name": "nginx"
      }
    }
  },
  "fields": {
    "@timestamp": [
      "2018-10-19T07:41:10.524Z"
    ]
  },
  "highlight": {
    "docker.container.image": [
      "@kibana-highlighted-field@nginx@/kibana-highlighted-field@"
    ]
  },
  "sort": [
    1539934870524
  ]
}

My guess would be that you are sending the data directly to Elasticsearch and not through Logstash. Check your Filebeat config to see that only the Logstash output is enabled.
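For example, something like this in filebeat.yml (the host is a placeholder), with the Logstash output enabled and the Elasticsearch output commented out:

output.logstash:
  hosts: ["your-logstash-host:5044"]

#output.elasticsearch:
#  hosts: ["localhost:9200"]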

Yes, it wasn't sending to Logstash.

I changed it, but now no data is getting through.

My Logstash input:

input {
	tcp {
		port => 5044
	}
}

And my filebeat.yml:
###################### Filebeat Configuration Example #########################

# This file is an example configuration file highlighting only the most common
# options. The filebeat.reference.yml file from the same directory contains all the
# supported options with more comments. You can use it as a reference.
#
# You can find the full configuration reference here:
# https://www.elastic.co/guide/en/beats/filebeat/index.html

# For more available modules and options, please see the filebeat.reference.yml sample
# configuration file.

#=========================== Filebeat inputs =============================

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.
- type: docker
  containers.ids: '*' # all containers
  processors:
    - add_docker_metadata: ~ # enrich events with Docker container metadata


#============================= Filebeat modules ===============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

#==================== Elasticsearch template setting ==========================

setup.template.settings:
  index.number_of_shards: 3
  #index.codec: best_compression
  #_source.enabled: false

#============================== Kibana =====================================

# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:

  # Kibana Host
  # Scheme and port can be left out and will be set to the default (http and 5601)
  # In case you specify and additional path, the scheme is required: http://localhost:5601/path
  # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
  #host: "localhost:5601"

#================================ Outputs =====================================

# Configure what output to use when sending the data collected by the beat.

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["<HIDDEN>:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"

2018-10-19T15:51:25.326+0800	INFO	pipeline/output.go:95	Connecting to backoff(async(tcp://<HIDDEN>:5044))
2018-10-19T15:51:25.371+0800	INFO	pipeline/output.go:105	Connection to backoff(async(tcp://<HIDDEN>:5044)) established
2018-10-19T15:51:34.328+0800	INFO	log/harvester.go:251	Harvester started for file: /var/lib/docker/containers/9205bfe2f780797735f5b22226f41fffeeb04c820845831b133109b577ed5d1b/9205bfe2f780797735f5b22226f41fffeeb04c820845831b133109b577ed5d1b-json.log
2018-10-19T15:51:34.328+0800	INFO	log/harvester.go:251	Harvester started for file: /var/lib/docker/containers/6b526152fde48fb9203e9b03125c77364c49ebf3b264c199e86b4411f88d2866/6b526152fde48fb9203e9b03125c77364c49ebf3b264c199e86b4411f88d2866-json.log
2018-10-19T15:51:54.186+0800	INFO	[monitoring]	log/log.go:141	Non-zero metrics in the last 30s	{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":20,"time":{"ms":20}},"total":{"ticks":80,"time":{"ms":80},"value":80},"user":{"ticks":60,"time":{"ms":60}}},"info":{"ephemeral_id":"7d61af71-74ea-4e6c-bdf1-f094901e4aff","uptime":{"ms":30016}},"memstats":{"gc_next":6023296,"memory_alloc":3167288,"memory_total":13328680,"rss":23687168}},"filebeat":{"events":{"active":83,"added":120,"done":37},"harvester":{"open_files":5,"running":5,"started":5}},"libbeat":{"config":{"module":{"running":0},"reloads":1},"output":{"events":{"active":35,"batches":3,"total":35},"type":"logstash","write":{"bytes":11994}},"pipeline":{"clients":1,"events":{"active":81,"filtered":39,"published":81,"retry":31,"total":120}}},"registrar":{"states":{"current":34,"update":37},"writes":{"success":37,"total":37}},"system":{"cpu":{"cores":2},"load":{"1":0.13,"15":0.16,"5":0.23,"norm":{"1":0.065,"15":0.08,"5":0.115}}}}}}
2018-10-19T15:51:55.374+0800	ERROR	logstash/async.go:256	Failed to publish events caused by: read tcp <HIDDEN>:47536-><HIDDEN>:5044: i/o timeout
2018-10-19T15:51:55.374+0800	ERROR	logstash/async.go:256	Failed to publish events caused by: read tcp <HIDDEN>:47536-><HIDDEN>:5044: i/o timeout
2018-10-19T15:51:55.374+0800	ERROR	logstash/async.go:256	Failed to publish events caused by: read tcp <HIDDEN>:47536-><HIDDEN>:5044: i/o timeout
2018-10-19T15:51:55.376+0800	INFO	[publish]	pipeline/retry.go:166	retryer: send wait signal to consumer
2018-10-19T15:51:55.376+0800	INFO	[publish]	pipeline/retry.go:168	  done
2018-10-19T15:51:55.376+0800	ERROR	logstash/async.go:256	Failed to publish events caused by: client is not connected
2018-10-19T15:51:56.377+0800	ERROR	pipeline/output.go:121	Failed to publish events: client is not connected

Do you have any firewall or network issue preventing connectivity?

I can connect to the port with telnet, so it doesn't look like a firewall issue.

Ah, you need to use the beats input plugin, not the tcp plugin.
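Something like this, keeping your port:

input {
	beats {
		port => 5044
	}
}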

I found this log in Logstash:

Received an event that has a different character encoding than you configured.
... :expected_charset=>"UTF-8"}

I changed tcp to beats, and then Logstash logged:

[2018-10-19T08:29:55,850][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2018.10.19", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x6c0a5f5f>], :response=>{"index"=>{"_index"=>"logstash-2018.10.19", "_type"=>"doc", "_id"=>"yi5xi2YB_U2Lw1wh_2YH", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [host]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:681"}}}}}
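(That mapper_parsing_exception usually means the existing logstash-* index already has host mapped as a string, while Beats 6.3+ sends host as an object, so the two collide. A minimal workaround, assuming the Beats host metadata isn't needed, is to drop the field before output:

filter {
	mutate {
		remove_field => [ "host" ]   # avoid the object-vs-string mapping conflict on "host"
	}
}

Otherwise, start a new index or adjust the mapping so host can be indexed as an object.)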

Yes, it's working now. I was just missing the output config part.
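Something along these lines, as a minimal sketch (the host is a placeholder):

output {
	elasticsearch {
		hosts => ["localhost:9200"]
	}
}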

Thanks for your help!
