Logstash is sending logs to Kibana, but the index health is yellow and the pre-populated field mappings don't match.

Using the following versions:

Logstash: 7.1.1
Kibana: 7.1.1
Elasticsearch: 7.1.1

**My logstash.conf**

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => {
      "message" => "%{TIMESTAMP_ISO8601:timestamp_string}%{SPACE}%{GREEDYDATA:line}"
    }
  }

  date {
    match => ["timestamp_string", "ISO8601"]
  }

  mutate {
    remove_field => ["message", "timestamp_string"]
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
#    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
#    user => "elastic"
#    password => "changeme"
  }
  stdout {
    codec => rubydebug
  }
}

**My logstash.yml**

node.name: test
path.logs: /root/logstash-7.1.1/LOG

**My filebeat.yml**

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /root/filebeat-7.1.1-linux-x86_64/sample.log
  fields_under_root: true
  fields:
    type: log
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
setup.template.settings:
  index.number_of_shards: 1
setup.kibana:
output.logstash:
  hosts: ["localhost:5044"]
processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
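Before digging into the Logstash side, Filebeat itself can confirm that this config parses and that the Logstash endpoint is reachable (a quick sanity check; run from the extracted Filebeat directory):

```
./filebeat test config -c filebeat.yml
./filebeat test output -c filebeat.yml
```

`test output` should report a successful TCP connection to localhost:5044; if it doesn't, the problem is upstream of Logstash.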

**Logstash logs**

[2019-06-16T18:20:50,028][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.1.1"}
[2019-06-16T18:20:53,027][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-06-16T18:20:53,249][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-06-16T18:20:53,330][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2019-06-16T18:20:53,332][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>7}
[2019-06-16T18:20:53,352][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2019-06-16T18:20:53,372][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-06-16T18:20:53,455][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1, "index.lifecycle.name"=>"logstash-policy", "index.lifecycle.rollover_alias"=>"logstash"}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2019-06-16T18:20:53,562][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"test", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x43ed32fa run>"}
[2019-06-16T18:20:54,173][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2019-06-16T18:20:54,180][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"test"}
[2019-06-16T18:20:54,232][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:test], :non_running_pipelines=>[]}
[2019-06-16T18:20:54,280][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2019-06-16T18:20:54,492][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

**Filebeat logs**

**Troubleshooting**

labuser@elk:/root/filebeat-7.1.1-linux-x86_64/logs$ curl http://localhost:9200/_cat/indices?v
health status index                uuid                   pri rep docs.count docs.deleted store.size pri.store.size
green  open   .kibana_task_manager FZGzMH03R-2s5p1W-TT0pg   1   0          2            0     45.5kb         45.5kb
green  open   .kibana_1            oxNrrLMIS2OkCrTU9aaycg   1   0          6            2     44.2kb         44.2kb
yellow open   logstash             pOBB-7K0RuqYAaRclHjFBg   1   1          1            0     17.7kb         17.7kb
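The yellow status on the `logstash` index is expected on a single node: the index was created with one replica (`rep` = 1), and there is no second node to host the replica shard. On a one-node test setup, dropping the replica count clears it (an illustrative fix, not the only option):

```
curl -X PUT "http://localhost:9200/logstash/_settings" \
  -H 'Content-Type: application/json' \
  -d '{"index": {"number_of_replicas": 0}}'
```

After this, the index should report green in `_cat/indices`.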


labuser@elk:/root/filebeat-7.1.1-linux-x86_64/logs$ curl http://localhost:9200/_node/stats/pipelines
{"error":{"root_cause":[{"type":"index_not_found_exception","reason":"no such index [_node]","resource.type":"index_expression","resource.id":"_node","index_uuid":"_na_","index":"_node"}],"type":"index_not_found_exception","reason":"no such index [_node]","resource.type":"index_expression","resource.id":"_node","index_uuid":"_na_","index":"_node"},"status":404}labuser@elk:/root/filebeat-7.1.1-linux-x86_64/logs$
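That 404 is expected: `_node/stats/pipelines` belongs to the Logstash monitoring API, not Elasticsearch, so Elasticsearch interprets `_node` as an index name. The same path against the Logstash API port (9600, per the startup log above) returns the pipeline stats:

```
curl "http://localhost:9600/_node/stats/pipelines?pretty"
```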

**sample.log**

2008-09-15T11:30:00Z alpha beta
2008-09-15T12:18:00Z charlie r
2008-09-15T13:20:00Z moth ff
2008-09-15T14:40:00Z seven eight
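To verify the grok and date filters against these lines without involving Filebeat at all, the pipeline can be run once from stdin (a sketch; the binary path is an assumption, and the field name `ts` is just a local label):

```
cat sample.log | ./bin/logstash -e '
  filter {
    grok { match => { "message" => "%{TIMESTAMP_ISO8601:ts}%{SPACE}%{GREEDYDATA:line}" } }
    date { match => ["ts", "ISO8601"] }
  }
  output { stdout { codec => rubydebug } }'
```

Each event's `@timestamp` should come out as the 2008 date from the log line, not the ingest time; if it doesn't, the date filter is the thing to fix before looking at mappings.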

**Logstash stdout**

**Elasticsearch debug**
