My logstash instance reports that it's running fine and connecting to ES. I can also confirm that the ES endpoint is correct and that my AWS IAM role grants access, based on the response to a curl GET. Meanwhile, other logstash instances on different machines are successfully sending data to ES, and it shows up in Kibana.
But nothing shows up in Kibana from my instance.
Any idea what might be going on?
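For context, this is the sort of check I used to confirm access (MY_ES_ENDPOINT is a placeholder for my actual AWS ES domain):

curl -s 'https://MY_ES_ENDPOINT/_cat/indices?v'   # lists indices with doc counts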
One thing I'm unclear on is the structure of the data I'm sending to Elasticsearch. Kibana seems to want JSON documents in an index whose name matches "logstash-*" -- is that index name set automatically by logstash? The debug logs don't make it clear what the final document will look like, or which index it gets sent to.
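My understanding is that the elasticsearch output plugin defaults to a daily index named logstash-%{+YYYY.MM.dd}, which is what Kibana's logstash-* pattern would match -- please correct me if that's wrong. To see the template the cluster actually has installed (the one the log below says it found), I assume I can ask ES directly:

curl -s 'https://MY_ES_ENDPOINT/_template/logstash?pretty'   # MY_ES_ENDPOINT is a placeholder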
Excerpted debug log is below:
[2017-08-04T03:08:30,089][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<Java::JavaNet::URI:0x4631dacb>}
[2017-08-04T03:08:30,090][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-08-04T03:08:30,154][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-08-04T03:08:30,165][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2017-08-04T03:08:30,165][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<Java::JavaNet::URI:0x1213f8be>]}
[2017-08-04T03:08:30,656][DEBUG][logstash.inputs.file ] /home/ec2-user/distro/distrib/flask.log: sincedb last value 1538625, cur size 1538625
[2017-08-04T03:08:30,656][DEBUG][logstash.inputs.file ] /home/ec2-user/distro/distrib/flask.log: sincedb: seeking to 1538625
[2017-08-04T03:08:30,742][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-08-04T03:12:00,813][DEBUG][logstash.filters.grok ] Running grok filter {:event=>2017-08-04T03:12:00.805Z ip-10-0-0-87 2017-08-04 03:12:00,133; distrib ; INFO ; For instance 15226, subscription query returned 104 results}
[2017-08-04T03:12:00,815][DEBUG][logstash.filters.grok ] Event now: {:event=>2017-08-04T03:12:00.805Z ip-10-0-0-87 2017-08-04 03:12:00,133; distrib ; INFO ; For instance 15226, subscription query returned 104 results}
[2017-08-04T03:12:00,815][DEBUG][logstash.pipeline ] output received {"event"=>{"msg"=>"For instance 15226, subscription query returned 104 results", "path"=>"/home/ec2-user/distro/distrib/flask.log", "@timestamp"=>2017-08-04T03:12:00.805Z, "module"=>"distrib", "loglevel"=>"INFO", "@version"=>"1", "host"=>"ip-10-0-0-87", "message"=>"2017-08-04 03:12:00,133; distrib ; INFO ; For instance 15226, subscription query returned 104 results", "timestamp"=>"2017-08-04 03:12:00,133"}}
[2017-08-04T03:12:05,630][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
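If it helps with diagnosis: the debug log shows the event reaching the output stage, so presumably a direct search against ES would reveal whether the document was ever indexed. Something like this (placeholder endpoint again, and the query string is just one phrase from the event above):

curl -s 'https://MY_ES_ENDPOINT/logstash-*/_search?q=message:%22subscription%20query%22&size=1&pretty'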