Kibana not showing Netflow data?

I've been struggling to prove out the ELK stack with a particular input. I even set up all of the latest versions, thinking that was my issue, to no avail. Here's what I have.

On a single box (a Linux VM) I've got Elasticsearch 2.0, Logstash 2.0, and Kibana 4.2.

Logstash is processing NetFlow data from a network switch (NetFlow v5). I've got Logstash sending the NetFlow data to Elasticsearch AND to an output file. I see lots of output going to the file, so it's receiving and processing the data. I set up a new index pattern in Kibana to be able to discover the NetFlow data (logstash-netflow*). Kibana successfully sees the new index and presents me with a time field option (@timestamp), which I select, and voila, there are all of the fields listed. Awesome, I'm about to visualize some flows. But... Kibana can never pull up any data.
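For reference, the Logstash side is basically the stock NetFlow setup, something along these lines (the UDP port, hosts, and file path here are illustrative rather than my exact config):

```
# Sketch of the pipeline: listen for NetFlow v5 over UDP, then write each
# event both to Elasticsearch and to a local file for sanity checking.
input {
  udp {
    port  => 2055                     # port the switch exports flows to (placeholder)
    codec => netflow {
      versions => [5]                 # the switch exports NetFlow v5
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-netflow-%{+YYYY.MM.dd}"   # matches the logstash-netflow* pattern in Kibana
  }
  file {
    path => "/tmp/netflow.log"        # the file I watch to confirm flows are arriving (placeholder)
  }
}
```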

So I verify that the data is in Elasticsearch by going straight to it, and sure enough, I can pull up data:

{"took":7,"timed_out":false,"_shards":{"total":5,"successful":5,"failed":0},"hits":{"total":112913,"max_score":1.0,"hits":[{"_index":"logstash-netflow-2015.11.01","_type":"logs","_id":"AVDDKgNzYsNGmdCX6dMr","_score":1.0,"_source":{"@timestamp":"2015-11-01T08:11:13.294Z","netflow":{"version":5,"flow_seq_num":11708233,"engine_type":0,"engine_id":0,"sampling_algorithm":0,"sampling_interval":0,"flow_records":30,"ipv4_src_addr":"192.168.100.63","ipv4_dst_addr":"192.168.100.242","ipv4_next_hop":"0.0.0.0","input_snmp":5,"output_snmp":5,"in_pkts":1,"in_bytes":60,"first_switched":"2015-11-01T08:11:08.293Z","last_switched":"2015-11-01T08:10:53.293Z","l4_src_port":60374,"l4_dst_port":53,"tcp_flags":0,"protocol":17,"src_tos":0,"src_as":0,"dst_as":0,"src_mask":24,"dst_mask":32},"@version":"1","host":"192.168.100.3"}},{"_index":"logstash-netflow-2015.11.01","_type":"logs","_id":"AVDDKgNzYsNGmdCX6dMt","_score":1.0,"_source":{"@timestamp":"2015-11-01T08:11:13.294Z","netflow":{"version":5,"flow_seq_num":11708233,"engine_type":0,"engine_id":0,"sampling_algorithm":0,"sampling_interval":0,"flow_records":30,"ipv4_src_addr":"74.205.31.227","ipv4_dst_addr":"192.168.100.63","ipv4_next_hop":"0.0.0.0","input_snmp":5,"output_snmp":5,"in_pkts":19,"in_bytes":10019,"first_switched":"2015-11-01T08:11:08.293Z","last_switched":"2015-11-01T08:10:53.293Z","l4_src_port":443,"l4_dst_port":35585,"tcp_flags":0,"protocol":6,"src_tos":0,"src_as":0,"dst_as":0,"src_mask":0,"dst_mask":24},"@version":"1","host":"192.168.100.3"}},{"_index":"logstash-netflow-2015.11.01","_type":"logs","_id":"AVDDKgNzYsNGmdCX6dMv","_score":1.0,"_source":{"@timestamp":"2015-11-01T08:11:13.294Z","netflow":{"version":5,"flow_seq_num":11708233,"engine_type":0,"engine_id":0,"sampling_algorithm":0,"sampling_interval":0,"flow_records":30,"ipv4_src_addr":"192.168.100.242","ipv4_dst_addr":"192.168.100.203","ipv4_next_hop":"0.0.0.0","input_snmp":5,"output_snmp":5,"in_pkts":1,"in_bytes":110,"first_switched":"2015-11-01T08:11:08.293Z","last_switched":"2015-11-01T08:10:53.293Z","l4_src_port":53,"l4_dst_port":32907,"tcp_flags":0,"protocol":17,"src_tos":0,"src_as":0,"dst_as":0,"src_mask":32,"dst_mask":24},"@version":"1","host":"192.168.100.3"}},{"_index":"logstash-netflow-2015.11.01","_type":"logs","_id":"AVDDKgNzYsNGmdCX6dNJ","_score":1.0,"_source":{"@timestamp":"2015-11-01T08:11:13.294Z","netflow":{"version":5,"flow_seq_num":11708293,"engine_type":0,"engine_id":0,"sampling_algorithm":0,"sampling_interval":0,"flow_records":30,"ipv4_src_addr":"192.168.100.135","ipv4_dst_addr":"192.168.100.242","ipv4_next_hop":"0.0.0.0","input_snmp":5,"output_snmp":5,"in_pkts":1,"in_bytes":111,"first_switched":"2015-11-01T08:11:13.293Z","last_switched":"2015-11-01T08:10:58.293Z","l4_src_port":63638,"l4_dst_port":53,"tcp_flags":0,"protocol":17,"src_tos":0,"src_as":0,"dst_as":0,"src_mask":24,"dst_mask":32},"@version":"1","host":"192.168.100.3"}},{"_index":"logstash-netflow-2015.11.01","_type":"logs","_id":"AVDDKgNzYsNGmdCX6dNN","_score":1.0,"source":{"@timestamp":"2015-11-01T08:11:13.294Z","netflow":{"version":5,"flow_seq_num":11708293,"engine_type":0,"engine_id":0,"sampling_algorithm":0,"sampling_interval":0,"flow_records":30,"ipv4_src_addr":"192.168.100.242","ipv4_dst_addr":"192.168.100.71","ipv4_next_hop":"0.0.0.0","input_snmp":5,"output_snmp":5,"in_pkts":1,"in_bytes":80,"first_switched":"2015-11-01T08:11:13.293Z","last_switched":"2015-11-01T08:10:58.293Z","l4_src_port":53,"l4_dst_port":64041,"tcp_flags":0,"protocol":17,"src_tos":0,"src_as":0,"dst

I don't see anything in the Kibana or Elasticsearch logs that would lead me to believe there is an issue.

Can you elaborate on this point? What sort of visualizations are you trying to build?

Correct, Kibana won't display anything. Well, more precisely, I don't see it actively showing data. And another wrinkle to throw in: it sporadically works. It turns out it worked for around an hour in the middle of the night last night, which is undoubtedly why I can see a mapping of the fields and why the index exists at all.

It's still not clear what's not working.

Is Kibana not working at all? Does Discover show anything? Can you search in Discover? Are you building a viz? What type? What happens when you try building a viz?

Yeah, I apologize, because it is just that anomalous. Kibana is in fact working for other indices, just not actively showing live data for this particular one with the NetFlow data. Really strange. I even blew away all the indices and started fresh with the latest stack.
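("Blew away" here just means deleting the indices directly and letting Logstash recreate them, along the lines of the command below, with the default host/port assumed.)

```
# Delete the existing NetFlow indices so Logstash rebuilds them from scratch
curl -XDELETE 'http://localhost:9200/logstash-netflow-*'
```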

So even though I can see Logstash actively writing out flows to a local file (and to Elasticsearch), I can't seem to get Kibana to show that same data "live" (last 15 min with a 10 second refresh).
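For what it's worth, my understanding is that the "last 15 min" view is essentially just a range filter on the time field, so the equivalent check straight against Elasticsearch would be roughly the query below (default host/port assumed). Whatever is going wrong has to be about which documents fall inside that window.

```
# Count documents whose @timestamp falls within the last 15 minutes --
# roughly the window Kibana's "last 15 min" time picker asks Elasticsearch for.
curl -XGET 'http://localhost:9200/logstash-netflow-*/_search' -d '{
  "size": 0,
  "query": {
    "range": {
      "@timestamp": { "gte": "now-15m", "lte": "now" }
    }
  }
}'
```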

OK, I think I figured it out. Apparently the switch was not set up to use NTP, and its timezone was still set to UTC rather than the UTC-6 that would match the rest of the equipment involved. I'm seeing live data right now. This rocks!
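In case anyone else hits this: with the switch's clock hours off, the flow records get indexed with @timestamp values that never fall inside Kibana's live window, even though the data is sitting in Elasticsearch. A quick way to spot that kind of skew is to compare the newest @timestamp in the index with the actual current time, e.g. something like this (default host/port assumed):

```
# Report the newest @timestamp in the NetFlow indices; if it is hours away
# from the current wall-clock time, events will never show up in a
# "last 15 min" window in Kibana.
curl -XGET 'http://localhost:9200/logstash-netflow-*/_search' -d '{
  "size": 0,
  "aggs": {
    "newest_event": { "max": { "field": "@timestamp" } }
  }
}'
```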

Ahh, good old TZs :stuck_out_tongue: