Date and time in Kibana / ES

Hi all

Maybe I am being really stupid, but the logs showing up seem to be stored/passed with the wrong timestamp. Let me explain:

  1. How I noticed this - when building a search filter I noticed there were no records within a 24-hour period. Looking at the ES pipeline I could see the records coming in thick and fast, so they were definitely ending up in ES.

What I did to try to troubleshoot:

  1. Checked my pipeline config:

    #input { stdin { } }

    filter {
      if [type] == "suricataIDPS" {
        json {
          source => "message"
        }
        date {
          match => [ "timestamp", "ISO8601" ]
        }
      }
    }

To me (and as I said, I could be wrong) that looks correct; the time zone I am looking for is GMT+2.

I then looked at the ES manual and (again to me) this looks like the correct format.
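As I understand it (and this is the bit I may well have wrong), the date filter should behave roughly as sketched below - the commented-out timezone option and the Africa/Johannesburg value are only my guess at what I would add if the source timestamp carried no offset:

    date {
      # Parses the Suricata "timestamp" field, e.g. "2018-09-19T01:14:02.904248+0200",
      # and writes the result into @timestamp, which is always stored in UTC.
      # Kibana is then supposed to convert @timestamp back to the display timezone.
      match => [ "timestamp", "ISO8601" ]
      # Only relevant when the source string has no offset; Suricata's already carries +0200.
      # timezone => "Africa/Johannesburg"
    }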

I looked at the timezone on the firewall (which generates the data) and it is correct - GMT+2. I looked at the server running the ELK stack and the timezone is correct - GMT+2. I checked my client - GMT+2. I even changed the date/time setting in the advanced tab to not use the browser's timezone and to use the correct specified timezone instead.

At this point I wanted to ask for help on what I could be doing wrong, so that I can get some guidance on where to look, read and learn, or how to adjust the filter above.

Just for clarity - all the records are streaming in nicely, it's just the @timestamp that is wrong.

The version of ELK is the latest downloadable version.

Is this a Kibana-related question or would it make more sense in Logstash/Elasticsearch?

If you're using a relative time range in Kibana (e.g. "Last 24 hours") then the timezone shouldn't matter; it's only when you use an absolute range that the timezone affects the time range.

Going to look at that (time ranges). Could I be so bold as to ask if there is a URL where I can educate myself, or could you just point me in the right direction to check?

You're right, it could be an ES or LS post, I just wasn't sure.

Sure, here ya go:

https://www.elastic.co/guide/en/kibana/current/set-time-filter.html

Thank you, checked it out and it wasn't that :frowning:

Okay, sounds like it's a problem with your Logstash/ES setup, I will go ahead and switch the labels if that's okay.

Please show an example document that ended up in Elasticsearch. Copy/paste the raw JSON from the JSON tab in Kibana (visible when you expand a document in the Discover view).

Sure thing - just removed some IP info etc

{
  "_index": "logstash-2018.09.18",
  "_type": "doc",
  "_id": "wFsz82UBERgXOhVDd-6V",
  "_version": 1,
  "_score": null,
  "_source": {
    "event_type": "alert",
    "input": {
      "type": "log"
    },
    "flow_id": 326428362774953,
    "in_iface": "xxxxxxxx",
    "stream": 0,
    "tags": [
      "SuricataIDPS",
      "JSON",
      "beats_input_codec_plain_applied",
      "ET-Sig"
    ],
    "source": "/var/xx/xx/xxxxx/eve.json",
    "src_port": xxxxxxx,
    "offset": 1926251246,
    "proto": "UDP",
    "beat": {
      "hostname": "xxxxxxxxxxxxxxxxxxxxxxs",
      "version": "6.3.2",
      "name": "xxxxxxxxxxxxxxxxxx"
    },
    "payload": "xxxxxxxxxxxxxx",
    "prospector": {
      "type": "log"
    },
    "@version": "1",
    "@timestamp": "2018-09-18T23:14:02.904Z",
    "message": "{\"timestamp\":\"2018-09-19T01:14:02.904248+0200\",\"flow_id\":326428362774953,\"in_iface\":\"xxxx\",\"event_type\":\"alert\",\"src_ip\":\"xxxxxxxxxxxxx\",\"src_port\":xxx,\"dest_ip\":\"xxxxxxxxx\",\"dest_port\":xxxx,\"proto\":\"UDP\",\"alert\":{\"action\":\"allowed\",\"gid\":1,\"signature_id\":2016149,\"rev\":2,\"signature\":\"ET INFO Session Traversal Utilities for NAT (STUN Binding Request)\",\"category\":\"Attempted User Privilege Gain\",\"severity\":1},\"app_proto\":\"failed\",\"payload\":\"xxxxxxx\",\"payload_printable\":\"xxxxxxxxxxxx\",\"stream\":0,\"packet\":\"xxxxxxxxxxx\",\"packet_info\":{\"linktype\":0}}",
    "alert": {
      "category": "Attempted User Privilege Gain",
      "signature_id": 2016149,
      "gid": 1,
      "severity": 1,
      "action": "allowed",
      "rev": 2,
      "signature": "ET INFO Session Traversal Utilities for NAT (STUN Binding Request)"
    },
    "dest_FQDN": "ec2-54-172-47-69.compute-1.amazonaws.com",
    "payload_printable": "xxxxxxxxxxxxxxxxxxxxx",
    "dest_port_serviceName": "stun",
    "ids_rule_type": "Emerging Threats",
    "geoip": {
      "country_code2": "xx",
      "ip": "xxxxxxxxxxxxxxx",
      "country_code3": "xx",
      "longitude": xx,
      "timezone": "Africa/Johannesburg",
      "continent_code": "AF",
      "country_name": "South Africa",
      "region_name": "xxxxxxxxxxxxxxxxxxx",
      "location": {
        "lon": 27.9667,
        "lat": -26.05
      },
      "city_name": "xxxxxxxxxxxxxxxxxx",
      "region_code": "xx",
      "postal_code": "xx",
      "latitude": -xx
    },
    "src_FQDN": "xxxxxxxxxxxxxxxxxxxxxxx",
    "type": "xxxxxxxxxxxxxxxxxxxxxxxxxx",
    "packet": "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx",
    "packet_info": {
      "linktype": 0
    },
    "dest_port": 3478,
    "src_ip": "xxxxxxxxxxx",
    "app_proto": "failed",
    "host": {
      "name": "xxxxxxxxxx"
    },
    "dest_ip": "xxxxxxxxxxxxxx",
    "Signature_Info": "http://doc.emergingthreats.net/bin/view/Main/2016149",
    "timestamp": "2018-09-19T01:14:02.904248+0200"
  },
  "fields": {
    "@timestamp": [
      "2018-09-18T23:14:02.904Z"
    ],
    "timestamp": [
      "2018-09-18T23:14:02.904Z"
    ]
  },
  "highlight": {
    "event_type": [
      "@kibana-highlighted-field@alert@/kibana-highlighted-field@"
    ],
    "type": [
      "@kibana-highlighted-field@suricataIDPS@/kibana-highlighted-field@"
    ],
    "alert.category.keyword": [
      "@kibana-highlighted-field@Attempted User Privilege Gain@/kibana-highlighted-field@"
    ]
  },
  "sort": [
    1537312442904
  ]
}

The @timestamp field is correct. What problem are you perceiving?
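To spell it out, here are the two fields from the document you pasted side by side (both values are copied straight from the JSON above):

    "timestamp":  "2018-09-19T01:14:02.904248+0200"   <- local time as written by Suricata
    "@timestamp": "2018-09-18T23:14:02.904Z"           <- the same instant, expressed in UTC

01:14 at GMT+2 is 23:14 UTC, so the date filter parsed the offset correctly, and Kibana should add the two hours back when it renders the value in your timezone.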

My perception is that the logs on the firewall, if I look at the raw log file, show the correct timestamp, and the timezone is correct on my server running the ELK stack. However, the logs are close to 8 hours late on the dashboard.

What do I mean by late: say the log says that an event occurred at 1pm; when I look at the visualization on the dashboard it shows 9pm.

The environment is simple: Firewall --> ELK server. Both have the correct timezones, and if I SSH in and check the time they are both in sync.

Seems like Kibana has the wrong idea of your current timezone then. Kibana has a timezone setting somewhere, have you checked what it's set to? The default is the browser's timezone.
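If I remember right it's the dateFormat:tz setting under Management > Advanced Settings; the default is "Browser", and you can set it to a fixed timezone (e.g. Africa/Johannesburg) instead.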

Checked everything, even set the Kibana timezone in the advanced settings to my specific timezone rather than the "Browser" option, and still the timing is incorrect.

Any other ideas?

Just to get back to this topic and close it out. I looked again at the logs being filtered on the server running Filebeat - it ended up being the logging service that was doing the logging for Sep on that date.

Apologies, I did check that before, obviously not well enough. We can close this out now.
