Timestamp field has a 5-minute delay


(h_foxit) #1

hello, still a newbie here. I installed version 5.5 of the Elastic Stack
and I have a time problem: a 5-minute difference from the local time
I do receive ossec logs
2017 Sep 06 00:05:03 WinEvtLog: Security: AUDIT_SUCCESS(4624)
but the timestamp in kibana is
September 5th 2017, 23:59:57.449

already checked the ossec server time and it is correct ..
my logstash conf contains
date {
  match => [ "timestamp",
    "MMM dd HH:mm:ss",
    "MMM d HH:mm:ss",
    "MMM dd yyyy HH:mm:ss",
    "MMM d yyyy HH:mm:ss"
  ]
  timezone => "Africa/Tunis"
} # end date

any help .. thx


(Magnus Bäck) #2

Your date filter doesn't work because you're not using the right pattern for the string "2017 Sep 06 00:05:03". Try "yyyy MMM dd HH:mm:ss".
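In full, a sketch (keeping the field name and timezone from your config) would be:

```conf
date {
  match => [ "timestamp", "yyyy MMM dd HH:mm:ss" ]
  timezone => "Africa/Tunis"
}
```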


(h_foxit) #3

I did check the output of logstash and the time is correct ..
I tested the output of filebeat to a file .. and the timestamp being sent is the root of the problem

a sample of filebeat output

{"@timestamp":"2017-09-07T09:41:25.310Z","beat":{"hostname":"try","name":"softun-ossim","version":"5.5.1"},"input_type":"log","message":"AV - Alert - "1504777237" --\u003e RID: "18149"; RL: "3"; RG: "windows,"; RC: "Windows User Logoff."; USER: "r.maamri"; SRCIP: "None"; HOSTNAME: "xx"; LOCATION: "hh"; EVENT: "[INIT]2017 Sep 07 10:45:43 [END]"; ","offset":239515423,"source":"/var/ossec/logs/alerts/alerts.log","type":"servers"}


(h_foxit) #4

tried to modify bulk_max_size; same problem
is there a way to delete the @timestamp sent by filebeat in logstash
and add the actual time of the ELK server
thx


(Magnus Bäck) #5

I did check the output of logstash and the time is correct …

What, exactly, do you mean by that?

tried to modify bulk_max_size; same problem

Why would changing that option help?

is there a way to delete the @timestamp sent by filebeat in logstash
and add the actual time of the ELK server

The date filter does exactly that but it's possible that you've misconfigured it.

Are you sending the events to Logstash for processing or is Filebeat sending directly to ES? Show the configuration of both Filebeat and Logstash and post an example event from Elasticsearch (e.g. by copying/pasting from Kibana's JSON tab).


(h_foxit) #6

So I have

Filebeat >> Logstash >> ES >> Kibana

I tested the logstash configuration file without the beats input section
my logstash.conf

input {
  stdin {}

  beats {
    port => 5044
    host => "0.0.0.0"
    ssl => true
    ssl_certificate => "/etc/logstash/logstash.crt"
    ssl_key => "/etc/logstash/logstash.key"
  }
}
filter {
  if [type] == "servers" {
    grok {
      match => { "message" => 'AV - Alert - "%{INT:id}" --> RID: "%{INT:rule.id.sid}"; RL: "%{INT:rule.alert.level}"; RG: "%{GREEDYDATA:rule.group},"; RC: "%$
      add_field => [ "received_at", "%{@timestamp}" ]
    } # end grok

    date {
      match => [ "timestamp",
        "MMM dd HH:mm:ss",
        "MMM d HH:mm:ss",
        "MMM dd yyyy HH:mm:ss",
        "MMM d yyyy HH:mm:ss"
      ]
      timezone => "Africa/Tunis"
    } # end date

    grok {
      match => { "ossec_event" => "%{GREEDYDATA:date_ossec} WinEvtLog: %{GREEDYDATA:event}" }
      # some log messages do not contain date_ossec, so we fall back to the system date
    } # end grok

    mutate {
      rename => ["ossec_event", "message"]
      remove_field => ["event"]
    } # end mutate
  }
}

when I remove the beats {} input and test the filter, there is a timestamp field and it is correct

Why would changing that option help?

maybe the delay is because of the size .. I have been trying to speed up filebeat

The date filter does exactly that but it’s possible that you’ve misconfigured it.

my filebeat.yml

filebeat.prospectors:

#ossim authentication log file
- input_type: log
  paths: ["/var/log/auth.log"]
  document_type: ossimauthentication

#cisco asa log file
- input_type: log
  paths: ["/var/log/cisco-asa.log"]
  document_type: ciscoasa

#cisco switches log file
- input_type: log
  paths: ["/var/log/cisco-router.log"]
  document_type: ciscoswitches

#ossec log file
- input_type: log
  paths: ["/var/ossec/logs/alerts/alerts.log"]
  document_type: servers

##----------------------------- Logstash output --------------------------------
output.logstash:
  hosts: ["192.200.12.130:5044"]
  bulk_max_size: 5144
  #Optional SSL. By default is off.
  #List of root certificates for HTTPS server verifications
  ssl.certificate_authorities: ["/etc/filebeat/logstash.crt"]

  #Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  #Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"
  ignore_older: 2h

output.file:
  path: "/tmp/filebeat"
  filename: filebeat


(h_foxit) #7

restarted the service and the date is completely wrong; even the timestamps in the startup log are wrong

2017/09/07 12:02:23.198830 beat.go:186: INFO Setup Beat: filebeat; Version: 5.5.1
2017/09/07 12:02:23.197902 metrics.go:23: INFO Metrics logging every 30s
2017/09/07 12:02:23.229732 logstash.go:90: INFO Max Retries set to: 3
2017/09/07 12:02:23.229863 outputs.go:108: INFO Activated logstash as output plugin.
2017/09/07 12:02:23.229943 publish.go:295: INFO Publisher name: ossim
2017/09/07 12:02:23.230082 async.go:63: INFO Flush Interval set to: 1s
2017/09/07 12:02:23.230325 async.go:64: INFO Max Bulk Size set to: 2048
Config OK
. ok


(Magnus Bäck) #8

Please answer all questions. Remaining:

  • Post an example event from Elasticsearch (e.g. by copying/pasting from Kibana’s JSON tab).

(h_foxit) #9

{
"_index": "servers-2017.09.07",
"_type": "servers",
"_id": "AV5b5dPz-rWg8jK0iLVj",
"_version": 1,
"_score": null,
"_source": {
"src.ip": "None",
"rule.group": "windows",
"syslog_severity_code": 5,
"offset": 274310178,
"rule.alert.level": "5",
"syslog_facility": "user-level",
"input_type": "log",
"syslog_facility_code": 1,
"rule.description": "Windows Network Logon",
"source": "/var/ossec/logs/alerts/alerts.log",
"message": "2017 Sep 07 11:32:44 WinEvtLog: Security: AUDIT_SUCCESS(4624): Microsoft-Windows-Security-Auditing: An account was successfully logged on. Subject: Security ID: S-1-0-0 Account Name: - Account Domain: - Logon ID: 0x0 Logon Type: 3 New Logon: Security ID: S-1-5-21-3692100098-1010675234-1518838908-4077 Account Name: Mxima-PC$ Account Domain: en.com Logon ID: 0x70839563 Logon GUID: {C00ED0AC-F03F-E6F1-74D9-94EFA21FFA32} Process Information: Process ID: 0x0 Process Name: - Network Information: Workstation Name: Source Network Address: 192.24.189.12 Source Port: 49690 Detailed Authentication Information: Logon Process: Kerberos Authentication Package: Kerberos Transited Services: - Package Name (NTLM only): - Key Length: 0 This event is generated when a logon session is created. It is generated on the computer that was accessed. ",
"type": "servers",
"rule.id.sid": "700003",
"syslog_severity": "notice",
"tags": [
"beats_input_codec_plain_applied"
],
"@timestamp": "2017-09-07T10:27:41.211Z",
"received_at": "2017-09-07T10:27:41.211Z",
"date_ossec": "2017 Sep 07 11:32:44",
"@version": "1",
"beat": {
"hostname": "ossim",
"name": "ossim",
"version": "5.5.1"
},
"host": [
"ossim",
"SA"
],
"location": "(SA) ->WinEvtLog",
"id": "1504780061",
"user": "Mxima$"
},
"fields": {
"received_at": [
1504780061211
],
"@timestamp": [
1504780061211
]
},
"sort": [
1504780061211
]
}


(Magnus Bäck) #10

You're using a date filter to parse a timestamp field, but there is no such field in your events. Perhaps you should parse the date_ossec field instead?
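For example, a sketch matching the sample value "2017 Sep 07 11:32:44" (timezone copied from the earlier config):

```conf
date {
  match => [ "date_ossec", "yyyy MMM dd HH:mm:ss" ]
  timezone => "Africa/Tunis"
}
```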


(h_foxit) #11

yes I could do that, but I have other types of logs that do not contain any date field
that is why I have been trying to fix the timestamp field

Is there any way to edit the timestamp in logstash ?

I will try to reinstall filebeat
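For events with no date field at all, one possible sketch (untested here; the conditional assumes the date_ossec field from the earlier config) is to overwrite @timestamp with the time Logstash processes the event:

```conf
filter {
  if ![date_ossec] {
    ruby {
      # replace @timestamp with the Logstash host's current time
      code => "event.set('@timestamp', LogStash::Timestamp.now)"
    }
  }
}
```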


(h_foxit) #12

I installed filebeat 5.5.2; same problem ..


(h_foxit) #13

solved by installing filebeat 5.5.2 and configuring ossim to use an NTP server .. :slight_smile: thx for your help
the delay is 26 seconds but it's ok ..
thx again


(system) #14

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.