Change @timestamp to time from actual log file

Hello everybody,

I'm trying to send log files from an Ubuntu client to my ES server.
That seems to work fine, although I cannot seem to get the time field right. (I want the timestamp from the log file, not the time the event was processed by ES!)

I would also like to know how to exclude fields such as "host" and "path".

I have made a .conf file and a matching pattern file, which I will post here, as well as a screenshot of the output shown in Kibana.

Thanks in advance


input {
	file {
		path => "/home/lasse/Skrivebord/3asa5525/messages-2015-10-30"
		exclude => "*.gz"
		# For live data it is better to start reading at "end" (e.g. streaming).
		start_position => "beginning"
	}
}
filter {
	if [type] == "CISCO_ASA_FIREWALL" {
		grok {
			patterns_dir => "/home/lasse/Skrivebord/patterns/CISCO_ASA_PATTERN.txt"
			match => ["message", "%{CISCO_TAGGED} %{GREEDYDATA:cisco_message}"]
		}

		syslog_pri { }	# what is this, and why do I need it?

		date {
			match => ["timestamp",
				"MMM dd HH:mm:ss",
				"MMM d HH:mm:ss",
				"MMM dd yyyy HH:mm:ss",
				"MMMM d yyyy HH:mm:ss"]
			target => "timestamp"
		}

		if "_grokparsefailure" not in [tags] {
			mutate {
				rename => ["cisco_message", "message"]
				remove_field => ["timestamp"]
			}
	grok {
		patterns_dir => "/home/lasse/Skrivebord/patterns/CISCO_ASA_PATTERN.txt"
		match => [
			"message" , "%{CISCOASA106021}",
			"message" , "%{CISCOASA106001}",
			"message" , "%{CISCOASA106006_106007_106010}",
			"message" , "%{CISCOASA106017}",
			"message" , "%{CISCOASA106020}",
			"message" , "%{CISCOASA106014}",
			"message" , "%{CISCOASA305006}",
			"message" , "%{CISCOASA313001_313004_313008}",
			"message" , "%{CISCOASA710003_1}",
			"message" , "%{CISCOASA710003}",
			"message" , "%{CISCOASA746005}",
			"message" , "%{CISCOASA746006}",
			"message" , "%{CISCOASA106023}",
			"message" , "%{CISCOASA313004}",
			"message" , "%{CISCOASA313005}",
			"message" , "%{CISCOASA313009}",
			"message" , "%{CISCOASA338004_338008}",
			"message" , "%{CISCOASA410001}",
			"message" , "%{CISCOASA400028}",
			"message" , "%{CISCOASA400037}",
			"message" , "%{CISCOASA402117}",
			"message" , "%{CISCOASA402119}",
			"message" , "%{CISCOASA405104}",
			"message" , "%{CISCOASA419001}",
			"message" , "%{CISCOASA419002}",
			"message" , "%{CISCOASA500004}",
			"message" , "%{CISCOASA507003}",
			"message" , "%{CISCOASA733100}",
			"message" , "%{CISCOASA752010}",
			"message" , "%{CISCOASA752016}",
			"message" , "%{CISCOASA106100}",
			"message" , "%{CISCOASA106101}",
			"message" , "%{CISCOASA111007}",
			"message" , "%{CISCOASA111008}",
			"message" , "%{CISCOASA111010}",
			"message" , "%{CISCOASA305013}",
			"message" , "%{CISCOASA321001}",
			"message" , "%{CISCOASA502103}",
			"message" , "%{CISCOASA713041}",
			"message" , "%{CISCOASA713049}",
			"message" , "%{CISCOASA713050}",
			"message" , "%{CISCOASA713073}",
			"message" , "%{CISCOASA713074}",
			"message" , "%{CISCOASA713075}",
			"message" , "%{CISCOASA713076}",
			"message" , "%{CISCOASA713119}",
			"message" , "%{CISCOASA713120}",
			"message" , "%{CISCOASA713130}",
			"message" , "%{CISCOASA713184}",
			"message" , "%{CISCOASA713228}",
			"message" , "%{CISCOASA713902}",
			"message" , "%{CISCOASA713904}",
			"message" , "%{CISCOASA713257}",
			"message" , "%{CISCOASA725001}",
			"message" , "%{CISCOASA725002}",
			"message" , "%{CISCOASA725003}",
			"message" , "%{CISCOASA752004}",
			"message" , "%{CISCOASA725006}",
			"message" , "%{CISCOASA725007}",
			"message" , "%{CISCOASA110002}",
			"message" , "%{CISCOASA110003}",
			"message" , "%{CISCOASA106015}",
			"message" , "%{CISCOASA106016}",
			"message" , "%{CISCOASA113004}",
			"message" , "%{CISCOASA113005}",
			"message" , "%{CISCOASA113008}",
			"message" , "%{CISCOASA113009}",
			"message" , "%{CISCOASA302010}",
			"message" , "%{CISCOASA302013_302014_302015_302016}",
			"message" , "%{CISCOASA302020_302021}",
			"message" , "%{CISCOASA303002}",
			"message" , "%{CISCOASA305011}",
			"message" , "%{CISCOASA607001}",
			"message" , "%{CISCOASA602101}",
			"message" , "%{CISCOASA602303_602304}",
			"message" , "%{CISCOASA605005}",
			"message" , "%{CISCOASA611101}",
			"message" , "%{CISCOASA611103}",
			"message" , "%{CISCOASA622001}",
			"message" , "%{CISCOASA713172}",
			"message" , "%{CISCOASA713905_1}",
			"message" , "%{CISCOASA713905}",
			"message" , "%{CISCOASA734001}",
			"message" , "%{CISCOASA737006}",
			"message" , "%{CISCOASA737029}",
			"message" , "%{CISCOASA737031}",
			"message" , "%{CISCOASA111009}",
			"message" , "%{CISCOASA609001_609002}",
			"message" , "%{CISCOASA710001_710002_710003_710005_710006}",
			"message" , "%{CISCOASA713236}",
			"message" , "%{CISCOASA713906}",
			"message" , "%{CISCOASA715036_715046_715047_715075}"
			]
		}
		}
	}
}
output {
	stdout { codec => plain }
	file {
		path => "/home/lasse/Skrivebord/ASA_archive_1/%{type}/%{+YYYY-MM}/%{type}-%{+YYYY-MM-dd}.log"
	}
	elasticsearch {
		hosts => [""]
		document_type => "text"
		index => "clientone"
		user => "elastic"
		password => "Infowise"
	}
}

#== Cisco ASA ==
HOSTNAME \b(?:[_0-9A-Za-z][_0-9A-Za-z-]{0,62})(?:\.(?:[0-9A-Za-z][0-9A-Za-z-]{0,62}))*(\.?|\b)
CISCO_TAGGED %{CTIMESTAMP:timestamp}( %{SYSLOGHOST:host})? %{CISCO_TAG:ciscotag}:
CISCO_CLASS [0-9]{3}
CISCO_TAG %{CISCO_STRUC:cisco_facility}-%{INT:cisco_severity}-%{CISCO_STRUC:cisco_mnemonic}|WLC[0-9]+

#== Common Particles ==

CISCO_ASA_ACTION Built|Teardown|Deny|Denied|denied|requested|permitted|received|denied by ACL|discarded|est-allowed|Dropping|dropping|created|deleted|SENDING|RECEIVED|monitored|dropped|terminated|Rejected
CISCO_ASA_REASON AAA failure|Duplicate TCP SYN|TCP Reset-O|Failed to locate egress interface|Invalid transport field|No matching connection|DNS Response|DNS Query|(?:%{WORD}\s*)*
CISCO_ASA_DIRECTION Inbound|inbound|Outbound|outbound
CISCO_ASA_INTERVAL first hit|%{INT}-second interval
CISCO_ASA_XLATE_TYPE static|dynamic
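
For reference, a hypothetical ASA syslog line along the lines of what `%{CISCO_TAGGED}` is meant to match (the host name and connection details below are invented for illustration):

```conf
# Hypothetical sample line:
#   Oct 30 2015 13:37:01 fw01 ASA-6-302013: Built inbound TCP connection ...
#   ^timestamp           ^host ^ciscotag (facility-severity-mnemonic)
# Everything after the colon is captured into cisco_message by the first
# grok filter, then re-parsed by the big CISCOASA* pattern list.
```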


You have a date filter that attempts to parse the timestamp field, but does your event have a timestamp field? And what does it look like? Until you're done debugging this I suggest you don't remove the timestamp field with that mutate filter since you're tampering with the evidence.
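
One way to do that inspection is to temporarily swap the plain stdout output for a rubydebug one, which prints every field of each event so you can see whether a `timestamp` field exists and what it contains (a debugging sketch, not part of the final config):

```conf
output {
  # Temporary debugging output: prints all fields of every event.
  # Swap back to your real outputs once the date filter works.
  stdout { codec => rubydebug }
}
```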

Hi Magnus

I'm not sure what you're saying/asking me :smiley:
The time in the actual log file is under the "message" tag. (see screenshot)

> The time in the actual log file is under the "message" tag. (see screenshot)

I know. And you have a grok filter that parses the message to extract various fields, including timestamp. Then you have a date filter that parses timestamp and stores the result back into timestamp. After that you delete the timestamp field with a mutate filter. That's clearly not a good idea.

On top of that, the field that Kibana uses as the event timestamp is `@timestamp` by default, not `timestamp`. That is, unless you've changed your index pattern configuration in Kibana, it won't matter if you fix your Logstash filters so that they write to the `timestamp` field, because Kibana won't use it.


  • Use @timestamp as your timestamp field. It's possible to change this and use timestamp or whatever, but don't do that until you understand things better.
  • Remove the target => "timestamp" option in your date filter.
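
Putting the two points together, a sketch of the date filter with the `target` option removed, so the parsed time lands in `@timestamp` (same match patterns as the original config):

```conf
date {
  # Without "target", the parsed time is written to @timestamp,
  # which is the field Kibana uses by default.
  match => ["timestamp",
    "MMM dd HH:mm:ss",
    "MMM d HH:mm:ss",
    "MMM dd yyyy HH:mm:ss",
    "MMMM d yyyy HH:mm:ss"]
}
```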

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.