Want to store timestamp in index as date datatype

Hi Experts,

I am using Filebeat to read the SOA logs and want the msg_timestamp value stored in the index as the date datatype. Please help me with how to create the index with msg_timestamp mapped as a date.

Below is my machine description:
OS: Linux 7
ELK: 6.4
Tools: Elasticsearch, Kibana, Logstash, Filebeat

Regards
San

There are quite a few ways to accomplish this. It may just work right out of the box if you've parsed your date field cleanly. By default, Elasticsearch uses dynamic field mapping with date detection enabled. If your date is in a format that is supported by this detection, it'll just work automatically. You can also customize the mapping if you already have an index template. You could also set the data type to date during Filebeat parsing using any number of methods.
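For example, a minimal index template (ES 6.x syntax) that maps msg_timestamp as a date could look like the sketch below. The template name, index pattern, and host are placeholders, so adjust them to your own index names; "doc" is the mapping type that Beats/Logstash use by default in 6.x:

# Sketch only: template name, index pattern, and host are placeholders.
curl -XPUT "http://localhost:9200/_template/soa_msg_timestamp" \
  -H 'Content-Type: application/json' -d'
{
  "index_patterns": ["admin_*", "te_*"],
  "mappings": {
    "doc": {
      "properties": {
        "msg_timestamp": { "type": "date" }
      }
    }
  }
}'

Any new index whose name matches one of the index_patterns will then pick up msg_timestamp as a date automatically.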

Hi Philip,

thanks for the reply.

Can you please tell me how to create an index template and how to use it in the code?

I am just using the below code in Logstash:

input:

input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

filter:

filter {
  grok {
    match => [ "message", "<%{DATA:log_timestamp}> <%{WORD:log_level}> <%{WORD:thread}> <%{HOSTNAME:hostname}> <%{HOSTNAME:servername}> <%{DATA:timer}> <<%{DATA:kernel}>> <> <%{DATA:uuid}> <%{NUMBER:timestamp}> <%{DATA:misc}> <%{DATA:log_number}> <%{DATA:log_message}>" ]
  }
}

output:

output {
  elasticsearch {
    hosts => ["xx.xxx.xx.xxx:9200"]
    sniffing => true
    manage_template => false
    index => "admin_%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
  stdout { codec => rubydebug }
}

Well if you're using Logstash, simply use the date filter and it will do the work for you by putting whatever field you want to use as your timestamp in the special @timestamp field.
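For example, a minimal sketch of the date filter (the pattern below is only an illustration and must be replaced with the exact format of your msg_timestamp value, in Joda-Time syntax):

filter {
  date {
    # Assumed example pattern; replace with the real format of msg_timestamp
    match  => [ "msg_timestamp", "yyyy-MM-dd HH:mm:ss,SSS", "ISO8601" ]
    # @timestamp is the default target, shown explicitly for clarity
    target => "@timestamp"
  }
}

If the parse fails, the event gets a _dateparsefailure tag, which you can spot in the rubydebug output to check your pattern.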

I want to apply the date filter and also the date type in the template, and create the index on date/error type. Can you advise?

Do you have any example code?

I'm not sure I understand your question. Can you elaborate?

Hi Philip,

First, thanks for your help!!!

Flow of ELK : filebeat --> logstash --> Elasticsearch --> Kibana.

Actually, what I am trying to achieve is this: I have SOA logs in one file named "AdminServer.log" (append mode).
Based on date, error type (like update, shutdown, listener error, etc.), and severity, I want to create graphs.

My Logstash conf file is below:

input:

input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

filter:

filter {
  if [message] !~ /^####/ {
    drop { }
  }
  grok {
    match => { "message" => "\#\#\#\#\<%{DATA:msg_timestamp}\> \<%{DATA:msg_severity}\> \<%{DATA:msg_subsystem}\>%{GREEDYDATA:msg_details}" }
  }
  date {
    match => [ "msg_timestamp", "MMMM Do YYYY, HH:mm:ss.SSS" ]
  }
}

output:

output {
  elasticsearch {
    hosts => ["xx.xxx.xx.xxx:9200"]
    sniffing => true
    manage_template => false
    index => "te_%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
  stdout { codec => rubydebug }
}

Please advise what I have to do so that I can create graphs based on date/error type/severity.

Regards,
San
