How to create a graph using the log file timestamp instead of Kibana time

Hi Experts,

Please tell me how to create a graph using the log file timestamp instead of the Kibana time option.

As in the screenshot below, named "Image 1", I want to use the log file timestamp instead of the "Date Histogram" aggregation.

When I choose the Terms keyword in the aggregation, I am not able to choose the "Interval" option (daily, hourly, etc.), as shown in screenshot "Image 2".

Please advise what the options are for making this graph.

Image 1:

Image 2:

It is better to store msg_timestamp in the index with the date datatype:
https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-types.html

The approach in your screenshot would also work, but it will only show 5 values (see Size 5); you could increase that. However, since the field is not a date type, it will be sorted as a string, not as a date.

You should set the correct datatype when the data is inserted into Elasticsearch.
I use templates; a template sets the correct mapping for every matching index:
https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-templates.html
Within the template, just set:

"msg_timestamp": {
  "type": "date"
}

You can create a mapping one time, but a template is applied to every new index of the same type that is created.
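
For illustration, a minimal template sketch (the template name, the index pattern, and the 6.x-style "doc" mapping type are assumptions; adjust them to your own index names and Elasticsearch version):

PUT _template/msg_timestamp_template
{
  "index_patterns": ["admin*"],
  "mappings": {
    "doc": {
      "properties": {
        "msg_timestamp": {
          "type": "date"
        }
      }
    }
  }
}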

Hi Petr,

Thank you for your help.

Can you please share your Logstash filter conf file?

Regards
Sanchit Gupta

Hi Petr,

I need more of your help on this. I tried the way you advised, but it is not working. Can you please help me with the setup?

Regards
San

Have you created the mappings?

Please share your Logstash conf so we can fix it.
I suppose you should use the date filter; see some of the examples below.
If target is not defined, it automatically targets @timestamp.

input {
  file {
    # some file or directory conf
  }
}

filter {

  # Example 1
  date {
    match => ["msg_timestamp", "UNIX_MS"]
    timezone => "Etc/UTC"
    target => "msgtime"
    remove_field => ["timestamp"]
  }

  # Example 2
  date {
    match => ["msg_timestamp", "ISO8601"]
    target => "msg_timestamp"
  }

  # Example 3
  date {
    match => ["msg_timestamp", "YYYYMMddHHmmss"]
    timezone => "Etc/UTC"
  }

  # Example 4
  date {
    match => ["msg_timestamp", "YYYYMMddHHmmss"]
    timezone => "Etc/UTC"
    target => "msg_timestamp"
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "test"
  }

  stdout { }
}

Check the reference guide:
https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html


Hi Petr,

As per your advice, I tried to create a template, shown below.
After creating this template I am facing errors, and even after deleting the template it is still not working for me. Also, when creating the index pattern in the Kibana dashboard, I am still getting the field named "created_at".
Create template command:

PUT _template/template_1
{
  "index_patterns": ["admin*", "filebeat*"],
  "settings": {
	"number_of_shards": 1
  },
  "msg_timestamp": {
   "type": "date"},
  "mappings": {
	"_doc": {
	  "_source": {
		"enabled": false
	  },
	  
	  "properties": {
		"host_name": {
		  "type": "keyword"
		},
		"created_at": {
		  "type": "date",
		  "format": "EEE MMM dd HH:mm:ss Z YYYY"
		}
	  }
	}
  }
}

Delete template command:
DELETE /_template/template_1

Errors:

[2018-12-06T11:07:36,948][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"admin_filebeat-2018.12.06", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0x59ec1d48], :response=>{"index"=>{"_index"=>"admin_filebeat-2018.12.06", "_type"=>"doc", "_id"=>"exIFgmcBLCbRLhh1fER2", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Rejecting mapping update to [admin_filebeat-2018.12.06] as the final mapping would have more than 1 type: [_doc, doc]"}}}}

Hi,

I am just using the code below in Logstash:

input:

input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

filter:

filter {
  if [message] !~ /^####/ {
    drop {
    }
  }

  grok {
    match => { "message" => "\#\#\#\#\<%{DATA:msg_timestamp}\> \<%{DATA:msg_severity}\> \<%{DATA:msg_subsystem}\>%{GREEDYDATA:msg_details}" }
  }
}

output:

output {
  elasticsearch {
    hosts => ["xx.xxx.xx.xxx:9200"]
    sniffing => true
    manage_template => false
    index => "admin_%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
  stdout { codec => rubydebug }
}

Hi Petr,

I tried to add the date filter, but it is still not working for me.

filter {
  if [message] !~ /^####/ {
    drop {
    }
  }

  grok {
    match => { "message" => "\#\#\#\#\<%{DATA:msg_timestamp}\> \<%{DATA:msg_severity}\> \<%{DATA:msg_subsystem}\>%{GREEDYDATA:msg_details}" }
  }
}

filter {
  date {
    match => ["msg_timestamp", "YYYYMMddHHmmss"]
    timezone => "Etc/UTC"
    target => "msg_timestamp"
  }
}

Output of msg_timestamp: Nov 12, 2018 1:42:03 PM CST

Please check and advise.

I doubt that your match is working, as the pattern "YYYYMMddHHmmss" (everything strung together with no separators) does not look like the timestamp value you posted.

Regardless, one issue you have is multiple mapping types, and you have to fix that first. Just delete your index, re-index your data, and you should be fine. Logstash uses "doc" and your mapping uses "_doc".
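
For reference, a hedged sketch of how that template could be aligned (this assumes Elasticsearch 6.x, where an index may only hold one mapping type, and uses the "doc" type name that Logstash and Beats write by default; msg_timestamp is also moved under properties, where the date mapping belongs, rather than sitting at the top level of the template):

PUT _template/template_1
{
  "index_patterns": ["admin*", "filebeat*"],
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "doc": {
      "properties": {
        "host_name": {
          "type": "keyword"
        },
        "msg_timestamp": {
          "type": "date"
        }
      }
    }
  }
}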

Hi Brynan,

I have deleted my index.

I am new to ELK, so I want to understand: where are you talking about "_doc"?

Hello Sanchit. "_doc" is what you posted earlier in your reply; the conflicting doc types were the first error you were receiving, and it is in the error message you posted. Now that you have deleted your index, the next time you ingest data it should come in without errors. The next step is to start working on your date ingestion issue. Petr gave some excellent examples of how dates can be processed to match a pattern and then ingested as a field with the date type. Once you achieve that, you should be able to run the queries that you want.
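
As a rough sketch rather than a definitive fix, a date filter matching the value posted above ("Nov 12, 2018 1:42:03 PM CST") could look like the following; the gsub step and the America/Chicago timezone are assumptions, since the Joda-Time patterns used by the date filter cannot parse zone abbreviations such as "CST":

filter {
  # Assumption: strip the trailing " CST" because the Joda-Time patterns used
  # by the date filter cannot parse zone-name abbreviations; the zone is then
  # supplied explicitly below.
  mutate {
    gsub => ["msg_timestamp", " CST$", ""]
  }

  # Parse "Nov 12, 2018 1:42:03 PM" and keep the result in msg_timestamp so
  # the field can be mapped as a date in Elasticsearch.
  date {
    match => ["msg_timestamp", "MMM dd, yyyy h:mm:ss a", "MMM d, yyyy h:mm:ss a"]
    timezone => "America/Chicago"
    target => "msg_timestamp"
  }
}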
