Hi,
Thanks for all the help so far. Now I am facing a weird problem.
These are my logs:
2017-01-03 05:40:50.522 INFO main ---> org.springframework.context.support.PostProcessorRegistrationDelegate$BeanPostProcessorChecker : Bean 'org.springframework.retry.annotation.RetryConfiguration' of type [class org.springframework.retry.annotation.RetryConfiguration$$EnhancerBySpringCGLIB$$88c2216e] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2017-01-03 05:40:50.543 INFO main ---> org.springframework.context.support.PostProcessorRegistrationDelegate$BeanPostProcessorChecker : Bean 'configurationPropertiesRebinderAutoConfiguration' of type [class org.springframework.cloud.autoconfigure.ConfigurationPropertiesRebinderAutoConfiguration$$EnhancerBySpringCGLIB$$af188c46] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
And this is my Logstash filter configuration:
filter {
  grok {
    match => { "message" => "^%{TIMESTAMP_ISO8601:event_time}\s+%{LOGLEVEL:level}\s+%{SYSLOGPROG}\s---\s%{JAVACLASS:class}\s+:\s+%{GREEDYDATA:message}$" }
  }
}
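A quick way to sanity-check a grok expression like this is a throwaway pipeline that reads log lines from stdin and prints the parsed event (a sketch; the pattern is copied verbatim from the filter above):

```
input { stdin { } }

filter {
  grok {
    match => { "message" => "^%{TIMESTAMP_ISO8601:event_time}\s+%{LOGLEVEL:level}\s+%{SYSLOGPROG}\s---\s%{JAVACLASS:class}\s+:\s+%{GREEDYDATA:message}$" }
  }
}

output { stdout { codec => rubydebug } }
```

Run it with bin/logstash -f test.conf, paste one of the log lines, and check whether event_time shows up in the output or the event gets tagged _grokparsefailure.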
Now I want to sort the data on the basis of event_time. But when I try to do the same, it says fielddata is not true. Can somebody help me find the mistake?
Now I want to sort the data on the basis of event_time.
Exactly how are you doing this?
But when I try to do the same, it says fielddata is not true.
Please quote the actual error message in full.
Hi @magnusbaeck,
Query
GET logstash-2017.07.03/_search
{
  "sort": [
    { "event_time": { "order": "desc" } }
  ]
}
Error
{
  "error": {
    "root_cause": [
      {
        "type": "illegal_argument_exception",
        "reason": "Fielddata is disabled on text fields by default. Set fielddata=true on [event_time] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory."
      }
    ],
    "type": "search_phase_execution_exception",
    "reason": "all shards failed",
    "phase": "query",
    "grouped": true,
    "failed_shards": [
      {
        "shard": 0,
        "index": "logstash-2017.07.03",
        "node": "FDpTDtSMQo2YpsSoSrPKGg",
        "reason": {
          "type": "illegal_argument_exception",
          "reason": "Fielddata is disabled on text fields by default. Set fielddata=true on [event_time] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory."
        }
      }
    ],
    "caused_by": {
      "type": "illegal_argument_exception",
      "reason": "Fielddata is disabled on text fields by default. Set fielddata=true on [event_time] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory."
    }
  },
  "status": 400
}
My problem is that I am unable to map event_time as a date datatype so that I can sort on it.
The default field for the event's timestamp is @timestamp. Unless you really want the field to be named event_time you can save yourself some trouble by sticking to the defaults.
It could be that ES doesn't recognize "2017-01-03 05:40:50.522" as a timestamp and therefore mapped the field as text. If you use the date filter you can transform the timestamp into something that ES will recognize as a timestamp and then your query should work just fine. (But note that you'll have to reindex to change the mapping of the event_time field.)
[quote="magnusbaeck, post:4, topic:91636"]
If you use the date filter you can transform the timestamp into something that ES will recognize as a timestamp and then your query should work just fine. (But note that you'll have to reindex to change the mapping of the event_time field.)
[/quote]
Thanks. I am struggling to visualize this part. Could you please give me a small example of how to proceed?
You're having issues understanding the date filter? Its documentation contains a couple of examples.
https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html#plugins-filters-date-match
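For the timestamps in the logs above, a minimal date filter sketch could look like this (field name taken from the grok pattern earlier in this thread; the format letters are Joda-style, so SSS is milliseconds):

```
filter {
  date {
    # Parse the string extracted by the grok filter.
    match => ["event_time", "YYYY-MM-dd HH:mm:ss.SSS"]
    # The parsed result lands in @timestamp by default;
    # add target => "event_time" to overwrite the original field instead.
  }
}
```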
Hi @magnusbaeck, I believe there is some confusion in our communication. I don't want to convert @timestamp or rename it to event_time. Please check my Logstash filter: I want to break the log message into event_time, log level, and other parts.
filter {
  grok {
    match => { "message" => "^%{TIMESTAMP_ISO8601:event_time}\s+%{LOGLEVEL:level}\s+%{SYSLOGPROG}\s---\s%{JAVACLASS:class}\s+:\s+%{GREEDYDATA:message}$" }
  }
}
Now this event_time is coming through as text, but I want it as date. When I tried changing the mapping, the Logstash parser failed. I have no idea how to fix it.
Now this event_time is coming through as text, but I want it as date.
I know.
Use the date filter to parse the event_time field. You can save yourself some trouble by saving the parsed result into the @timestamp field but you can also use the target option to overwrite the existing event_time value with the parsed value.
The string parsed by the date filter will be recognized as a timestamp by Elasticsearch, but the mapping of existing indexes will not change. I suggest you delete your current index(es) and have Logstash recreate them.
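In Kibana's console, dropping the old index and then verifying the new mapping might look like the following sketch (index name as used in this thread; DELETE is destructive, so make sure you can re-ingest the data):

```
DELETE logstash-2017.07.03

GET logstash-2017.07.03/_mapping
```

After Logstash has recreated the index, the timestamp field should appear with "type": "date" in the mapping.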
Thanks @magnusbaeck.
Please find my Logstash config file below:
input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "^%{TIMESTAMP_ISO8601:event_time}\s+%{LOGLEVEL:level}\s+%{SYSLOGPROG}\s---\s%{JAVACLASS:class}\s+:\s+%{GREEDYDATA:message}$" }
  }
  date {
    match => ["event_time", "YYYY-MM-dd HH:mm:ss.SSS", "ISO8601"]
    target => "@timestamp"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
    manage_template => true
    template_name => "logstash*"
  }
  stdout { codec => rubydebug }
}
These are my logs, which also contain multi-line stacktraces:
2017-01-03 05:40:49.681 INFO main --- org.springframework.context.annotation.AnnotationConfigApplicationContext : Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@41d16cc3: startup date [Tue Jan 03 05:40:49 UTC 2017]; root of context hierarchy
2017-01-03 05:40:49.693 INFO main --- com.getsentry.raven.DefaultRavenFactory : Using an HTTP connection to Sentry.
2017-01-03 05:40:49.935 INFO background-preinit --- org.hibernate.validator.internal.util.Version : HV000001: Hibernate Validator 5.2.4.Final
2017-01-03 05:40:50.355 INFO main --- org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor : JSR-330 'javax.inject.Inject' annotation found and supported for autowiring
2017-01-03 05:40:50.522 INFO main --- org.springframework.context.support.PostProcessorRegistrationDelegate$BeanPostProcessorChecker : Bean 'org.springframework.retry.annotation.RetryConfiguration' of type [class org.springframework.retry.annotation.RetryConfiguration$$EnhancerBySpringCGLIB$$88c2216e] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2017-01-03 05:40:50.543 INFO main --- org.springframework.context.support.PostProcessorRegistrationDelegate$BeanPostProcessorChecker : Bean 'configurationPropertiesRebinderAutoConfiguration' of type [class org.springframework.cloud.autoconfigure.ConfigurationPropertiesRebinderAutoConfiguration$$EnhancerBySpringCGLIB$$af188c46] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2017-01-03 05:40:51.430 INFO main --- com.getsentry.raven.connection.AsyncConnection : Gracefully shutdown sentry threads.
2017-01-03 05:40:52.430 WARN main --- com.getsentry.raven.connection.AsyncConnection : Graceful shutdown took too much time, forcing the shutdown.
2017-01-03 05:40:52.430 INFO main --- com.getsentry.raven.connection.AsyncConnection : 5 tasks failed to execute before the shutdown.
2017-01-03 05:40:52.430 INFO main --- com.getsentry.raven.connection.AsyncConnection : Shutdown finished.
2017-01-03 05:40:52.445 INFO main --- com.getsentry.raven.DefaultRavenFactory : Using an HTTP connection to Sentry.
My Kibana query:
GET logstash-2017.07.04/_search
{
  "sort": [
    { "event_time": { "order": "desc" } }
  ]
}
Output: grok parse failure. There is no event_time in it.
You've configured the date filter to save the parsed result in the @timestamp field, yet it's the event_time field you're trying to sort on. That doesn't make sense.
Output: grok parse failure. There is no event_time in it.
What do you mean?
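Since the config above stores the parsed timestamp in @timestamp, sorting on that field should work without any fielddata error; a sketch of the corresponding query:

```
GET logstash-2017.07.04/_search
{
  "sort": [
    { "@timestamp": { "order": "desc" } }
  ]
}
```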
@magnusbaeck,
That was my mistake. I have now checked the mapping and I can see event_time as a date field. Thanks to you.
But for my Kibana query, I am only getting this:
{
  "error": {
    "root_cause": [
      {
        "type": "illegal_argument_exception",
        "reason": "Fielddata is disabled on text fields by default. Set fielddata=true on [event_time] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory. Alternatively use a keyword field instead."
      }
    ],
    "type": "search_phase_execution_exception",
    "reason": "all shards failed",
    "phase": "query",
    "grouped": true,
    "failed_shards": [
      {
        "shard": 0,
        "index": "logstash-2017.07.04",
        "node": "suo9gTyRRxWBZiqGOt3nzg",
        "reason": {
          "type": "illegal_argument_exception",
          "reason": "Fielddata is disabled on text fields by default. Set fielddata=true on [event_time] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory. Alternatively use a keyword field instead."
        }
      }
    ]
  },
  "status": 400
}