Log management application

I have an index of logs collected using Logstash. Now I want to classify the fields, for example collect all the exceptions in the field "exception" and store them so I can use them later in my log management Java application. Any advice on how to do that?

You can easily run an aggregation to find out what they are, but what do you mean by "collect these?"
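For example, a terms aggregation over the exception field lists each distinct value and how many log entries carry it. Here is a minimal sketch using the Java high-level REST client (the index pattern `logstash-*`, the host, and the bucket size are assumptions; `exception.keyword` relies on the default Logstash index template, which maps string fields with a keyword sub-field):

```java
import org.apache.http.HttpHost;
import org.elasticsearch.action.search.SearchRequest;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.search.aggregations.AggregationBuilders;
import org.elasticsearch.search.aggregations.bucket.terms.Terms;
import org.elasticsearch.search.builder.SearchSourceBuilder;

public class ExceptionTerms {
    public static void main(String[] args) throws Exception {
        try (RestHighLevelClient client = new RestHighLevelClient(
                RestClient.builder(new HttpHost("localhost", 9200, "http")))) {

            // size(0): skip the hits themselves, we only want the aggregation buckets.
            SearchSourceBuilder source = new SearchSourceBuilder()
                    .size(0)
                    .aggregation(AggregationBuilders.terms("by_exception")
                            .field("exception.keyword") // keyword sub-field from the default Logstash template
                            .size(100));

            SearchResponse response = client.search(
                    new SearchRequest("logstash-*").source(source), RequestOptions.DEFAULT);

            // Each bucket is one distinct exception value with its document count.
            Terms byException = response.getAggregations().get("by_exception");
            for (Terms.Bucket bucket : byException.getBuckets()) {
                System.out.println(bucket.getKeyAsString() + " -> " + bucket.getDocCount());
            }
        }
    }
}
```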

I meant that I already have all my logs in Elasticsearch, as in this example:

{
  "_index" : "logstash-2018.02.19",
  "_type" : "doc",
  "_id" : "daUerWEBZrue2cT4uG7q",
  "_score" : 1.0,
  "_source" : {
    "app" : "core",
    "threadname" : "[DiscoveryClient-CacheRefreshExecutor-0]",
    "exception" : "ConnectTimeout",
    "@version" : "1",
    "host" : "sana-OptiPlex-3020",
    "@timestamp" : "2018-02-19T08:12:04.836Z",
    "LEVEL" : "WARN",
    "type" : "syslog",
    "path" : "/home/sana/Bureau/core/dataAndTranslation-2018-01-17.0.log",
    "message" : "RetryableEurekaHttpClient: Request execution failed with message: org.apache.http.conn.ConnectTimeoutException: Connect to jhipster-registry:8761 timed out",
    "logtime" : "2018-01-17 17:54:37,400"
  }
},

And I want, through a Spring Boot application, to do a vertical split: extract all the values that "exception" can take and save them in a MongoDB collection.
So please, how do I do this aggregation using Spring Boot and Elasticsearch?
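
A minimal end-to-end sketch of that vertical split, assuming Spring Boot 2 with Spring Data MongoDB and an Elasticsearch `RestHighLevelClient` bean configured elsewhere (the service class, the bucket size, and the collection name "exceptions" are all illustrative, not a fixed API):

```java
import org.bson.Document;
import org.elasticsearch.action.search.SearchRequest;
import org.elasticsearch.action.search.SearchResponse;
import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.search.aggregations.AggregationBuilders;
import org.elasticsearch.search.aggregations.bucket.terms.Terms;
import org.elasticsearch.search.builder.SearchSourceBuilder;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.stereotype.Service;

import java.io.IOException;

@Service
public class ExceptionSplitService {

    private final RestHighLevelClient esClient;
    private final MongoTemplate mongoTemplate;

    public ExceptionSplitService(RestHighLevelClient esClient, MongoTemplate mongoTemplate) {
        this.esClient = esClient;
        this.mongoTemplate = mongoTemplate;
    }

    /**
     * Aggregates the distinct values of the "exception" field across all
     * logstash-* indices and stores one document per value, with its count,
     * in a MongoDB collection named "exceptions".
     */
    public void collectExceptions() throws IOException {
        SearchSourceBuilder source = new SearchSourceBuilder()
                .size(0) // only the aggregation buckets are needed, not the hits
                .aggregation(AggregationBuilders.terms("by_exception")
                        .field("exception.keyword")
                        .size(1000));

        SearchResponse response = esClient.search(
                new SearchRequest("logstash-*").source(source), RequestOptions.DEFAULT);

        Terms byException = response.getAggregations().get("by_exception");
        for (Terms.Bucket bucket : byException.getBuckets()) {
            Document doc = new Document("exception", bucket.getKeyAsString())
                    .append("count", bucket.getDocCount());
            mongoTemplate.getCollection("exceptions").insertOne(doc);
        }
    }
}
```

Calling `collectExceptions()` from a scheduled job or REST controller would keep the MongoDB collection in sync; Spring Data Elasticsearch repositories would be an alternative to using the raw client directly.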
