Elasticsearch Queries

I'm trying out some scenarios using ELK. As I'm new to ELK, could you help me put together queries for the needs below?

Scenarios:
1. Find the number of occurrences of a particular string, e.g. ResourceManager.
2. Response time: find the time between two specific strings, e.g. between the first and the last occurrence of ResourceManager.
3. Pattern matching: a few specific lines appearing one after another.
4. Threshold break (e.g. CPU > 80): capture the threshold string (see the query sketch after the example log below).

Example log file - Logstash to Elasticsearch

2017-03-14 12:23:43.477 unknown:0x7f3a7e [ResourceManager] pool general - Queries
2017-03-14 12:23:43.477 unknown:0x7f3a7e [ResourceManager] pool sysquery - Queries
2017-03-14 12:23:43.477 unknown:0x7f3a7e [ResourceManager] pool sysdata - Memory(KB)
2017-03-14 12:23:43.477 unknown:0x7f3a7e [ResourceManager] pool wosdata - Memory(KB)
2017-03-14 12:23:43.477 unknown:0x7f3a7e [ResourceManager] pool tm - Queries
2017-03-14 12:23:43.477 unknown:0x7f3a7e [ResourceManager] pool refresh - Queries
2017-03-14 12:23:43.477 unknown:0x7f3a7e [ResourceManager] pool recovery - Queries
2017-03-14 12:23:43.477 unknown:0x7f3a7e [ResourceManager] pool dbd - Queries
2017-03-14 12:23:43.477 unknown:0x7f3a7e [ResourceManager] pool jvm - Queries
2017-03-14 12:23:43.477 unknown:0x7f3a7e [ResourceManager] pool blobdata - File Handles
2017-03-14 12:23:43.477 unknown:0x7f3a7e [ResourceManager] pool metadata - Memory(KB) - Threshold breakup > 80%
2017-03-14 12:23:43.477 unknown:0x7f3a7e [Init] Dumping out open file descriptors
2017-03-14 12:23:43.477 unknown:0x7f3a7e @node0001: 00000/4273: Open FD 0[[STDIN]] -> /dev/null
2017-03-14 12:23:43.477 unknown:0x7f3a7e @node0001: 00000/4273: Open FD 1[[STDOUT]] -> /data/disks_a/db/dbLog
2017-03-14 12:23:43.477 unknown:0x7f3a7e @node0001: 00000/4273: Open FD 2[[STDERR]] -> /data/disks_a/db/dbLog
2017-03-14 12:23:43.477 unknown:0x7f3a7e @node0001: 00000/4273: Open FD 3[Unknown] -> /proc/160845/fd
2017-03-14 12:23:43.477 unknown:0x7f3a7e @node0001: 00000/4273: Open FD 4[Unknown] -> /opt/vertica/log/adminTools.errors
2017-03-14 12:23:43.478 unknown:0x7f3a7e @node0001: 00000/4273: Open FD 5[Unknown] -> /dev/null
2017-03-14 12:23:43.478 unknown:0x7f3a7e @node0001: 00000/4273: Open FD 6[Unknown] -> /data/disks_a/db/node0001_catalog/startup.log
2017-03-14 12:23:43.478 unknown:0x7f3a7e @node0001: 00000/4273: Open FD 7[Unknown] -> /data/disks_a/db/node0001_catalog/ErrorReport.txt
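
For scenario 4, a phrase query on the threshold text might be one way to capture those lines once they are indexed. A rough sketch only; the message field and the varlog-2017.06.12 index name are assumptions based on the Logstash config further down:

```
curl -XGET 'http://localhost:9200/varlog-2017.06.12/_search?pretty' -d '{
  "query": {
    "match_phrase": {
      "message": "Threshold breakup"
    }
  }
}'
```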

Thank you.

What have you tried so far?

Hi Warkolm,

Here is what I've done so far: loaded the file from Logstash into Elasticsearch.

logstash-ver.conf file:
```
input {
  file {
    path => ["/home/release/ver2.log"]
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
    add_field => [ "received_at", "%{@timestamp}" ]
    add_field => [ "received_from", "%{host}" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "varlog-%{+YYYY.MM.dd}"
    document_type => "system_logs"
  }
  stdout { codec => rubydebug }
}
```

Step 2: run Logstash with that config: `/opt/logstash/bin/logstash -f logstash-ver.conf`
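
To sanity-check that documents actually arrived before querying, listing the matching indices and counting documents can help (the varlog-* pattern comes from the index setting in the config above):

```
curl -XGET 'http://localhost:9200/_cat/indices/varlog-*?v'
curl -XGET 'http://localhost:9200/varlog-*/_count?pretty'
```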

A simple query, just to check the occurrences of the word 'Catalog':

```
curl -XGET 'http://localhost:9200/verticalog-2017.06.12/log/_search?pretty' -d '{
  "query": {
    "function_score": {
      "query": {
        "filtered": {
          "query": {
            "bool": {
              "must": {
                "query_string": {
                  "query": "Catalog*",
                  "fields": [
                    "@version",
                    "host",
                    "message",
                    "path",
                    "tags",
                    "type"
                  ]
                }
              },
              "should": []
            }
          },
          "filter": {
            "bool": {
              "must": []
            }
          }
        }
      },
      "functions": []
    }
  },
  "size": "50"
}'
```
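
As a side note, the filtered query is deprecated in Elasticsearch 2.x and removed in 5.x, and a function_score wrapper with an empty functions list does not change the scoring, so a much more compact form should return the same hits (a sketch only, reusing the index/type names from above):

```
curl -XGET 'http://localhost:9200/verticalog-2017.06.12/log/_search?pretty' -d '{
  "size": 50,
  "query": {
    "query_string": {
      "query": "Catalog*",
      "fields": ["@version", "host", "message", "path", "tags", "type"]
    }
  }
}'
```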

That grok pattern does not match the log, though; you probably need a custom pattern.
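
For the log format above, a rough starting point could be something like the following sketch. The field names log_timestamp, thread and log_message are only illustrative, and the date filter is there so @timestamp reflects the time in the log line rather than the ingest time:

```
filter {
  grok {
    # e.g. "2017-03-14 12:23:43.477 unknown:0x7f3a7e [ResourceManager] pool general - Queries"
    match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{NOTSPACE:thread} %{GREEDYDATA:log_message}" }
  }
  date {
    match => [ "log_timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
  }
}
```

The [ResourceManager] tag could be pulled into its own field with an extra capture, but since not every line carries it, keeping the tail in a single field is the safer default.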

Could you please suggest a query that would give me the results below?
| ID | Message ID | Message | Log Start Time | Log End Time | Count |
|----|------------|-----------------|---------------------|---------------------|-------|
| 1  | 1          | ResourceManager | 2017-03-14 12:23:43 | 2017-03-14 12:23:43 | 11    |

Thanks.

 curl -XGET 'http://localhost:9200/varlog-2017.06.12/log/_count?q=message:ResourceManager'

I'm getting the count with this query, but I'm not able to pull out the other info, like the log start time and log end time, within the same Elasticsearch query.
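
One way to get the count together with the start and end time in a single request is to keep the ResourceManager query and add min/max aggregations on the timestamp: hits.total gives the count, and the two aggregations give the first and last occurrence. A sketch, assuming @timestamp holds the log time (e.g. set via a date filter; with the grok config above it would currently be the ingest time):

```
curl -XGET 'http://localhost:9200/varlog-2017.06.12/log/_search?pretty' -d '{
  "size": 0,
  "query": {
    "query_string": { "query": "message:ResourceManager" }
  },
  "aggs": {
    "log_start_time": { "min": { "field": "@timestamp" } },
    "log_end_time":   { "max": { "field": "@timestamp" } }
  }
}'
```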
