This is my current Logstash configuration:
input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
  }
  stdout {
    codec => rubydebug
  }
}
I am able to get logs via Filebeat from my application server.
Use case:
Find all logs containing the text "REST API call", write them to an index in Elasticsearch, and create a date-wise visualization in Kibana.
I'm not sure how to use filter tags.
You can, for example, use a conditional in your filter section to selectively drop events that don't match the desired pattern:
filter {
  if [message] !~ /REST API call/ {
    drop { }
  }
}
https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html
Do I need to write anything inside drop {}?
That was perfect! Thanks.
One more thing: from the filter I got data, and there is a "message" key.
I want to extract each value from the message and pass them to the index in Elasticsearch.
The message is of this format:
[2017-04-26 07:32:10,673] req#502bf179-49a5-4604-aae0-a7a85607ec30~@ClassName #Class-Method-Name INFO [[ACTIVE] ExecuteThread: '13' for queue: 'weblogic.kernel.Default (self-tuning)'](java class qualified name) - Calling REST API ** http-based-rest-call
I want to extract the HTTP call, the user, and the timestamp.
Use a grok filter for that.
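For instance, a grok filter along these lines could pull the timestamp, request id, and REST call out of the sample message above. This is only a sketch: the field names (log_timestamp, request_id, rest_call) are illustrative, and the pattern will need adjusting to your exact message layout.

```
filter {
  grok {
    # TIMESTAMP_ISO8601 matches "2017-04-26 07:32:10,673" (comma fraction included);
    # GREEDYDATA skips the middle of the line we don't care about.
    match => {
      "message" => "\[%{TIMESTAMP_ISO8601:log_timestamp}\] req#%{NOTSPACE:request_id}~@%{GREEDYDATA}Calling REST API \*\* %{NOTSPACE:rest_call}"
    }
  }
}
```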
Does the grok filter need to be given all values or placeholders? Can't I just extract the required values?
I'm not sure what you're asking. You can extract any parts of the string to new fields.
Is extracting just the HTTP request possible?
I said: You can extract any parts of the string to new fields.
That means you can extract the HTTP request.
If you're not familiar with regular expressions, the grok constructor web site might be useful.
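Since grok patterns are not anchored, a pattern that matches only the part of the line you care about is enough; everything else is simply ignored. A minimal sketch (the rest_call field name is illustrative):

```
filter {
  grok {
    # Matches anywhere in the message; only rest_call becomes a new field.
    match => { "message" => "Calling REST API \*\* %{NOTSPACE:rest_call}" }
  }
}
```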
Can you help with this? The timestamp is not getting my log value when using the Logstash date filter. I was able to get the request, but @timestamp is not being overridden with the log date.
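One common cause is a date pattern that doesn't exactly match the extracted field. Assuming a grok filter has already put the log date into a field (here called log_timestamp, an illustrative name), a date filter like this would override @timestamp:

```
filter {
  date {
    # The pattern must match "2017-04-26 07:32:10,673" exactly,
    # including the comma-separated milliseconds.
    match  => ["log_timestamp", "yyyy-MM-dd HH:mm:ss,SSS"]
    target => "@timestamp"
  }
}
```

If the filter fails, the event is tagged with _dateparsefailure, which is the first thing to check in the rubydebug output.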
system (system) closed this topic on May 26, 2017, 9:56am.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.