Monitor events

Hi,

I am using Logstash, Elasticsearch and Kibana to continuously monitor events sent by a server via a TCP port.
Every server event is described by: Name, ID, Subsystem where it occurred, Occurrence Time, Clearing Time and State.
State can be Uncleared or Cleared (in the latter case the Clearing Time attribute is also populated):

{Name: Authentication Failure, ID: 87645, Subsystem: SR01, Occurrence Time: 2018-08-08 14:00:00, Clearing Time: -, State: Uncleared}

After some time, every server event (hopefully) ends up in the Cleared state. In that case I want to keep only one entry for that event, with the Cleared state.

Is it possible to do this?

Thanks!

Yes.

When indexing your data via Logstash, set the Elasticsearch _id field to the ID field from the event. That way, future events that map to the same ID will not be written to a new Elasticsearch document but will instead update the existing one. Be sure to configure the Elasticsearch output's action setting to allow for updates.
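A minimal sketch of what that output could look like (the hosts value and index name are placeholders, and ID is the field name from your example; adjust them to your pipeline):

output {
  elasticsearch {
    hosts       => ["localhost:9200"]   # placeholder
    index       => "server-events"      # placeholder index name
    document_id => "%{ID}"              # use the event's ID as the Elasticsearch _id
    action      => "update"             # events with the same ID update the existing document
  }
}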

Hi,

Thank you for your answer! After implementing this with action => "update" I receive a document_missing_exception error like the one below.

  • Configuration:

input {
  file {
    type => "json"
    path => "/home/gabi/PycharmProjects/alarme_logstash/alarm_logfile.json"
    start_position => "beginning"
    ignore_older => 0
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
    document_id => "%{ID}"
    action => "update"
  }
}

  • Error:

[WARN ] 2018-08-14 17:58:56.170 [Ruby-0-Thread-7@[main]>worker1: :1] elasticsearch - Could not index event to Elasticsearch. {:status=>404, :action=>["update", {:_id=>"%{ID}", :_index=>"logstash-2018.08.14", :_type=>"doc", :_routing=>nil, :_retry_on_conflict=>1}, #<LogStash::Event:0xe7278f0>], :response=>{"update"=>{"_index"=>"logstash-2018.08.14", "_type"=>"doc", "_id"=>"%{ID}", "status"=>404, "error"=>{"type"=>"document_missing_exception", "reason"=>"[doc][%{ID}]: document missing", "index_uuid"=>"3-_Mf6gTR7ivVE46dMZ0ag", "shard"=>"1", "index"=>"logstash-2018.08.14"}}}}

My JSON event, as printed by the rubydebug codec, looks like this:

{
    "AlarmName" => "AlarmSlogan",
    "message" => "{\"Occurtime\": \"2018-08-14 17:26:48\", \"Severity\": \"Minor\", \"NeType\": \"EQ3900\", \"State\": \"Unacknowledged Event\", \"AlarmName\": \"AlarmSlogan\", \"ID\": \"649390\", \"NeName\": \"EQ_21\", \"Location\": \"Other details regarding the alarm\"}",
    "@version" => "1",
    "Severity" => "Minor",
    "NeType" => "EQ3900",
    "State" => "Unacknowledged Event",
    "path" => "/home/gabi/PycharmProjects/alarme_logstash/alarm_logfile.json",
    "@timestamp" => 2018-08-14T14:58:54.009Z,
    "host" => "Gabi",
    "type" => "json",
    "Occurtime" => "2018-08-14 17:26:48",
    "NeName" => "EQ_21",
    "Location" => "Other details regarding the alarm",
    "ID" => "649390"
}

I don't know whether I should use doc_as_upsert to avoid updating a document that does not exist yet (for new events), or whether there is an error in the way I am extracting the ID field from the JSON event.

Thanks!

You must set the field _id to the value of the field ID. Elasticsearch tracks documents via _id.
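Regarding the doc_as_upsert question: one way to avoid the document_missing_exception for events that have not been indexed yet is to combine action => "update" with doc_as_upsert => true, so the first occurrence of an ID creates the document and later occurrences (for example the Cleared one) update it. A minimal sketch based on the configuration above (not tested against your data):

output {
  elasticsearch {
    hosts         => ["localhost:9200"]
    index         => "logstash-%{+YYYY.MM.dd}"
    document_id   => "%{ID}"
    action        => "update"
    doc_as_upsert => true    # create the document if it does not exist yet
  }
}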

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.