Upsert solution with Logstash

Hello,
I have a setup with Filebeat (on a server) -> Logstash (on a server) -> Elasticsearch (as a service).

I have to ingest data from file.log in JSON format.
Each row has a "guid" element which is used for the upsert. Here is the structure of a row generated by the workflow:

{"connectorid":"1600140", **"endtime":"", "starttime":"2022-12-02 09:30:00.000"**, "completed":"false", "type":"scripting-groovy-script", "activityid":"1700283", "elementname":"assignationTache", "writetime":"2022-12-02 09:29:52.165", "loglevel":"INFO", "caseid":"81015", **"guid":"81015-1700283-1600140"**, "notificationtype":"Connector", **"state":"INITIALIZING"**, "parentobjectid":"81015-1700283", "rootcaseid":"81014"}

Another step of the same workflow:

{"connectorid":"1600140", **"endtime":"", "starttime":""**, "completed":"false", "type":"scripting-groovy-script", "activityid":"1700283", "elementname":"assignationTache", "writetime":"2022-12-02 11:00:00.000", "loglevel":"INFO", "caseid":"81015", **"guid":"81015-1700283-1600140"**, "notificationtype":"Connector", **"state":"PROCESS"**, "parentobjectid":"81015-1700283", "rootcaseid":"81014"}

The final step of the workflow:

{"connectorid":"1600140", **"endtime":"2022-12-02 11:30:00.000", "starttime":""**, "completed":"true", "type":"scripting-groovy-script", "activityid":"1700283", "elementname":"Vérifier informations", "writetime":"2022-12-02 11:30:00.000", "loglevel":"INFO", "caseid":"81015", **"guid":"81015-1700283-1600140"**, "notificationtype":"Activity", **"state":"DONE",** "parentobjectid":"81015-1700283", "rootcaseid":"81014"}

Currently I have this configuration in logstash.conf:

input {
  beats {
    port => 5044
    add_field => {
      "[@metadata][target_index]" => "index-bonita-qal-processlog-%{+YYYY-MM-dd}"
    }
  }
}

# filter that parses the incoming data
filter {
  json {
    source => "message"
    target => "fields"
  }
  mutate {
    add_field => { "[@metadata][target_document_id]" => "%{[fields][guid]}" }
  }
}
# output configuration
output {
  elasticsearch {
    hosts => ["<elasticsearch_endpoint>"]
    user => "<username>"
    password => "<password>"
    proxy => "<proxy_endpoint>"
    index => "%{[@metadata][target_index]}"
    document_id => "%{[@metadata][target_document_id]}"
    doc_as_upsert => true
  }
}

This currently works: the document with the same GUID is updated with each new step of the workflow. But I would like to keep the initial "starttime" value, because this value is not stored in the application, and I have no idea how to preserve the initial starttime across each upsert of the same GUID document.
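
Since doc_as_upsert sends the whole event as the partial document, the empty "starttime":"" of a later step overwrites the stored value on every update. One possible direction (a sketch only, untested against this stack) is to switch the output to the scripted update mode of the elasticsearch output plugin (action => "update" with script and scripted_upsert => true) and let a Painless script merge only non-empty fields. This assumes the plugin's default script_var_name, which exposes the event to the script as params.event:

output {
  elasticsearch {
    hosts => ["<elasticsearch_endpoint>"]
    user => "<username>"
    password => "<password>"
    proxy => "<proxy_endpoint>"
    index => "%{[@metadata][target_index]}"
    document_id => "%{[@metadata][target_document_id]}"
    action => "update"
    scripted_upsert => true
    script_type => "inline"
    script_lang => "painless"
    script => '
      // the json filter put the parsed row under [fields]
      if (ctx._source.fields == null) { ctx._source.fields = [:]; }
      for (entry in params.event.fields.entrySet()) {
        // only overwrite with a non-empty value, so the first
        // non-empty starttime (and endtime) is never erased
        if (entry.getValue() != null && entry.getValue() != "") {
          ctx._source.fields[entry.getKey()] = entry.getValue();
        }
      }
    '
  }
}

Because later steps send "starttime" as an empty string, the merge leaves the stored value untouched, and the same rule fills "endtime" only when the final step arrives. Note that this sketch persists only the parsed [fields] object; any other event fields (@timestamp, message, ...) would have to be copied in the script the same way if they are needed in the document.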
