Elasticsearch analyzing and mapping data

  {
    "_index" : "filebeat-7.7.0-2020.06.26",
    "_type" : "_doc",
    "_id" : "-cmj8HIBlEC9388KS7SY",
    "_score" : 1.0,
    "_source" : {
      "message" : "17-02-2020 11:45:55TO    ->  | Brand : TT | Channel :  digitalise | user_id  : 12902 | DirNum : 95350240 | identity ( idType : 2 idNumber : 08263539 ) | country : TN | Msisdn : 92393819 | IMSI : 892160208012526199 Status : 0 Out Code : La demande d'activation de la carte SIM 92393819 est enregistrée avec succès"
    }
  }

How can I parse just the message field and break it out into separate fields, with this structure?
{Timestamp:
Brand:
Channel: }
Thanks a lot.

You can do that with an ingest pipeline, or Logstash.
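For the ingest-pipeline route, a sketch could look like the following. The processor names (dissect, kv, date) and their options are real Elasticsearch ingest processors, but the pipeline name and the exact patterns are untested assumptions against your message format:

```json
PUT _ingest/pipeline/parse-sim-activation
{
  "description": "Sketch: split the raw message into fields",
  "processors": [
    { "dissect": { "field": "message", "pattern": "%{ts}T%{kv_part}" } },
    { "kv": {
        "field": "kv_part",
        "field_split": "\\|",
        "value_split": ":",
        "trim_key": " ",
        "trim_value": " "
    } },
    { "date": { "field": "ts", "formats": ["dd-MM-yyyy HH:mm:ss"] } }
  ]
}
```

Note that `field_split` in the ingest kv processor is a regex, so the pipe character needs escaping, and the timestamp in your sample is day-month-year, not ISO8601.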

Which method is easier? I tried this Logstash filter, but it doesn't work:
filter {
  kv {
    field_split => "|?"
    value_split => ":"
  }
}

The timestamp is not in KV format, so you need more than one filter. Start with, for example, a dissect filter to store the KV list in its own field, then run the kv filter on that field.


So it sounds like this:
filter {
  dissect { mapping => { "message" => "%{[@metadata][ts]}T%{}" } }
  # the timestamp is day-month-year, not ISO8601
  date { match => [ "[@metadata][ts]", "dd-MM-yyyy HH:mm:ss" ] }
  kv {
    field_split => "|?"
    value_split => ":"
    # strip the padding spaces around keys and values
    trim_key => " "
    trim_value => " "
  }
}
Don't hesitate to correct me if you find something wrong.
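To preview which fields that split logic would produce before wiring it into Logstash, the dissect-then-kv behaviour can be simulated in a few lines of Python (this is only a sanity-check sketch of the splitting, not what Logstash executes; the message is shortened):

```python
# Simulate dissect ("%{ts}T%{rest}") followed by a kv split on "|" / ":"
message = ("17-02-2020 11:45:55TO    ->  | Brand : TT"
           " | Channel :  digitalise | user_id  : 12902"
           " | DirNum : 95350240")

# dissect step: everything before the first "T" is the timestamp
ts, _, rest = message.partition("T")

# kv step: split on "|", then split each pair at its first ":"
fields = {}
for pair in rest.split("|"):
    key, sep, value = pair.partition(":")
    if sep:  # skip fragments with no ":" (e.g. the leading "O    ->")
        fields[key.strip()] = value.strip()

print(ts)                 # 17-02-2020 11:45:55
print(fields["Brand"])    # TT
print(fields["Channel"])  # digitalise
```

This also shows why `trim_key`/`trim_value` (here the `.strip()` calls) matter: without them the keys come out as `"Brand "` and `" Channel "` with the padding spaces attached.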


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.