How to map and index a specific JSON object into Elasticsearch

Let's say I'm reading data from a Kafka topic and indexing specific data into a specific Elasticsearch index.

Example:

Logstash is reading messages from a Kafka topic that has two message types (1. LOG, 2. ERROR).

Message 1:

{
  "eid": "LOG",
  "ets": 1528400654594,
  "ver": "3.0",
  "errorInfo": {}
}

Message 2:

{
  "eid": "ERROR",
  "ets": 1528400654594,
  "ver": "3.0",
  "errorInfo": {}
}

Now I need to index these two messages into Elasticsearch using Logstash:
message 1 should go to "LOG-Index" and message 2 should go to "ERROR-Index".

input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["local.telemetry.log"]
    type => "local.telemetry.log"
    codec => "json"
  }
}

output {
  elasticsearch {
    codec => "json"
    hosts => ["http://localhost:9200"]
    index => "%{type}-indexer"
    user => "elastic"
    password => "changeme"
  }
}

Shouldn't that be "%{eid}-index"?

How will we get eid in "%{eid}-index"?

Do we need to use any filter?

Your Kafka messages are JSON, and you are using a json codec on the input, so if your JSON contains

"eid": "LOG",

I would expect your events to have an eid field.
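In plain Ruby terms, a minimal sketch of what the json codec does conceptually: it parses the message body into event fields, so eid becomes available to sprintf references such as "%{eid}-index". (The message body below is just the LOG example from above; this is an illustration, not Logstash's actual implementation.)

```ruby
require 'json'

# Illustrative Kafka message body (same shape as the LOG example above)
message = '{"eid": "LOG", "ets": 1528400654594, "ver": "3.0", "errorInfo": {}}'

# The json codec parses the payload into event fields, roughly like this:
event = JSON.parse(message)

# A sprintf reference such as "%{eid}-index" then resolves from that field:
index_name = "#{event['eid']}-index"
puts index_name  # => LOG-index
```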

Yes, 'eid' will be available. So you mean that when I use the codec, the message should be parsed, right?

And one more clarification: since the eid value is in uppercase, can we create an index name in lowercase without modifying the message?

For example, is something like this possible?

"%{eid}.toLowercase()-index"

If you are OK with changing eid then use

mutate { lowercase => [ "eid" ] }

If not, make a copy and change that.

mutate { copy => { "eid" => "[@metadata][eid]" } }
mutate { lowercase => [ "[@metadata][eid]" ] }

Then you can use

index => "%{[@metadata][eid]}-index"
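Simulating the copy-then-lowercase approach on a plain Ruby hash shows why the original message is left untouched: only the metadata copy (which is never written to Elasticsearch) drives the index name. This is a behavioral sketch with field names following the thread, not Logstash's internals:

```ruby
# Simulating the two mutate filters on a plain hash (a sketch of the
# behavior, not Logstash's actual implementation)
event = { 'eid' => 'LOG' }
metadata = {}

# mutate { copy => { "eid" => "[@metadata][eid]" } }
metadata['eid'] = event['eid'].dup

# mutate { lowercase => [ "[@metadata][eid]" ] }
metadata['eid'] = metadata['eid'].downcase

index_name = "#{metadata['eid']}-index"
puts index_name    # => log-index
puts event['eid']  # => LOG (the original message is unchanged)
```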

Thank you, @Badger

Hi,
In the Logstash output, is it possible to specify the index name along with a mapping object file path?

output {
  elasticsearch {
    codec => "json"
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][eid]}-index"
    user => "elastic"
    password => "changeme"
    document_type => "_doc"
    template => "../config/mappingObj.json"
    template_name => "%{[@metadata][eid]}-*"
  }
}

The mutate filter in Logstash 2.3 doesn't have a copy option.

Does it have add_field? If so, use

mutate { add_field => { "[@metadata][eid]" => "%{eid}" } }

Thanks, I have resolved it with an if/else condition.

And in the Elasticsearch mapping, is it possible to handle a BigDecimal?
Because we are facing this error:

"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [time_field]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"1.56379850611E+12\" is malformed at \".56379850611E+12\""}}}}

Mapping object:

    "time_field": {
      "format": "strict_date_optional_time||epoch_millis",
      "type": "date"
    },

Any idea?

The mapper will not support BigDecimal because the underlying Java class does not. Use a ruby filter to convert it:

    ruby { code => 'event.set("date", BigDecimal.new(event.get("someField")).to_i)' }
    date { match => [ "date", "UNIX_MS" ] }
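For the exact value from the error above, the conversion behaves like this in plain Ruby (BigDecimal() is the non-deprecated constructor in newer Ruby versions; the filter above uses BigDecimal.new, which older Logstash versions ship with):

```ruby
require 'bigdecimal'

# The scientific-notation string that the epoch_millis date mapper rejected
raw = '1.56379850611E+12'

# BigDecimal handles the exponent, and to_i yields plain epoch milliseconds,
# which the epoch_millis mapping format and the date filter's UNIX_MS accept
millis = BigDecimal(raw).to_i
puts millis  # => 1563798506110
```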

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.