Discover in Kibana shows different data in Table vs. JSON

Hi there
I use ELK Stack 7.2. When I import data and open Discover in Kibana, the Table view shows the data correctly, but in the JSON view a letter (Š) is missing. So Kibana itself can apparently read the data properly. I push the data to Elasticsearch via Logstash (ISO-8859-1 encoding).

I need to get the correct data into my app. Should I modify my REST request?
Thank you

Misha

Usually UTF-8 encoding should be used for JSON; see https://tools.ietf.org/html/rfc8259#section-8.1

Is it possible for you to change the Logstash encoding from ISO-8859-1 to UTF-8? If so, does that solve the problem?
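
For what it's worth, that would normally just be the charset option on whatever codec the input uses; a minimal sketch, assuming the input decodes with a json codec:

    codec => json { charset => "UTF-8" }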

Hi,
UTF-8 didn't work for Czech diacritics (č, š, ť, ...) :frowning: so we must use ISO-8859-1.
Here is the Logstash config:

input {
    tcp {
        port => 5010
    }
    beats {
        port => 5044
    }
    file {
        mode => "read"                               # read existing files to completion rather than tailing them
        path => ["D:/fulltext-search-input/*.log"]
        codec => json { charset => "ISO-8859-1" }    # decode each line as JSON, transcoding from ISO-8859-1
        start_position => "beginning"
        sincedb_path => "NUL"                        # Windows equivalent of /dev/null; don't persist read positions
        file_completed_action => "delete"
    }
}

output {
    elasticsearch {
        hosts => ["http://HELIOS-TELE:9200"]
        index => "%{[@metadata][beat]}"              # index, document id and action all come from event metadata
        document_id => "%{[@metadata][@id]}"
        action => "%{[@metadata][@action]}"
    }
}

I've added the Logstash tag to this thread to see if the Logstash team has any ideas.

I would suggest removing the json codec, using a plain (or other) codec to do the charset handling, and then parsing the result with a json filter, along the lines of the sketch below.
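
A minimal sketch of what that could look like, adapted from the file input above (untested; "message" is the field where the plain codec puts each decoded line):

input {
    file {
        mode => "read"
        path => ["D:/fulltext-search-input/*.log"]
        codec => plain { charset => "ISO-8859-1" }   # transcode the raw line to UTF-8 here
        start_position => "beginning"
        sincedb_path => "NUL"
        file_completed_action => "delete"
    }
}

filter {
    json {
        source => "message"                          # parse the already-transcoded text as JSON
    }
}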

There is an open issue from jordansissel that says using the charset option on a json codec does not work.
