CSV column names UTF-8

I am trying to use the CSV filter in my Logstash pipeline. When I specify column names that contain ü, ä, § or €, the JSON displayed by Kibana shows those characters garbled instead of, for example, "Anzahl Monate (für NN GP abs. €)". Is there any way to make Logstash read my conf file correctly? I saved the conf file as UTF-8, and the column name in question is "Anzahl Monate (für NN GP abs. €)". I tried to find out whether the conf files have to use a particular encoding, but couldn't find anything on that topic. By the way, the CSV content itself is UTF-8 and displays ä, ü, § and € correctly. I am using Logstash 6.0.1; the same problem occurs with 6.2.2.
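
For context, the input side looks roughly like the sketch below. The file path is made up and not from my actual setup; I am only showing it to make explicit where the input charset is pinned down (as far as I know the file input uses the plain codec by default, and its charset already defaults to UTF-8):

input {
  file {
    # hypothetical path, just for illustration
    path => "/data/import/example.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    # plain is the default codec for the file input; charset set to UTF-8 explicitly here
    codec => plain { charset => "UTF-8" }
  }
}

The data coming in through this input already shows ä, ü, § and € correctly, so the encoding problem seems limited to the names defined in the conf file.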

{
...
    "Zähler Nr.": "Zähler Nr.",
    "Anzahl Monate (für NN GP abs. €)": "Anzahl Monate (für NN GP abs. €)",
    "Offhore Umlage abs. €": "Offhore Umlage abs. €",
    "NNE abs. €": "NNE abs. €",
    "§19 Strom NEV abs.": "§19 Strom NEV abs.",
    "@version": "1",
    ...
  },
  "fields": {
    "@timestamp": [
      "2018-03-01T12:38:44.180Z"
    ]
  },
  "sort": [
    1519907924180
  ]
}

Here is the csv filter part of my .conf file.

filter {
  csv {
    columns => [
      "Zähler Nr.",
      "Anzahl Monate (für NN GP abs. €)",
      "Offhore Umlage abs. €",
      "NNE abs. €",
      "§19 Strom NEV abs."
    ]
    separator => ";"
    skip_header => true
    add_field => {
      "Version" => "1"
    }
  }
}

Inside Visual Studio Code everything looks fine. I also tried saving the .conf file with the encoding "ISO 8859-15", but that resulted in the same column names in the JSON in Elasticsearch.
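
As a possible workaround I am considering (untested, just a sketch): since the data itself is correct UTF-8, the column names could be taken from the CSV header instead of from the conf file via autodetect_column_names. This assumes every file starts with a header row, and if I read the docs right the autodetection needs the pipeline to run with a single worker:

filter {
  csv {
    separator => ";"
    # read the column names from the first line of the data instead of listing them in the conf file
    autodetect_column_names => true
    add_field => {
      "Version" => "1"
    }
  }
}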
