I'm having a hard time with the elasticsearch filter plugin, but the problem seems to be more generic (the Logstash JSON parser apparently doesn't understand backslashes in the query strings it passes to ES), so I'm posting it here too.
- Version: ES/Logstash 5.0.2, elasticsearch filter plugin: 3.1.0
- Operating System: CentOS 7
- Config File (the query I used is in the comment):
input {
  elasticsearch {
    hosts => "10.x.y.z:9200"
    index => "data_for_mapping_test"
    query => '
      {
        "query": {
          "term": {
            "_id": {
              "value": "1"
            }
          }
        }
      }
    '
  }
}
filter {
  elasticsearch {
    hosts => ["10.x.y.z:9200"]
    index => "mappings"
    query_template => "/etc/logstash/conf.d/mapping_test.dsl"
    # query => '{ "query": { "term": { "AssetType": "%{AssetType}" } } }'
    fields => { "AssetTypeGrouping" => "AssetTypeGroupingMapped" }
    enable_sort => false
  }
}
output {
  elasticsearch {
    hosts => ["10.x.y.z:9200"]
    index => "mappings_test"
  }
}
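For completeness, /etc/logstash/conf.d/mapping_test.dsl contains the query from the comment above, i.e.:

{
  "query": {
    "term": {
      "AssetType": "%{AssetType}"
    }
  }
}

where %{AssetType} should be expanded from the current event.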
- Sample Data:
My input data contains a field like this: "AssetType": "\\Demo\\Something"
My mappings index contains documents like this:
{
  "AssetType": "\\Demo\\Something",
  "AssetTypeGrouping": "marketing assets"
}
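(Note that "\\Demo\\Something" is the JSON-encoded form; the actual field value is \Demo\Something. As far as I can tell, event.sprintf substitutes that raw value into the template, so the filter ends up trying to parse

{
  "query": {
    "term": {
      "AssetType": "\Demo\Something"
    }
  }
}

which is invalid JSON, since \D is not a recognized escape sequence.)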
- Steps to Reproduce: Run the pipeline ;]
- Error:
:error=>#<LogStash::Json::ParserError: Unrecognized character escape 'D' [...]
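The failure is easy to reproduce outside Logstash, since any strict JSON parser rejects the interpolated string. A minimal repro with Ruby's stdlib JSON (Logstash uses its own LogStash::Json layer, but the behaviour is the same):

require 'json'

asset_type = '\Demo\Something'  # the raw field value that %{AssetType} expands to
query = %Q({ "query": { "term": { "AssetType": "#{asset_type}" } } })

# Raises JSON::ParserError, because \D is not a valid JSON escape sequence;
# this is the stdlib counterpart of the "Unrecognized character escape 'D'" above.
JSON.parse(query)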
NOTE: It helped (everything worked as expected) when I added the following escaping step to the query parsing in the plugin code:

query_tmp = event.sprintf(@query_dsl).gsub('\\', '\\\\\\') # gsub rather than gsub!, since gsub! returns nil when there is nothing to replace

So it seems that the Logstash JSON parser chokes on the raw backslashes that sprintf interpolates into the query string.
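If the escaping approach is sound, a slightly safer variant would use the block form of gsub, which sidesteps both the replacement-string escaping rules and the nil return of gsub! (a sketch, not a tested patch):

# Escape every backslash in the interpolated query before it hits the
# JSON parser. The block's return value is inserted literally, so the
# replacement needs no double escaping of its own.
query_tmp = event.sprintf(@query_dsl).gsub('\\') { '\\\\' }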
Is this the solution, or am I doing something completely stupid?