Hi,
I have the following document with "ddocname": "CNT1882742", and I need to extract its ddoctitle using the Logstash elasticsearch filter.
{
  "_index": "test-m-docs",
  "_type": "data",
  "_id": "AWF0dLEQ_An67UGqACQn",
  "_version": 1,
  "_score": null,
  "_source": {
    "@timestamp": "2018-02-08T08:07:38.554Z",
    "@version": "1",
    "ddocname": "CNT1882742",
    "ddoctitle": "VSNL_R12Upgrade_TECH_UPG_Resource_Mix-DAA1_V1.12.xls",
    "did": 4835074,
    "tags": [
      "M Stage",
      "data"
    ]
  },
  "fields": {
    "@timestamp": [
      1518077258554
    ]
  },
  "sort": [
    1518077258554
  ]
}
My Logstash config file:
input {
  # Any other inputs
}
filter {
  elasticsearch {
    hosts => ["xyz.pyk.com:9200/test-m-docs/data"]
    #query => "ddocname":"%{docname}"
    query => ddocname:"CNT1882742"
    fields => {"ddoctitle"}
    # We can add any number of fields
    #sort => "ddoctitle:desc"
  }
}
output {
  stdout { codec => json_lines }
  stdout { codec => rubydebug }
  elasticsearch {
    "hosts" => "xyz.pyk.com:9200"
    "index" => "test-m1-docs"
    "document_type" => "data"
  }
}
But when I run it, I get the error below:
bin/logstash -f elasticquery.conf
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path //usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[ERROR] 2018-02-15 15:55:57.780 [LogStash::Runner] agent - Cannot create pipeline {:reason=>"Expected one of #, {, } at line 9, column 42 (byte 249) after filter{\n elasticsearch {\n hosts => ["xyz.pyk.com:9200/test-m-docs/data"]\n #query => "ddocname":"%{docname}"\n query => ddocname"}
Could you please point out where I am going wrong and what the correct format is?
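For comparison, a corrected filter might look like the sketch below. It assumes the documented options of the logstash-filter-elasticsearch plugin: `hosts` takes only host:port entries (the index does not go in the URL; it goes in the separate `index` setting), `query` must be a quoted query_string expression, and `fields` is a hash mapping each source field in the hit to a destination field on the event. The index name is taken from the document above; everything else is an assumption about the intended setup, not a verified fix.

```
filter {
  elasticsearch {
    hosts  => ["xyz.pyk.com:9200"]            # host:port only; no index/type path in the URL
    index  => "test-m-docs"                   # index to search, from the document above
    query  => 'ddocname:"CNT1882742"'         # query_string syntax; must be a quoted string
    fields => { "ddoctitle" => "ddoctitle" }  # copy ddoctitle from the best hit into the event
  }
}
```

The parse error quoted in the log ("Expected one of #, {, } ... after ... query => ddocname") appears to come from the unquoted `query` value: the config parser reads `ddocname` as a bareword and then stops at the colon, since a bareword value cannot contain `:`. Quoting the whole expression, as above, should get past that error.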