Elasticsearch filter query not working


(Ranganath Nangineni) #1

Hi,

The config file below works fine, but when I use the same query in my Logstash config, it throws an error.

Working file:

input {
  elasticsearch {
    hosts => "XYZ.com:9200"
    index => "test-m-docs"
    #result_size => 1
    query => '{ "query": {"match": { "ddocname" :"CNT1882742"} },"sort": {"@timestamp":{"order":"desc"}},"from":0,"size":0,"_source":"ddoctitle*" }'
  }
}

output {
  stdout { codec => json_lines }
  stdout { codec => rubydebug }
}

Output:

[ logstash]# bin/logstash -f elasticquery.conf
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path //usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
{"ddoctitle":"VSNL_R12Upgrade_TECH_UPG_Resource_Mix-DAA1_V1.12.xls","@version":"1","@timestamp":"2018-02-22T06:58:10.386Z"}
{"ddoctitle":"VSNL_R12Upgrade_TECH_UPG_Resource_Mix-DAA1_V1.12.xls","@version":"1","@timestamp":"2018-02-22T06:58:10.387Z"}
{"ddoctitle":"VSNL_R12Upgrade_TECH_UPG_Resource_Mix-DAA1_V1.12.xls","@version":"1","@timestamp":"2018-02-22T06:58:10.388Z"}
{"ddoctitle":"VSNL_R12Upgrade_TECH_UPG_Resource_Mix-DAA1_V1.12.xls","@version":"1","@timestamp":"2018-02-22T06:58:10.389Z"}
{
"ddoctitle" => "VSNL_R12Upgrade_TECH_UPG_Resource_Mix-DAA1_V1.12.xls",
"@version" => "1",
"@timestamp" => 2018-02-22T06:58:10.386Z
}
{
"ddoctitle" => "VSNL_R12Upgrade_TECH_UPG_Resource_Mix-DAA1_V1.12.xls",
"@version" => "1",
"@timestamp" => 2018-02-22T06:58:10.387Z
}
{
"ddoctitle" => "VSNL_R12Upgrade_TECH_UPG_Resource_Mix-DAA1_V1.12.xls",
"@version" => "1",
"@timestamp" => 2018-02-22T06:58:10.388Z
}
{
"ddoctitle" => "VSNL_R12Upgrade_TECH_UPG_Resource_Mix-DAA1_V1.12.xls",
"@version" => "1",
"@timestamp" => 2018-02-22T06:58:10.389Z
}

But the same query in my main config file throws the error below:

[2018-02-22T07:02:17,215][WARN ][logstash.filters.elasticsearch] Failed to query elasticsearch for previous event {:index=>"test-m-docs", :query=>"{ "query": {"match": { "ddocname" :"CNT1882742"} },"_source":"ddoctitle*" }", :event=>2018-02-22T06:01:35.599Z XYZServer XXX.XX.36.75 - - [20/Feb/2018:06:33:50 -0600] "GET /content/web/cnt146634 HTTP/1.1" 200 36 "https://xyz.com/index.html?ssFolder=5434D69720E6B1899BEB3972EE5604DE" "Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko" "10.16.195.106" "image/unknown" , :error=>#<Elasticsearch::Transport::Transport::Errors::BadRequest: [400] {"error":{"root_cause":[{"type":"parse_exception","reason":"parse_exception: Encountered " <RANGE_GOOP> "{\"match\": "" at line 1, column 11.\nWas expecting:\n "TO" ...\n "}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"query","grouped":true,"failed_shards":[{"shard":0,"index":"test-myo-docs","node":"hdokW_3gQFeowO0oKgOOrQ","reason":{"type":"query_shard_exception","reason":"Failed to parse query [{ "query": {"match": { "ddocname" :"CNT1882742"} },"_source":"ddoctitle*" }]","index_uuid":"RKmSbV4CQsSZaoci6z73Wg","index":"test-myo-docs","caused_by":{"type":"parse_exception","reason":"parse_exception: Cannot parse '{ "query": {"match": { "ddocname" :"CNT1882742"} },"_source":"ddoctitle*" }': Encountered " <RANGE_GOOP> "{\"match\": "" at line 1, column 11.\nWas expecting:\n "TO" ...\n ","caused_by":{"type":"parse_exception","reason":"parse_exception: Encountered " <RANGE_GOOP> "{\"match\": "" at line 1, column 11.\nWas expecting:\n "TO" ...\n "}}}}]},"status":400}>}

Logstash file content:

if [docname] {
  elasticsearch {
    hosts => "XYZ.com:9200"
    index => "test-m-docs"
    query => '{ "query": {"match": { "ddocname" :"CNT1882742"} },"_source":"ddoctitle*" }'
  }
}

What is causing this error, and how can I store the results in variables?
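One thing I noticed while digging: the fragment `Encountered " <RANGE_GOOP> ... Was expecting: "TO"` looks like it comes from Lucene's query-string parser, which suggests the elasticsearch filter plugin sends its query option to Elasticsearch as a Lucene query string, not as JSON query DSL (the elasticsearch input plugin, by contrast, accepts the full DSL, which would explain why the first config works). A sketch of what I think the filter should look like instead, assuming query takes Lucene query-string syntax (with sprintf references) and the fields option copies fields from the matched document into the current event — the doc_title target name is just my example:

filter {
  if [docname] {
    elasticsearch {
      hosts => ["XYZ.com:9200"]
      index => "test-m-docs"
      # Lucene query-string syntax, not JSON DSL
      query => "ddocname:%{[docname]}"
      # copy ddoctitle from the matched document into this event as doc_title
      fields => { "ddoctitle" => "doc_title" }
    }
  }
}

If the full JSON DSL is really needed, I believe recent versions of the filter plugin also offer a query_template option that points to a file containing the DSL body.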


(system) #2

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.