Hi everyone, can I check for a duplicate username and return true or false in Logstash?
input { RabbitMQ server } / the username is sent as JSON
filter { what happens here? }
output { what happens here? }
It's not clear exactly what you want to do but there's an elasticsearch filter plugin that you can use to make queries against ES.
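A rough sketch of what the filter part could look like (the index and field names below are just placeholders, not anything from your setup): look the username up in ES with the elasticsearch filter, and set a duplicate flag on the event depending on whether a match came back.

filter {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "users"
    # Query string query; %{[username]} is interpolated from the event.
    query => "username:%{[username]}"
    # Copy a field from the matched document (if any) onto the event.
    fields => { "username" => "existing_username" }
  }
  if [existing_username] {
    mutate { add_field => { "duplicate" => "true" } }
  } else {
    mutate { add_field => { "duplicate" => "false" } }
  }
}

Your output block can then route on the duplicate field, or whatever consumes the events can just read it.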
Yes, of course, and I am using it, but I don't get any correct output and Logstash returns this error.
This is my config:
input {
  rabbitmq {
    user => "user"
    password => "pass"
    exchange => "exc"
    queue => "que"
    durable => true
    host => "ip address"
    subscription_retry_interval_seconds => 5
    codec => "json"
  }
}
filter {
  elasticsearch {
    hosts => ["192.168.1.6:9200","192.168.1.7:9200"]
    codec => "json"
    index => "test"
    document_type => "user"
    query => "{"query":{"match":{"firstname": "myname1"}}}"
  }
}
output {
  file {
    id => "All Input Data Logger"
    path => "/logstash-repository/Log/rabbitmq_debug_events-%{+YYYY-MM-dd}"
    codec => rubydebug
  }
}
and it returns this error:
[ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Something is wrong with your configuration.", :backtrace=>
Can you please tell me how I can solve this problem?
Yes, of course, and I am using it
Then why didn't you include that in your question?
codec => "json"
Remove this. The elasticsearch filter doesn't have a codec option.
query => "{"query":{"match":{"firstname": "myname1"}}}"
If you want to have double quotes within a double-quoted string you need to escape them, or make the string single-quoted:
query => '{"query":{"match":{"firstname": "myname1"}}}'
Next time don't truncate any error messages. If I hadn't spotted the problems I pointed out above I would've wanted to see the part after ":backtrace=>".
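Putting those two changes together, and keeping the hosts and index from your config, the filter would look roughly like this:

filter {
  elasticsearch {
    hosts => ["192.168.1.6:9200","192.168.1.7:9200"]
    index => "test"
    query => '{"query":{"match":{"firstname": "myname1"}}}'
  }
}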
Thank you so much, Magnus. Apologies for the first question; poor task scheduling on my side was the main reason.
For the second part, I removed
codec => "json"
and changed the query to this:
query => '{"query":{"match":{"firstname": "myname1"}}}'
I also deleted the document_type option.
And here is the result in the Logstash log:
[2018-03-18T08:59:32,664][WARN ][logstash.filters.elasticsearch] Failed to query elasticsearch for previous event {:index=>"administration_test", :query=>"{"query":{"match":{"firstname": "myname1"}}}", :event=>#LogStash::Event:0x3e1aaaa2, :error=>#<Elasticsearch::Transport::Transport::Errors::BadRequest: [400] {"error":{"root_cause":[{"type":"parse_exception","reason":"parse_exception: Encountered " <RANGE_QUOTED> "\"myname1\" "" at line 1, column 32.\nWas expecting:\n "TO" ...\n "}],"type":"search_phase_execution_exception","reason":"all shards failed","phase":"query","grouped":true,"failed_shards":[{"shard":0,"index":"administration_test","node":"_RguZSByQ8yEK4br-EsDTw","reason":{"type":"query_shard_exception","reason":"Failed to parse query [{"query":{"match":{"firstname": "myname1"}}}]","index_uuid":"BKRH-J4XT_uwghXNL_lZ3w","index":"administration_test","caused_by":{"type":"parse_exception","reason":"parse_exception: Cannot parse '{"query":{"match":{"firstname": "myname1"}}}': Encountered " <RANGE_QUOTED> "\"myname1\" "" at line 1, column 32.\nWas expecting:\n "TO" ...\n ","caused_by":{"type":"parse_exception","reason":"parse_exception: Encountered " <RANGE_QUOTED> "\"myname1\" "" at line 1, column 32.\nWas expecting:\n "TO" ...\n "}}}}]},"status":400}>}
Judging by the documentation you need to use the query_template option if you want to make a DSL query. The query option is for query string queries.
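For example (the file path below is just an illustration), you could put the DSL query in a file such as /etc/logstash/firstname_query.json:

{
  "query": {
    "match": {
      "firstname": "myname1"
    }
  }
}

and point the filter at it instead of using the query option:

filter {
  elasticsearch {
    hosts => ["192.168.1.6:9200","192.168.1.7:9200"]
    index => "test"
    # Path to a file containing the query in ES DSL format.
    query_template => "/etc/logstash/firstname_query.json"
  }
}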