Elasticsearch input plugin

Hello,

I am running a simple Logstash configuration:

input {
  elasticsearch {
    hosts => "http://:9200/"
    index => "A"
    query => '{
      "query": {"bool": {"must": [
        {"term": {
          "name": {
            "value": "X"
          }
        }}
      ]}}
    }'
    size => 500
    scroll => "5m"
    docinfo => true
  }
}
output {
  elasticsearch {
    hosts => "http://:9200/"
    index => "subA"
  }
  stdout { codec => json }
}

I'm using this command to run my configuration:
./logstash -f /usr/share/logstash/config/extractx.conf
Output from the command line:

WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[WARN ] 2018-11-30 11:16:39.166 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[INFO ] 2018-11-30 11:16:39.870 [LogStash::Runner] runner - Starting Logstash {"logstash.version"=>"6.3.0"}
[INFO ] 2018-11-30 11:16:42.529 [Converge PipelineAction::Create<main>] pipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[INFO ] 2018-11-30 11:16:43.044 [[main]-pipeline-manager] elasticsearch - Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://*****:9200/]}}
[INFO ] 2018-11-30 11:16:43.056 [[main]-pipeline-manager] elasticsearch - Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://******:9200/, :path=>"/"}
[WARN ] 2018-11-30 11:16:43.298 [[main]-pipeline-manager] elasticsearch - Restored connection to ES instance {:url=>"http://*****:9200/"}
[INFO ] 2018-11-30 11:16:43.525 [[main]-pipeline-manager] elasticsearch - ES Output version determined {:es_version=>6}
[WARN ] 2018-11-30 11:16:43.531 [[main]-pipeline-manager] elasticsearch - Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[INFO ] 2018-11-30 11:16:43.551 [[main]-pipeline-manager] elasticsearch - Using mapping template from {:path=>nil}
[INFO ] 2018-11-30 11:16:43.574 [[main]-pipeline-manager] elasticsearch - Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[INFO ] 2018-11-30 11:16:43.620 [[main]-pipeline-manager] elasticsearch - New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://******:9200/"]}
[INFO ] 2018-11-30 11:16:44.112 [Converge PipelineAction::Create<main>] pipeline - Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x3c737455 run>"}
[INFO ] 2018-11-30 11:16:44.237 [Ruby-0-Thread-1: /usr/share/logstash/lib/bootstrap/environment.rb:6] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ] 2018-11-30 11:16:45.008 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}
[INFO ] 2018-11-30 11:16:46.641 [[main]-pipeline-manager] pipeline - Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0x3c737455 run>"}

Nothing happens and there are no errors when I run the Logstash configuration. Does anyone have an idea?
I am using Elasticsearch 6.4.1, Logstash 6.3.0, and Kibana 6.4.1.

It seems Logstash started fine; what were you expecting?

I could not find my index in Kibana.

I assume you already have an index in Elasticsearch called "A" with data in it? Can you show some sample data?

This is a sample of my bulk request sent to Elasticsearch, which was indexed successfully:

 {"index":{"_index":"psd2-all","_id":0}}
{"@timestamp":"2018-11-27T05:59:09.413881610","@version":1,"message":"MyBusiness response to log.","logger_name":"lu.raiffeisen.soa.aisp.filters.BusinessLoggerFilter","thread_name":"http-nio-0.0.0.0-8080-exec-1","level":"ERROR","level_value":7000,"LH-Correlation-ID":"1234-1169-1353-1377","caller_class_name":"lu.raiffeisen.soa.aisp.filters.BusinessLoggerFilter","caller_method_name":"doFilter","caller_file_name":"BusinessLoggerFilter.java","caller_line_number":0,"appender_name":"business","hostname":"ocpnode3.sandbox.com","docker":{"container_id":"1ec455ef-b1fc-4ea3-87c5-8e42dccdf3bf"},"kubernetes":{"container_name":"sprint-boot","namespace_name":"test","pod_name":"spring-boot-camel-amq-te-5","pod_id":"b5b7b205-ccca-414d-8c95-c623e58191b8","host":"ocpnode3.sandbox.com","master_url":"https://kubernetes.default.svc.cluster.local","namespace_id":"2b7d3dd4-1c64-4d4c-ada2-58cd0bf75bc9","labels":{"deployment":"spring-boot-camel-amq-te-5","deploymentconfig":"spring-boot-camel-amq-te","group":"org.jboss.fuse.fis.arcetypes","project":"spring-boot-camel-amq-testing","provider":"fabric8","version":"2.2.195.redhat-000013"}},"pipeline_metadata":{"collector":{"ipaddr4":"10.129.0.51","ipaddr6":"fe80::858:aff:fe81:33","inputname":"fluent-plugin-systemd","name":"fluentd","received_at":1543319840344,"version":"0.12.39 1.6.0"}}}

In my actual conf file (the one I shared before is a simplified example), the real data is:

  1. the index name is psd2-all

  2. the field I'm querying on is "appender_name": "business"

Does your query work in kibana dev tools?

Yes, it works fine.
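For reference, here is how that check would look in Kibana Dev Tools, assuming the index and field mentioned above (this is the same term query as in the Logstash input, just sent directly to the search API):

```
GET psd2-all/_search
{
  "query": {"bool": {"must": [
    {"term": {
      "appender_name": {
        "value": "business"
      }
    }}
  ]}}
}
```

If this returns hits, the query itself is fine and the problem is elsewhere in the pipeline.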

I can only assume you have made a mistake in the query or something; your input and output look fine to me, but we don't have the actual config, so I can only make assumptions!

Here it is:

input {
  elasticsearch {
    hosts => "http://apibda01:9200/"
    index => "psd2-all"
    query => '{
      "query": {"bool": {"must": [
        {"term": {
          "appender_name": {
            "value": "business"
          }
        }}
      ]}}
    }'
    size => 500
    scroll => "5m"
    docinfo => true
  }
}

output {
  elasticsearch {
    hosts => "http://apibda01:9200/"
    index => "copypsd2"
  }
  stdout { codec => json }
}

Solved! The problem was in my hosts setting; I replaced it with http://apibda01:9200, without the trailing "/".
Thank you @Eniqmatic for your support.
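For anyone hitting the same problem, the only change was removing the trailing slash from the hosts value in the elasticsearch input (the output section was left unchanged):

```
input {
  elasticsearch {
    hosts => "http://apibda01:9200"   # no trailing "/"
    index => "psd2-all"
    ...
  }
}
```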

Glad you got it sorted!


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.