Problems getting started with Logstash and Elasticsearch

(John Cartwright) #1


I just installed Elasticsearch and Logstash on a Mac (Java 1.7). Elasticsearch seems to be running as expected, and while Logstash works with the simplest examples, it fails when I try to connect it to Elasticsearch:

puma:logstash-1.5.0 jcc$ logstash -e 'input { stdin { } } output { elasticsearch { host => localhost } }'

testing 1,2,3...
log4j, [2015-05-16T21:39:32.070] WARN: org.elasticsearch.discovery: [logstash-puma.local-89867-2010] waited for 30s and no initial state was set by the discovery
Exception in thread ">output" org.elasticsearch.discovery.MasterNotDiscoveredException: waited for [30s]
at org.elasticsearch.cluster.service.InternalClusterService$
at java.util.concurrent.ThreadPoolExecutor.runWorker(java/util/concurrent/
at java.util.concurrent.ThreadPoolExecutor$

This is Logstash 1.5.0 and Elasticsearch 1.5.2 on OS X 10.9.5 with JDK 1.7.0_60.

Can anyone point out what I'm missing here?



(Mark Walkom) #2

Do you know that ES has started? Have you tried checking the status?
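Assuming a default install listening on port 9200 (the stock Elasticsearch HTTP port; adjust if yours differs), a quick check would be:

```shell
# Is the node up, and has a master been elected?
curl -XGET 'http://localhost:9200/_cluster/health?pretty'
```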

(John Cartwright) #3

Thanks for your reply, Mark. Yes, ES is running and seems to respond normally to HTTP requests via cURL. Using the same versions of ES and Logstash, I was able to run the same example on Linux.


(Mark Walkom) #4

This could be a misconfiguration problem.

Try adding protocol => http to your output and try again.
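In config-file form, the suggested change would look something like this (a sketch of the output section only; the rest of the pipeline stays as-is):

```conf
output {
  elasticsearch {
    host => "localhost"
    protocol => "http"   # talk to ES over HTTP instead of the default node/transport protocol
  }
}
```

The node/transport protocols make Logstash join the cluster as a client node, which is where discovery/master problems like the one above tend to show up; HTTP sidesteps discovery entirely.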

(John Cartwright) #5

Thanks for the suggestion, but

echo "hello world" | ./bin/logstash -e 'input { stdin { } } output { elasticsearch { host => localhost protocol => http } }'

still doesn't seem to get anything over to ES. Are there any special considerations for OS X that I should be aware of?


(Magnus Bäck) #6

That command should work. Try enabling verbose logging by passing --verbose or even --debug to get Logstash to log more information about what it's doing.
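For example, rerunning the same failing pipeline with debug logging turned on (both --verbose and --debug are standard Logstash 1.5 flags):

```shell
echo "testing 1,2,3..." | ./bin/logstash --debug -e 'input { stdin { } } output { elasticsearch { host => localhost protocol => http } }'
```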

(Priyanku Konar) #7

I have a similar issue while connecting to Elasticsearch from Logstash using the stdin input.

Attaching the dump from running Logstash with the --debug flag:

Logstash startup completed
send this message from stdin to elastic search
โ†[36moutput received {:event=>{"message"=>"send this message from stdin to elastic search\r", "@version"=>"1", "@timestamp"=>"2015-05-27T17:08:37.742Z", "type"=

"human", "inputsource"=>"stdin", "host"=>"PriyankuK-MSL"}, :level=>:debug, :file=>"(eval)", :line=>"25", :method=>"output_func"}โ†[0m
2015-05-27T17:08:37.742Z PriyankuK-MSL send this message from stdin to elastic search
โ†[36mFlushing output {:outgoing_count=>1, :time_since_last_flush=>50.355, :outgoing_events=>{nil=>[["index", {:_id=>nil, :_index=>"logstash-2015.05.27", :_type=
"human", :_routing=>nil}, #<LogStash::Event:0x728bd4 @metadata_accessors=#<LogStash::Util::Accessors:0x777dffd4 @store={"retry_count"=>0}, @lut={}>, @cancelled
=false, @data={"message"=>"send this message from stdin to elastic search\r", "@version"=>"1", "@timestamp"=>"2015-05-27T17:08:37.742Z", "type"=>"human", "input
source"=>"stdin", "host"=>"PriyankuK-MSL"}, @metadata={"retry_count"=>0}, @accessors=#<LogStash::Util::Accessors:0x18ec3689 @store={"message"=>"send this messag
e from stdin to elastic search\r", "@version"=>"1", "@timestamp"=>"2015-05-27T17:08:37.742Z", "type"=>"human", "inputsource"=>"stdin", "host"=>"PriyankuK-MSL"},
@lut={"type"=>[{"message"=>"send this message from stdin to elastic search\r", "@version"=>"1", "@timestamp"=>"2015-05-27T17:08:37.742Z", "type"=>"human", "inp
utsource"=>"stdin", "host"=>"PriyankuK-MSL"}, "type"], "inputsource"=>[{"message"=>"send this message from stdin to elastic search\r", "@version"=>"1", "@timest
amp"=>"2015-05-27T17:08:37.742Z", "type"=>"human", "inputsource"=>"stdin", "host"=>"PriyankuK-MSL"}, "inputsource"], "host"=>[{"message"=>"send this message fro
m stdin to elastic search\r", "@version"=>"1", "@timestamp"=>"2015-05-27T17:08:37.742Z", "type"=>"human", "inputsource"=>"stdin", "host"=>"PriyankuK-MSL"}, "hos
t"], "message"=>[{"message"=>"send this message from stdin to elastic search\r", "@version"=>"1", "@timestamp"=>"2015-05-27T17:08:37.742Z", "type"=>"human", "in
putsource"=>"stdin", "host"=>"PriyankuK-MSL"}, "message"]}>>]]}, :batch_timeout=>1, :force=>nil, :final=>nil, :level=>:debug, :file=>"/Priyanku/elasticsearch/lo
gstash/logstash-1.5.0/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb", :line=>"207", :method=>"buffer_flush"}โ†[0m
โ†[36mSending bulk of actions to client[0]: localhost {:level=>:debug, :file=>"/Priyanku/elasticsearch/logstash/logstash-1.5.0/vendor/bundle/jruby/1.9/gems/logst
ash-output-elasticsearch-0.2.4-java/lib/logstash/outputs/elasticsearch.rb", :line=>"461", :method=>"flush"}โ†[0m
โ†[31mGot error to send bulk of actions to elasticsearch server at localhost : blocked by: [SERVICE_UNAVAILABLE/1/state not recovered / initialized];[SERVICE_UNA
VAILABLE/2/no master]; {:level=>:error, :file=>"/Priyanku/elasticsearch/logstash/logstash-1.5.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.4
-java/lib/logstash/outputs/elasticsearch.rb", :line=>"464", :method=>"flush"}โ†[0m
โ†[33mFailed to flush outgoing items {:outgoing_count=>1, :exception=>org.elasticsearch.cluster.block.ClusterBlockException: blocked by: [SERVICE_UNAVAILABLE/1/s
tate not recovered / initialized];[SERVICE_UNAVAILABLE/2/no master];, :backtrace=>["org.elasticsearch.cluster.block.ClusterBlocks.globalBlockedException(org/ela
sticsearch/cluster/block/", "org.elasticsearch.cluster.block.ClusterBlocks.globalBlockedRaiseException(org/elasticsearch/cluster/block/Cl", "org.elasticsearch.action.bulk.TransportBulkAction.executeBulk(org/elasticsearch/action/bulk/", "org.elasti
csearch.action.bulk.TransportBulkAction.access$000(org/elasticsearch/action/bulk/", "org.elasticsearch.action.bulk.TransportBulkActi
on$1.onFailure(org/elasticsearch/action/bulk/", "$ThreadedActionListener$
elasticsearch/action/support/", "java.util.concurrent.ThreadPoolExecutor.runWorker(java/util/concurrent/",
"java.util.concurrent.ThreadPoolExecutor$", ""], :le
vel=>:warn, :file=>"/Priyanku/elasticsearch/logstash/logstash-1.5.0/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb", :line=>"231", :method=>"buffer
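For what it's worth, the [SERVICE_UNAVAILABLE/2/no master] block means the node Logstash joined never saw an elected master, so bulk requests are rejected before anything is indexed. Assuming the default HTTP port, the cluster's master status can be checked directly:

```shell
# Which node (if any) is the current master?
curl -XGET 'http://localhost:9200/_cat/master?v'

# Overall cluster state; "status" should be yellow or green, not red
curl -XGET 'http://localhost:9200/_cluster/health?pretty'
```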

(Priyanku Konar) #8

Below is the Logstash conf file:

input {
  stdin {
    add_field => { "inputsource" => "stdin" }  # hash (optional), default: {}
    #codec => ...  # codec (optional), default: "plain"
    #debug => ...  # boolean (optional), default: false
    #tags => ...   # array (optional)
    type => "human"  # string (optional)
  }
}

output {
  stdout { }

  elasticsearch {
    host => localhost
  }
}


(Mark Walkom) #9

@Priyanku_konar please start your own thread for your question.

(Jimmy) #10

@jcc I was able to resolve a similar problem by adding the name of my Elasticsearch cluster to the config file:

input { stdin { } }
output {
  elasticsearch {
    host => localhost
    cluster => elasticsearch_brew
  }
}

I suspect the problem has something to do with the default Elasticsearch configuration that results from installing Elasticsearch via Homebrew, but I haven't dug in much further.
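If it helps anyone else: the cluster setting has to match the cluster.name the server is actually using (a Homebrew install may set it to something like elasticsearch_brew rather than the stock elasticsearch). The running cluster reports its name at the root endpoint, assuming the default port:

```shell
curl -XGET 'http://localhost:9200/?pretty'
# look for the "cluster_name" field in the JSON response
```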

Hope this isn't too late and helps!

(Bradley Bristow-Stagg) #11

I had to do the same thing when I was initially playing with the ELK stack.

(Tory Berra) #12

Had to do the same thing. It took a while to figure out, given the lack of a useful error.

(system) #13