Logstash does not send output to Elasticsearch

Hi all,

I'm using the latest ELK Stack (5.3.0) and have installed ES, Kibana, and Logstash on the same machine (Windows Server 2016). In addition, I installed Filebeat on another machine (my notebook [Win 8.2]) that reads a local log file and sends messages to Logstash. On the one hand, ES and Kibana are working well: I'm able to use ES through the REST API to see all the indices, and I can see data from ES in Kibana. Kibana is also monitoring my Logstash.

On the other hand, Logstash and Filebeat are also working well. When I update my log file, I can see the new entry in the Logstash console (stdout) as well as in the file (file output) on the server.

BUT using Elasticsearch as a third output for Logstash does not work at all. Although I can see the new log entries in the stdout and file outputs, Logstash does not write any data to ES. The following error occurs in my Filebeat log:

"ERR Failed to publish events caused by: write tcp xxx.xxx.xxx.47:64160->xxxxxx.xxx.168:5044: wsasend: An existing connection was forcibly closed by the remote host."

This is my Logstash conf:

input {
  beats {
    host => "xxx.xxx.xxx.168"
    port => 5044
  }
}

output {
  elasticsearch {
	hosts => "xxx.xxx.xxx.168:9200"
	user => "elastic"
        password => "<<very secure password>>"
  }
  stdout { codec => rubydebug }
  file {
   path => "\testLog.log"
   create_if_deleted => true
  }
}
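
(As a side note, the Filebeat-to-Logstash examples I have seen usually set the index explicitly from the beat metadata. Just as a sketch of that convention, and not something I have verified in my own setup, the elasticsearch output would then look like this:)

output {
  elasticsearch {
    hosts => "xxx.xxx.xxx.168:9200"
    user => "elastic"
    password => "<<very secure password>>"
    # conventional Filebeat naming, assumed rather than verified here
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}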

The same problem also comes up when I use Elasticsearch as the only output of Logstash.

I turned off all firewalls, but this doesn't help. Neither the Logstash log file nor the Elasticsearch log file provides any further information about what is going wrong. All IP addresses, ports, and credentials provided in the config/yml files were checked multiple times.

I've been struggling with this problem for days now, and I hope someone can lend me a helping hand.

Thank you very much!
Marcus

This is key. If this were not the case, we'd be troubleshooting in a different way. Logstash sends to all three outputs at once (because you do not have any conditionals to direct events otherwise). How many messages do you get? A steady stream? If you get a steady stream of messages, then events are, in fact, also being sent to Elasticsearch. If that were not so, the stream to the other two outputs would stop.
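
To illustrate what I mean by conditionals, routing events to only one output would look roughly like this (a sketch; the field and value in the condition are hypothetical and your events may carry different ones):

output {
  if [source] =~ /testLog/ {
    # only events whose source field matches go to Elasticsearch
    elasticsearch {
      hosts => "xxx.xxx.xxx.168:9200"
    }
  } else {
    stdout { codec => rubydebug }
  }
}

Since your config has no such conditionals, every event goes to all three outputs.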

So, why aren't you seeing anything in Elasticsearch? That's a good question. Have you tried running Logstash with debug logging enabled? You will have a better chance of troubleshooting this if you do. Have you looked at the Elasticsearch logs? Is Elasticsearch receiving the data, but rejecting it for some reason?

Thanks a lot for the quick reply.

For test purposes, I manipulate my log file on my local notebook manually. Thus, I do not have a steady input stream in Logstash, but rather single events triggered by hand.

I started Logstash in debug mode and carefully analyzed the log file. After a new log entry is received from Filebeat, the entry is displayed in the Logstash console and written to the file (logstash.outputs.file). Unfortunately, the Logstash log file does not tell me anything about what is going wrong with ES. There is simply no log entry about it. It seems to me that Logstash is completely ignoring the ES output section in its config file. No success messages, no error messages, just nothing about ES. The only information I get after a new beat is received is a couple of log entries related to "logstash.outputs.file". No "logstash.outputs.elasticsearch" or anything like it.

The ES log file isn't very informative either. Nothing happens in it when Logstash receives a new beat from Filebeat. That, again, raises my fear that Logstash is just ignoring the ES output. But why?

So, ES does not seem to be receiving any data from Logstash, and it appears to me that Logstash does not send any data to ES, although I placed an ES output section in the config file. (By the way, the Logstash log file tells me that Logstash is definitely using the correct config file on startup.)

What am I doing wrong?

UPDATE:
I also noticed that Logstash is not creating an index in ES. (I manually installed the filebeat.template.json on ES via the REST API.)

Logstash doesn't create indices. Logstash sends a "please index this document into the index named ..." request to Elasticsearch.

If you're testing on your laptop, try connecting to a local, unsecured Elasticsearch on port 9200. That will validate that output is actually reaching an Elasticsearch instance.
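
For example, something as small as this should be enough to prove the output path works (a sketch; 127.0.0.1:9200 assumes the default port of an unsecured local node):

output {
  elasticsearch {
    # local, unsecured node; no user/password needed
    hosts => "127.0.0.1:9200"
  }
  stdout { codec => rubydebug }
}

If documents show up there, the problem is somewhere between Logstash and the secured node (credentials, X-Pack, networking), not in the output block itself.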

Unfortunately, I do not think that Logstash is sending anything to ES. Neither the Logstash log file nor the ES log file contains any record of such activity. There is just no activity at all.

Logstash and ES are both running on the same machine. ES is secured via X-Pack. I checked the credentials, IP, and port multiple times; I don't think the problem is coming from this part of the installation. As mentioned, I'm able to interact with my ES instance via curl without any problems, and Kibana is also working properly with ES.

So, what is wrong with my installation/configuration? Why does Logstash not send requests to ES? Please help! Thanks.

I am trying to help, but you are not following the suggestions I have made. Let me make a few troubleshooting suggestions:

  1. Start with a simple Logstash config with stdin in the input block and stdout in the output block (with rubydebug as the codec); see the minimal sketch after this list.
  2. Add your filters, so you can copy/paste a line and have it render exactly with the fields you expect, still in stdout.
  3. After #2 has completed successfully, add a local Elasticsearch node (it can run on a separate port if you like) and have Logstash send to it in addition to stdout.
  4. Verify that Logstash is sending to this unsecured Elasticsearch and that you are getting the results you expect.
  5. Now add X-Pack to the local node and see if you can continue to send events through Logstash to this now-protected node.
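
For step 1, the starting point is just a minimal config like this sketch (file name and later additions are up to you):

input {
  stdin { }
}

output {
  stdout { codec => rubydebug }
  # step 3: add a local, unsecured node here, for example
  # elasticsearch { hosts => "127.0.0.1:9200" }
}

Run it with debug logging enabled (for example with --log.level=debug on the command line) so you can see what each output plugin is doing at every step.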

If at any time you do not get the expected results, re-run that step with debug logging enabled in Logstash at least, if not Elasticsearch as well. I haven't seen any debug logging posted to this thread.


Thank you very much for your help. I have now re-installed everything (and updated my stack to 5.3.1). I carefully set up my new ELK Stack in very small steps with a lot of testing.

Now everything works fine. Unfortunately, I still do not know what error occurred in my previous installation, so I cannot report a solution to my problem. Sorry.
