Cannot get any output

Hi all,
I recently upgraded to logstash-5.3.2. When I run an 'x.conf' file in Logstash I get the message below:

INFO logstash.agent - Successfully started Logstash API endpoint {:port=>96000}

but after that no output is displayed for a long time and nothing moves forward.
Could anybody suggest how to resolve this issue?


What's the entire config look like?

Thanks for the reply Mark,
the config file looks like this:
input {
  file {
    path => "/home/ubuntu/log/test.log-20170303"
    # These commented-out options belong to the beats input, not the file input:
    #port => 5044
    #ssl => false
    #client_inactivity_timeout => "86400"
    start_position => "beginning"
  }
}

filter {
  if [type] == "log" {
    if "dev1" in [message] { drop {} }
    grok {
      # Literal pipe characters must be escaped as \| in grok patterns,
      # otherwise they are treated as regex alternation.
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program} \|%{NUMBER:epoch:int}\|%{IP:ip}\|%{GREEDYDATA:path}\|%{GREEDYDATA:title}\|%{GREEDYDATA:referrer}\|%{NUMBER:uid}\|%{GREEDYDATA:sid}\|%{NUMBER:timer:int}\|%{GREEDYDATA:cache}\|%{GREEDYDATA:user_agent}\|%{NUMBER:peak_memory:int}" }
      add_field => { "received_at" => "%{@timestamp}" }
      add_field => { "received_from" => "%{host}" }
      overwrite => [ "message" ]
    }
    mutate {
      convert => [ "[geoip][coordinates]", "float" ]
    }
    mutate {
      convert => { "timer" => "integer" }
      convert => { "epoch" => "integer" }
      convert => { "peak_memory" => "integer" }
    }
    useragent {
      source => "user_agent"
    }
    syslog_pri { }
    date {
      match => [ "epoch", "UNIX" ]
    }
  }
}

output {
  if [type] == "log" {
    elasticsearch {
      hosts => ["localhost:9200"]
      timeout => 30000
      manage_template => false
      index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  }
}

This is the line where Logstash gets stuck:

Sending Logstash's logs to /usr/share/logstash/logs which is now configured via log4j2.properties

and nothing moves forward from here.


It's likely a sincedb issue then, try checking the docs and previous threads :slight_smile:
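A quick way to rule that out (a sketch, reusing the same file path from your config) is to point sincedb_path at /dev/null so the file is re-read on every run, and print events to the console instead of Elasticsearch:

input {
  file {
    path => "/home/ubuntu/log/test.log-20170303"
    start_position => "beginning"
    sincedb_path => "/dev/null"  # testing only: forget read positions between runs
  }
}
output {
  stdout { codec => rubydebug }  # print every event Logstash emits
}

If events show up on stdout with this config, the input side is fine and the problem is in the filter or output conditionals instead.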

You appear to have changed from a beats input to a file input. I therefore suspect that the metadata variables are no longer defined and that you end up with an invalid type, preventing anything from being written to Elasticsearch.
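If that is the case, a minimal workaround sketch would be to set the type explicitly on the file input and use a fixed index name instead of the beats metadata fields (the name "test-log" here is a hypothetical choice, not from your original config):

input {
  file {
    path => "/home/ubuntu/log/test.log-20170303"
    start_position => "beginning"
    type => "log"  # the file input does not set [type] unless told to
  }
}
output {
  if [type] == "log" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "test-log-%{+YYYY.MM.dd}"  # hypothetical fixed name; %{[@metadata][beat]} is unset without beats
    }
  }
}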

Yes Christian,
Actually the setup is meant to move data from Filebeat -> Logstash -> Elasticsearch, but it is not working, so I wanted to see whether I can load data directly from Logstash into Elasticsearch.

Before this issue, an index that was being built was terminated partway through; could that cause any problems with data loading?
Thanks once again!

Hi jagan,
I have the same issue as you and Logstash shows me the same thing:
Sending Logstash's logs to /usr/share/logstash/logs which is now configured via log4j2.properties
Could you suggest how to resolve this if you have found a solution?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.