Stuck at "Successfully started Logstash API endpoint {:port=>9600}"

Hello guys

I get stuck at "Successfully started Logstash API endpoint {:port=>9600}" while trying to ingest a .txt log file into Elasticsearch.

Here is my config file:

input {
  file {
    path => "C:\Users\mustapha\Desktop\test.txt"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => { "message" => "%{WORD:username} %{WORD:email} %{WORD:hash}" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test"
  }
}

My log file is:

username email hash
username email hash
username email hash
username email hash
username email hash

What is wrong with my setup? Am I missing something?
Thanks for your help.

@Mustapha_MJ
One possibility is that this pipeline already ran at one point, thereby generating a sincedb file. That would lead each subsequent run against the same file to conclude that the data had already been processed.

Try doing something like:

sincedb_path => "/dev/null"
OR
# for Windows
sincedb_path => "NUL"

That will disable the sincedb mechanism and should reprocess the whole file each time you run it (since you've set start_position => "beginning").
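For example, sincedb_path just goes inside the file {} block next to your other options. A minimal sketch (I've written the path with forward slashes, which the file input prefers even on Windows):

input {
  file {
    path => "C:/Users/mustapha/Desktop/test.txt"
    start_position => "beginning"
    sincedb_path => "NUL"    # use "/dev/null" on Linux/macOS
  }
}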

Also, if you're on version 6.4+, check out the new read mode, which might suit your use case.
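Something roughly like this (a sketch only; check the file input docs for the exact option names on your version, and note the completed-log path below is just a hypothetical location):

input {
  file {
    path => "C:/Users/mustapha/Desktop/test.txt"
    mode => "read"                         # read the file once, start to finish
    file_completed_action => "log"         # default is "delete"; "log" keeps the file
    file_completed_log_path => "C:/Users/mustapha/Desktop/completed.log"
    sincedb_path => "NUL"
  }
}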

I already tried adding
sincedb_path => "NUL"
but it didn't solve the problem; Logstash is still stuck and no data is sent to Elasticsearch.

@Mustapha_MJ
A few other things:

  • change your path separators to /
  • try removing the filter {} and changing the output to just stdout - see if you can get something written to the console
  • try changing the input to stdin in combination with the stdout output above and do something like type your-input-file.txt | logstash -f your-pipeline.conf - see if you can get something written to the console (see the sketch after this list)
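Something like this minimal debug pipeline, for example (just a sketch; the rubydebug codec is only there to make the console output easy to read):

input {
  stdin {}
}
output {
  stdout { codec => rubydebug }
}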

Thank you, Sir, for taking the time to answer me.

I tried what you told me, Sir, and it worked, but I don't know how to end stdin; I just typed Ctrl+C:

input {
  stdin {}
}
filter {
  grok {
    match => { "message" => "%{WORD:username} %{WORD:email} %{WORD:hash}" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logiis"
  }
}

The console output is:

[2018-11-02T11:59:59,300][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
user1 email1 pass1
user2 email2 pass2
user3 email3 pass3

Then an index was created in Elasticsearch and I can see 3 records in Kibana.

What I think is the problem is the file input, because I can read from stdin and send data to Elasticsearch, but I cannot send the data from the .txt file.
Logstash cannot detect the end of the input file; after the final line it just sits there, which is why it stops at "Successfully started Logstash API endpoint {:port=>9600}" and never sends the data or exits automatically.

So how do I tell Logstash that EOF is reached once the lines are finished?

Logstash is meant to be a service that continuously processes data, so there is no built-in way to stop it after processing a file. It is fine to kill it manually or via a script.

There are a bunch of other posts covering this if you want to dig in more.

Forgot to mention: if stdin{} is working, then there is probably something wrong with your path setting in file{} (did you switch your backslashes to forward slashes?) or the file is unreachable in some way (permissions, maybe?).
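Putting the pieces together, a Windows-friendly version of your original pipeline would look roughly like this (a sketch only, assuming the same file location; the stdout output is just there so you can watch events on the console while testing):

input {
  file {
    path => "C:/Users/mustapha/Desktop/test.txt"   # forward slashes, even on Windows
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
filter {
  grok {
    match => { "message" => "%{WORD:username} %{WORD:email} %{WORD:hash}" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test"
  }
  stdout { codec => rubydebug }
}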

It turns out that the Logstash service doesn't pick up changes to the config file unless you change its name. After renaming the config file I can finally ingest my logs into Elasticsearch.

Thank you for your help.
