Strange JSON Codec Behavior

Hi,
I have been using Logstash for quite some time and I have never seen behavior quite like this. I am attributing it to the JSON codec because I have never used it before. When I run Logstash with the configuration below, the JSON data is fetched and can even be pushed to Elasticsearch just fine; however, Logstash does not stop the pipeline automatically as it normally does. Also, if I do not delete the "data" folder and all its contents (including the sincedb files) after each run, Logstash does not fetch any data from the JSON file and just goes into the "pushing flush onto pipeline" loop (shown below). I find this very puzzling.
.conf:
input {
  file {
    start_position => "beginning"
    path => "/home/username/logstash-5.6.4/bin/sql-to-elastic.json"
    codec => "json"
  }
}
output { stdout { codec => rubydebug } }

JSON:
[{"NAME":"A","SIZE":"2"},{"NAME":"B","SIZE":"3"},{"NAME":"C","SIZE":"4"},{"NAME":"D","SIZE":"5"}]

Command: ./logstash -w 1 -f json-test2.conf --verbose --debug

Strange output loop after the data is fetched (it also pushes to Elasticsearch just fine when I try that):
[2017-12-06T12:58:44,871][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-12-06T12:58:49,872][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-12-06T12:58:54,199][DEBUG][logstash.inputs.file ] _globbed_files: /home/username/logstash-5.6.4/bin/sql-to-elastic.json: glob is: ["/home/username/logstash-5.6.4/bin/sql-to-elastic.json"]
[2017-12-06T12:58:54,877][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-12-06T12:58:59,877][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-12-06T12:59:04,878][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-12-06T12:59:09,231][DEBUG][logstash.inputs.file ] _globbed_files: /home/username/logstash-5.6.4/bin/sql-to-elastic.json: glob is: ["/home/username/logstash-5.6.4/bin/sql-to-elastic.json"]

...And it just goes on and on like that until I kill the process. The log file contains the same information.

My goal is for Logstash to shut down the pipeline on its own without me having to kill the process, and to not have to delete my data folder and all its contents every time.

If anyone has any insight, I would be very appreciative.

however, Logstash does not stop the pipeline automatically as it normally does.

Logstash doesn't shut itself down when using the file input; the file input tails files and keeps waiting for new data.
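
For what it's worth: if you want a one-shot run that exits on its own, the stdin input does shut the pipeline down once its input is exhausted. A minimal sketch (the config file name here is just a placeholder):

input {
  stdin {
    codec => "json"
  }
}
output { stdout { codec => rubydebug } }

Run it as: cat sql-to-elastic.json | ./logstash -f json-test-stdin.conf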

Also, if I do not delete the "data" folder and all its contents (including the sincedb files) after each run, Logstash does not fetch any data from the JSON file and just goes into the "pushing flush onto pipeline" loop (shown below).

Have you added any new data to the JSON files? Or replaced them? Or what's the scenario, i.e. why do you want Logstash to process the file again?
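
If the goal is to re-read the same file from the top on every run, one common workaround is to keep the read offset out of the persistent sincedb by pointing sincedb_path at /dev/null. A sketch, untested against your exact setup:

input {
  file {
    path => "/home/username/logstash-5.6.4/bin/sql-to-elastic.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"  # nothing is persisted, so the file is re-read on every restart
    codec => "json"
  }
}

With that in place you shouldn't need to delete the data folder between runs.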

Oh OK, awesome, I didn't know that. I was always using it against SQL before.

I am just in the setup phase of my Logstash/Elasticsearch design. I have a Java project which generates JSON files that Logstash then feeds to Elasticsearch. I needed to stop Logstash and run it again because I am in the process of editing my .conf file, and I like to test its output to make sure I am on the right track. When I am satisfied with my conf file, I was going to have the Java project delete and replace the JSON file that Logstash uses. Maybe in this case I should just keep adding onto the same file though...

(The data that the Java project generates keeps getting updated, and that is why I need to run both it and Logstash multiple times, so that the Elasticsearch data can get updated as well :grinning: )

Thanks!
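
One more thought on the append idea: the file input reads newline-delimited lines, so having the Java project append one JSON object per line tends to be more predictable than rewriting a single array. For example, appended records could look like:

{"NAME":"E","SIZE":"6"}
{"NAME":"F","SIZE":"7"}

Each appended line becomes a new event, and the sincedb offset means lines that were already processed aren't read again.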

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.