File input doesn't read incoming JSON file from web application

#1

Hi everyone,

I'm trying to parse incoming JSON files from my web app, stored in /tmp/test/*.json.

The problem is that Logstash doesn't read my files when they are created. I have to modify them before Logstash parses them.

I tried changing parameters in my input, like ignore_older, to parse all logs, but it changed nothing.

This is my filter.conf:

input {
  file {
    path => "/tmp/test/*.json"
    codec => json
    start_position => "beginning"
    ignore_older => 0
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  stdout { codec => json }
  elasticsearch {
    hosts => "127.0.0.1:9200"
    index => "test-%{+YYYY.MM}"
    codec => json
  }
}

Below is my folder with file permissions.

I started Logstash with --debug and -v, but there is no error message. The sincedb file is not updated when the log is created.

#2

ignore_older => 0 says to ignore all files older than 0 seconds, which means it ignores all files.
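A minimal sketch of the input block with that option simply removed (the other settings kept as in the original post); without ignore_older, the file input does not skip files based on age:

```conf
input {
  file {
    path => "/tmp/test/*.json"
    codec => json
    start_position => "beginning"
    # ignore_older removed: its unit is seconds, so 0 excludes every file
  }
}
```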

#3

Thanks for your reply !

I have removed the ignore_older parameter from my config. Now Logstash sees my file and adds an entry to the sincedb file. However, the current byte offset within the file is 0, so I understand that Logstash doesn't read the file?

Sincedb information:
262169 0 2049 500 1552734148.288747 -> file parsed by Logstash if I make a modification
262195 0 2049 0 1552734148.2914188 -> incoming file, not parsed by Logstash

Still no error or info message in the Logstash logs.
If you need more information, ask me.

#4

Run with '--log.level trace' to see what filewatch is doing.

#5

This is the log when I create a new file in the folder. The entry is added to sincedb, but Logstash detects a delimiter problem, if I understand correctly. My JSON file didn't end with \n, so Logstash never detected the end of the line. I edited the file to add \n, reloaded Logstash, and it worked well.
Thanks for the --log.level trace tip, it was helpful.

[2019-04-03T20:03:14,268][TRACE][filewatch.tailmode.processor] Active - no change {"watched_file"=>"<FileWatch::WatchedFile: @filename='***_2019-03-07_suivi.json', @state='active', @recent_states='[:watched, :watched]', @bytes_read='501', @bytes_unread='0', current_size='501', last_stat_size='501', file_open?='true', @initial=false, @sincedb_key='262195 0 2049'>"}
[2019-04-03T20:03:14,272][TRACE][filewatch.tailmode.processor] Active - file grew: ***_2019-03-08_suivi.json: new size is 501, bytes read 0
[2019-04-03T20:03:14,273][TRACE][filewatch.tailmode.handlers.grow] handling: **_2019-03-08_suivi.json
[2019-04-03T20:03:14,278][TRACE][filewatch.tailmode.handlers.grow] reading... {"iterations"=>1, "amount"=>501, "filename"=>"_2019-03-08_suivi.json"}
[2019-04-03T20:03:14,279][DEBUG][filewatch.tailmode.handlers.grow] read_to_eof: get chunk
[2019-04-03T20:03:14,288][TRACE][filewatch.tailmode.handlers.grow] buffer_extract: a delimiter can't be found in current chunk, maybe there are no more delimiters or the delimiter is incorrect or the text before the delimiter, a 'line', is very large, if this message is logged often try increasing the file_chunk_size setting. {"delimiter"=>"\n", "read_position"=>0, "bytes_read_count"=>501, "last_known_file_size"=>501, "file_path"=>"/_2019-03-08_suivi.json"}
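The fix above can also be done by the producer side. A minimal sketch (a hypothetical helper, not part of Logstash) that appends the missing "\n" delimiter so the file input's buffer_extract can find the end of the line:

```python
import os

def ensure_trailing_newline(path):
    """Append '\n' to the file if it does not already end with one.

    Returns True if a newline was added, False otherwise.
    """
    with open(path, "rb") as f:
        f.seek(0, os.SEEK_END)
        if f.tell() == 0:          # empty file: nothing to delimit
            return False
        f.seek(-1, os.SEEK_END)
        needs_newline = f.read(1) != b"\n"
    if needs_newline:
        with open(path, "ab") as f:
            f.write(b"\n")
    return needs_newline
```

Calling this on each JSON file after the web app writes it would make the file readable by Logstash without manual edits.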

This topic can be closed. Thanks a lot!

(system) closed #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.