Actually, I'm reading the logs to see what's really happening, and I find that Logstash really does read the file.
After all the "Adding pattern" lines, it prints the replacement_pattern entries, reports that Grok compiled OK, and starts the pipeline, and then it just keeps repeating the same flush and glob lines:
[2017-11-13T15:58:17,376][DEBUG][logstash.filters.grok ] replacement_pattern => (?<MONTH:month>\b(?:[Jj]an(?:uary|uar)?|[Ff]eb(?:ruary|ruar)?|[Mm](?:a|ä)?r(?:ch|z)?|[Aa]pr(?:il)?|[Mm]a(?:y|i)?|[Jj]un(?:e|i)?|[Jj]ul(?:y)?|[Aa]ug(?:ust)?|[Ss]ep(?:tember)?|[Oo](?:ct|kt)(?:ober)?|[Nn]ov(?:ember)?|[Dd]e(?:c|z)(?:ember)?)\b)
[2017-11-13T15:58:17,377][DEBUG][logstash.filters.grok ] replacement_pattern => (?<MONTHDAY:day>(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9]))
[2017-11-13T15:58:17,377][DEBUG][logstash.filters.grok ] replacement_pattern => (?<TIME:time>(?!<[0-9])%{HOUR}:%{MINUTE}(?::%{SECOND})(?![0-9]))
[2017-11-13T15:58:17,377][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:2[0123]|[01]?[0-9]))
[2017-11-13T15:58:17,377][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:[0-5][0-9]))
[2017-11-13T15:58:17,377][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?))
[2017-11-13T15:58:17,378][DEBUG][logstash.filters.grok ] Grok compiled OK {:pattern=>"%{MONTH:month} %{MONTHDAY:day} %{TIME:time}", :expanded_pattern=>"(?<MONTH:month>\b(?:[Jj]an(?:uary|uar)?|[Ff]eb(?:ruary|ruar)?|[Mm](?:a|ä)?r(?:ch|z)?|[Aa]pr(?:il)?|[Mm]a(?:y|i)?|[Jj]un(?:e|i)?|[Jj]ul(?:y)?|[Aa]ug(?:ust)?|[Ss]ep(?:tember)?|[Oo](?:ct|kt)(?:ober)?|[Nn]ov(?:ember)?|[Dd]e(?:c|z)(?:ember)?)\b) (?<MONTHDAY:day>(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9])) (?<TIME:time>(?!<[0-9])(?:(?:2[0123]|[01]?[0-9])):(?:(?:[0-5][0-9]))(?::(?:(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?)))(?![0-9]))"}
[2017-11-13T15:58:17,382][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250}
[2017-11-13T15:58:18,666][INFO ][logstash.pipeline ] Pipeline main started
[2017-11-13T15:58:18,699][DEBUG][logstash.inputs.file ] _globbed_files: /var/log/banana/auth.log: glob is: ["/var/log/banana/auth.log"]
[2017-11-13T15:58:18,699][DEBUG][logstash.inputs.file ] _discover_file: /var/log/banana/auth.log: new: /var/log/banana/auth.log (exclude is [])
[2017-11-13T15:58:18,709][DEBUG][logstash.inputs.file ] _open_file: /var/log/banana/auth.log: opening
[2017-11-13T15:58:18,710][DEBUG][logstash.inputs.file ] /var/log/banana/auth.log: sincedb last value 1262243, cur size 1262243
[2017-11-13T15:58:18,710][DEBUG][logstash.inputs.file ] /var/log/banana/auth.log: sincedb: seeking to 1262243
[2017-11-13T15:58:18,722][DEBUG][logstash.agent ] Starting puma
[2017-11-13T15:58:18,726][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2017-11-13T15:58:18,727][DEBUG][logstash.api.service ] [api-service] start
[2017-11-13T15:58:18,844][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-11-13T15:58:23,696][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-11-13T15:58:28,699][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-11-13T15:58:32,739][DEBUG][logstash.inputs.file ] _globbed_files: /var/log/banana/auth.log: glob is: ["/var/log/banana/auth.log"]
[2017-11-13T15:58:33,699][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-11-13T15:58:38,700][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-11-13T15:58:43,699][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-11-13T15:58:47,752][DEBUG][logstash.inputs.file ] _globbed_files: /var/log/banana/auth.log: glob is: ["/var/log/banana/auth.log"]
[2017-11-13T15:58:48,699][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-11-13T15:58:53,699][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-11-13T15:58:58,699][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-11-13T15:59:02,765][DEBUG][logstash.inputs.file ] _globbed_files: /var/log/banana/auth.log: glob is: ["/var/log/banana/auth.log"]
[2017-11-13T15:59:03,699][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-11-13T15:59:08,700][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-11-13T15:59:13,701][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-11-13T15:59:17,797][DEBUG][logstash.inputs.file ] _globbed_files: /var/log/banana/auth.log: glob is: ["/var/log/banana/auth.log"]
[2017-11-13T15:59:18,702][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-11-13T15:59:23,702][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-11-13T15:59:28,704][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-11-13T15:59:32,808][DEBUG][logstash.inputs.file ] _globbed_files: /var/log/banana/auth.log: glob is: ["/var/log/banana/auth.log"]
[2017-11-13T15:59:33,704][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-11-13T15:59:38,705][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
[2017-11-13T15:59:43,706][DEBUG][logstash.pipeline ] Pushing flush onto pipeline
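For context, the pattern in the "Grok compiled OK" entry above is %{MONTH:month} %{MONTHDAY:day} %{TIME:time}, so the filter side of my config boils down to something like this (a trimmed sketch; I'm assuming grok matches against the default message field):

filter {
  grok {
    # The pattern from the "Grok compiled OK" log entry; it only matches
    # the leading syslog-style timestamp of each auth.log line.
    match => { "message" => "%{MONTH:month} %{MONTHDAY:day} %{TIME:time}" }
  }
}

So compilation itself is clearly fine; the question is whether any events ever reach this filter.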
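What strikes me is the sincedb part: "sincedb last value 1262243, cur size 1262243" followed by "seeking to 1262243". As far as I understand, that means Logstash remembers from an earlier run that it already read the whole file, so it seeks straight to the end and just waits for new lines; the repeating "Pushing flush onto pipeline" and "_globbed_files" entries would then just be the periodic flush and file-discovery checks, with no events in between. If that's right, forcing a full re-read for testing should look something like this (a sketch of the file input; start_position alone isn't enough once the file is already in sincedb, which seems to be exactly my case):

input {
  file {
    path => "/var/log/banana/auth.log"
    # Read the file from the top instead of tailing from the end
    start_position => "beginning"
    # Discard the remembered position so every restart re-reads the file
    # (for testing only)
    sincedb_path => "/dev/null"
  }
}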
I think I can't create the index pattern in Kibana because no data is actually being indexed, so the index doesn't really exist (checking GET _cat/indices against Elasticsearch should confirm that). But I still don't fully understand what Logstash is doing with the file: is it really parsing it, or just sitting at the end waiting for new lines?
Thanks, Magnus.