I have run Filebeat but nothing is showing up in Kibana

Hi, I just ran Filebeat with the usual start command from inside the Filebeat directory.

But when I open Kibana with my configured yml, I don't see anything new; I only see my previous default Filebeat output in Kibana, which doesn't make a ton of sense.

I am running filebeat 1.1.1.

Can you share some more details? What is your config file? Which LS, ES, and Kibana versions are you using? Did you check the log files from filebeat?

LS 2.2.2
ES 2.2.0
Kibana 4.4.1

and my configuration file has a very simple setup so far:

filebeat: 
 prospectors: 
-
paths:
- "/var/log/*.log"
- C:\Users\qa1\Desktop\*.logs

input_type:logs 
document_type:logs

paths: 
-C:\Users\qa1\Desktop\logs

registry_files: "C:/ProgramData/filebeat/registry"

output:

logstash: 
hosts:["localhost:5044"]

shipper:

logging:
files:
rotateeverybytes:10485760

Also ruflin, quick question: is there a shell command available that I could run to explicitly force any of the stack or the Beat to print what it reads?

I tried to format your config, but it seems like no indentation exists?
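For anyone following along, a minimally indented version of the config above might look like this. This is a sketch for filebeat 1.x using the paths and host from this thread; note that the option is registry_file (singular), and that input_type takes log rather than logs:

```yaml
filebeat:
  prospectors:
    -
      paths:
        - /var/log/*.log
        - C:\Users\qa1\Desktop\*.log
      input_type: log
      document_type: logs
  registry_file: "C:/ProgramData/filebeat/registry"

output:
  logstash:
    hosts: ["localhost:5044"]

logging:
  files:
    rotateeverybytes: 10485760
```

YAML is whitespace-sensitive, so a config with no indentation at all (as above) cannot be parsed into the intended structure.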

Not sure what you mean by your last question. Do you mean just printing the output for debugging? Then you can use the -e -d "*" flags and all output is printed to stdout.

Ahh okay, I'll use that command more frequently, as I did not know it existed.

And yeah, I will update you within a couple of hours: I will just re-download the .yml config from git, copy and paste my settings over while keeping the indentation consistent, and see if that was the main culprit of my problems. And much thanks =]

Hmm, I just reused the default configuration file and it seems to be stuck after the start command. I also ran it with -e -d "*" and got these results

Based on the above output it looks like your log file did not get any updates during the 2m window you posted. Were there any updates to the logs in that time?

Please don't use screenshots; paste the output itself, which is much easier to read.

Oh okay, and sorry about the pasted image.

But in terms of updated logs, there weren't any updates. I may have to reset the path, as it looks like it was watching a place that didn't have the files it should have been looking for. I will change the path to another directory, run it again, and see what happens.

Hi, I just adjusted the paths for the logs, and it seems like I may have to change the duration after which it starts to ignore a log. Where within the yml can I do this? It seems to have defaulted to 24 hours, as the output says

INFO set ignore-older duration to 24h0m0s

and my file is currently at 526h11m17s.

ignore_older can be configured here: https://www.elastic.co/guide/en/beats/filebeat/1.2/configuration-filebeat-options.html#ignore-older

Be aware that the behaviour changed between 1.1 and 1.2. By default in 1.2, ignore_older is set to infinity, partly to prevent cases like the one you have above. I strongly recommend you update to filebeat 1.2.2.

The ignore_older would also explain why the files were not shipped to elasticsearch.
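For reference, ignore_older is configured per prospector in filebeat.yml. A sketch using this thread's path; the 1000h value is just an example chosen to be larger than the 526h file age mentioned above:

```yaml
filebeat:
  prospectors:
    -
      paths:
        - C:\Users\qa1\Desktop\*.log
      # Skip only files not modified for 1000 hours.
      # Default is 24h in filebeat 1.1, infinity in 1.2.
      ignore_older: 1000h
```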

Oh okay, and thank you for that link. I think I got my Beat to work, as it read the logs =].

So in order to get the messages indexed, would I then have to change the JSON format? Would upgrading to 1.2.2 change anything else, and is it compatible with my ELK stack?

And much thanks =]

If 1.1 worked with your ELK stack, 1.2 should be compatible too.

What do you mean by changing JSON format?

The filebeat.template.json, as my current filebeat stores everything under properties -> message. What I will attempt is to add more fields under message, such as a program version, and I was wondering if that is the right path to take.

i.e.:

"message": {
  "programversion": {
    "version": 1.234,
    "program": "string"
  },
  etc
}

Hey ruflin, I was also wondering, as I did change some of the message inputs: does filebeat basically parse the entire text document based on what I put in quotations on the left side of the mapping?
Such as ---> "3DENGINE": "string"

Filebeat does not process the log messages. It just takes them line by line and forwards them to Logstash or Elasticsearch. If you need log line processing and extraction, that is what Logstash is for.

The filebeat template has no effect on what filebeat itself does. It is for Elasticsearch to know the types of the fields.
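To illustrate that split of responsibilities: extracting fields such as a program version out of message would be done with a Logstash filter, not in filebeat. A sketch of a grok filter; the pattern and field names are hypothetical, not taken from this thread's actual logs:

```
filter {
  grok {
    # Hypothetical line format: "1.234 myprogram rest of the message"
    match => { "message" => "%{NUMBER:version} %{WORD:program} %{GREEDYDATA:text}" }
  }
}
```

The extracted fields (version, program, text) then arrive in Elasticsearch as their own fields, which is where the template's type definitions come into play.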

Oh okay, and sorry for the delayed reply.

And I did get everything to work and show up in Kibana; much thanks for your help, ruflin =].

But a minor question: my workaround was to play around with the includes, and I was wondering if I can use * within the includes.

What exactly are you referring to with include? Which config option? paths?

The particular include is within filebeat.yml, and it was

include_lines: ["^ERR", "^WARN"]

And the paths were the same as the ones stated earlier, just the typical - C:\pwd*.log

And I could have phrased my question better: within include_lines, is there an option that would automatically read past all the basic date and time information, like these
Fri Dec 04 10:51:24 EST 2015:
Mon Dec 07 12:16:37 EST 2015:
May 01 17:15:16 EDT 2016:

so that include_lines would read past the dates, look for a word such as ERROR right afterwards, like this, and push it out to Kibana or Logstash:

May 01 17:15:16 EDT 2016: ERROR

And thank you for your support
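For reference on that last question: include_lines entries are regular expressions matched against each line, and a pattern without a leading ^ anchor is matched anywhere in the line, so it will catch ERROR even after a timestamp prefix. A sketch using this thread's path:

```yaml
filebeat:
  prospectors:
    -
      paths:
        - C:\pwd*.log
      # Unanchored pattern: matches ERROR anywhere in the line,
      # including after a prefix like "May 01 17:15:16 EDT 2016:".
      include_lines: ["ERROR"]
```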