I am new to Kibana. I recently shipped an Apache log file through Filebeat (installed on Windows 10), but when I check Kibana Discover it displays only a single entry for the entire log file.
My Setup:
Filebeat on Windows 10 --> Logstash and Kibana installed on the same machine (Debian)
What's your config in Filebeat? Does it send to Logstash or directly to Elasticsearch? (If it sends directly to Elasticsearch, the data will show up under the filebeat-* index pattern.)
I have already done that, but I suspect I have to do something with Logstash: I have forwarded the log file but haven't configured anything on the Logstash side.
Is there anything I have to do in the logstash.conf file to accept the logs and parse them into Elasticsearch?
1. Configure Filebeat
2. Send to Logstash and print it out to the console to see whether it reached Logstash
3. If yes, add the filters and print to the console again
4. If all good, send to Elasticsearch and then create the index pattern
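Step 1 is mostly just pointing Filebeat at Logstash; a minimal filebeat.yml sketch (the log path and hostname below are examples — adjust them to your setup):

```yaml
# filebeat.yml (sketch -- path and host are examples)
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - C:\logs\apache\access.log   # example path on the Windows 10 machine

output.logstash:
  hosts: ["debian-host:5044"]       # the Debian box running Logstash
```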
Step2 would be something like..
# logstash
input {
  beats {
    port => 5044
  }
}

# just output to the logstash console
output {
  stdout {
    codec => rubydebug
  }
}
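Step 3 would then add a filter block. A sketch for Apache access logs using the built-in COMBINEDAPACHELOG grok pattern (plus a date filter so @timestamp is taken from the log line rather than the ingest time):

```
# logstash filter (sketch)
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
```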
It is related to filebeat.yml and the Logstash config file. I have attached both configurations.
The major issue is with Logstash: it deletes all entries except one because of "document_id => "%{logstash_checksum}"", which does not work with Apache logs.
I have now changed it to match on @timestamp and it displays all the logs, but I need to find a unique value in the Apache logs so that duplicate lines are not dropped based on the timestamp alone.
Strange that you are getting this issue. Which version of Filebeat are you using?
You don't need to do the filtering in Logstash; the Filebeat Apache module is quite sufficient and does all the magic.
Please check in your Filebeat installation that you have:
module/apache/access/manifest.yml (check the contents and that the files are in the correct location with the right permissions)
module/apache/access/ingest/pipeline.yml (look inside it and you will see all the grok patterns and logic)
In modules.d, just rename modules.d/apache.yml.disabled to modules.d/apache.yml
After that, restart/reload Filebeat. There is no need to put filters in Logstash — just the incoming data and the output — and all the fields should be good.
(You can print to stdout in Logstash to check whether all the fields are coming through.)
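Once renamed, modules.d/apache.yml just needs the datasets enabled; a sketch (the var.paths override is an example — the module can usually find the default locations on its own):

```yaml
# modules.d/apache.yml (sketch)
- module: apache
  access:
    enabled: true
    # var.paths: ["C:/logs/apache/access*.log"]   # example override
  error:
    enabled: false
```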
What about the "document_id => "%{logstash_checksum}"" issue? This is the major problem: if I don't change the checksum to @timestamp, it keeps overwriting a single log entry (all logs are deleted and only a single one is displayed, as per the image below).
Check the first row: it now has docs.count 27xxx. With "document_id => "%{logstash_checksum}"" in Logstash, I had only one document and the rest ended up in docs.deleted.
Where would that logstash_checksum field have come from? I don't see you creating it anywhere, so I wouldn't be surprised if your Elasticsearch _id was always the same: if that sprintf placeholder cannot be filled, it keeps the literal value %{logstash_checksum} for every single document. Have I overlooked something?
When you look at your documents in Kibana, is there really a field called logstash_checksum? If there isn't, the data you are trying to use as an ID doesn't exist.
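If you do want a deduplicating document_id, the field has to be created first. A sketch using the Logstash fingerprint filter (hashing the raw message line — field names and hosts below are examples):

```
filter {
  fingerprint {
    source => "message"
    target => "[@metadata][fingerprint]"
    method => "SHA1"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]          # example host
    index => "apache-logs"               # example index name
    document_id => "%{[@metadata][fingerprint]}"
  }
}
```

Note that with this approach truly identical log lines (same client, same timestamp, same request) will still collapse into one document, which may or may not be what you want.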
I am completely new to Kibana and, as I said, it was the default config. I was surprised at the beginning to get such an error; anyhow, I have resolved it by following the solution from "Kelk".
Now I want to use the built-in Filebeat module for Apache logs (sending an offline file) to Kibana.
Do you have any idea where I should send the logs: to Elasticsearch or to Logstash?
I made those comments because you are new. I just wanted to make sure you understand why you kept overwriting the same document before you move on to another solution.
@kelk probably has more knowledge about this than I do, but I think that if you are using the Apache module and there are no changes you'd like to make to the data structure, you can send the data directly to Elasticsearch using the ingest pipeline. Otherwise you might want to use Logstash (or perhaps a customized ingest pipeline).
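For the direct-to-Elasticsearch route with the module, a filebeat.yml sketch (the host below is an example) would be:

```yaml
# filebeat.yml (sketch) -- module data goes straight to the ES ingest pipeline
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml

output.elasticsearch:
  hosts: ["debian-host:9200"]   # example: the box running Elasticsearch
```

With this setup the module's ingest pipeline does the parsing inside Elasticsearch, so no Logstash filters are involved at all.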