I have some log files at "D:\Sample\Logs*.log" on a remote server \10...*.
Accessing that server requires credentials (a username and password).
I want to ship those files from my machine.
Can anyone please tell me how to configure the input plugin so that Logstash reads the logs from the remote server and indexes them into Elasticsearch, where I can view them in Kibana?
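For what it's worth, the Logstash file input has no username/password settings, so one common workaround is to mount the remote share locally first (e.g. with net use on Windows) and then point the input at the mounted path. A minimal sketch, assuming the share has been mapped to drive Z: (the drive letter, server, and share names below are placeholders, not taken from the question):

```
input {
  file {
    # Z: is assumed to be a mapped network drive, created beforehand with e.g.:
    #   net use Z: \\<server>\<share> /user:<username> <password>
    # Logstash on Windows accepts forward slashes in paths.
    path => [ "Z:/Sample/Logs/*.log" ]
    start_position => "beginning"  # read existing files from the start
  }
}
```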
Use the logstash.conf file below to send logs from Logstash to Elasticsearch:
input {
  file {
    path => [ "/var/log/*.log", "/var/log/messages", "/var/log/syslog" ]  # list the files to read here
    tags => [ "" ]  # add a tag here, e.g. tags => [ "sunil" ]
  }
}
output {
  elasticsearch {
    hosts => "10.0.x.x"  # Elasticsearch IP address
    manage_template => false
    index => "testing-%{+YYYY.MM.dd}"  # index names must be lowercase
  }
}
3) Save the file and exit.
4) Start Logstash (e.g. bin/logstash -f logstash.conf).
Note: here, logs are sent to Elasticsearch without using Filebeat, since Logstash alone can do the job.
Alternatively, to ship logs from the remote machine with Filebeat, edit the /etc/filebeat/filebeat.yml file (e.g. with vim) as follows:
a)
input_type: log
# Paths that should be crawled and fetched. Glob based paths.
paths:
  - /var/log/xyz.log  # path of the logs you want to ship from the remote machine to Logstash
b) #----------------------------- Logstash output --------------------------------
# The Logstash hosts
output.logstash:
  hosts: ["10.0.x.x:5044"]  # IP address of the Logstash host
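If you take the Filebeat route, Logstash also needs a beats input listening on the port Filebeat sends to. A minimal sketch (port 5044 matches the filebeat.yml above; the elasticsearch settings mirror the earlier config and are assumptions for your setup):

```
input {
  beats {
    port => 5044  # must match the port in filebeat.yml's output.logstash section
  }
}
output {
  elasticsearch {
    hosts => "10.0.x.x"
    index => "testing-%{+YYYY.MM.dd}"  # index names must be lowercase
  }
}
```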