How to specify the target location to store the log file

Hi

I installed the ELK stack with Filebeat a few days ago.

I followed this blog: https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-elk-stack-on-ubuntu-14-04#set-up-filebeat(add-client-servers)

My question is: when the Logstash server receives log info from the Filebeat agent, where is that info stored?

If I want to transfer some logs from some Filebeat agent hosts to a specific folder on the Logstash server, how can I configure that?

Thanks for all your help.

My question is: when the Logstash server receives log info from the Filebeat agent, where is that info stored?

I don't know, where do you want things stored? If you send events to Elasticsearch, isn't that enough?

If I want to transfer some logs from some Filebeat agent hosts to a specific folder on the Logstash server, how can I configure that?

Use a file output and set its path option to the desired destination path. I believe Filebeat creates a field with the path of the source file. But why?
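For example, a minimal Logstash pipeline sketch (the beats port is taken from this thread's setup; the output path and the %{host} field reference are placeholders you would adapt):

```
input {
  beats {
    port => 5044
  }
}

output {
  # Write every received event to a file on the Logstash host.
  # The path option supports %{} field references, so output can
  # be split per sending host if desired.
  file {
    path => "/var/log/filebeat/%{host}.log"
  }
}
```

Note that the file output writes the received events (one line per event), not copies of the original log files.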

Hi brother,

Thanks for your reply.

Sorry for the strange question; I am just a Logstash newbie.

1) If the log events are sent to Elasticsearch, then where is the info stored?

2) For security reasons, my team wants to store the logs on an NFS file system, so we have to specify the folder.
If I want to transfer the log file /var/log/logtest to the path /var/log/filebeat on the Logstash host, how can I configure it?

Here is the Filebeat config from my agent host.

############################# Filebeat ######################################
filebeat:
  # List of prospectors to fetch data.
  prospectors:
    # Each - is a prospector. Below are the prospector specific configurations
    -
      paths:
        - /var/log/auth.log
        - /var/log/syslog
      document_type: syslog
    -
      paths:
        - /var/log/testlog
      input_type: log
      document_type: logtest

output:
  logstash:
    # The Logstash hosts
    hosts: ["10.0.0.4:5044"]
    bulk_max_size: 1024

    # Optional TLS. By default is off.
    tls:
      # List of root certificates for HTTPS server verifications
      certificate_authorities: ["/etc/pki/tls/certs/logstash-forwarder.crt"]

  ### File as output
  file:
    # Path to the directory where to save the generated files. The option is mandatory.
    #path: "/tmp/filebeat"
    path: "/var/log/filebeat"
    filename: filebeat

shipper:

logging:
  files:
    rotateeverybytes: 10485760 # = 10MB

1) If the log events are sent to Elasticsearch, then where is the info stored?

In Elasticsearch's database.

If I want to transfer the log file /var/log/logtest to the path /var/log/filebeat on the Logstash host, how can I configure it?

Use the following in your Logstash configuration:

output {
  file {
    path => "/var/log/filebeat"
  }
}

Sorry for the late response.

I will try it and hope it will work.

Thanks a lot for your help.

Hi,

I have updated my config file:

output {
  if [type] == 'syslog' {
    elasticsearch {
      hosts => ["localhost:9200"]
      sniffing => true
      manage_template => false
      index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
      path => "/var/log/filebeat"
    }
  }
}

Both the Logstash service on the server host and the Filebeat service on the client host were restarted.

But when I check the folder on the Logstash server, no files have been transferred to it.

azureuser@logstashsrv:/var/log/filebeat$ ll
total 8
drwxr-xr-x 2 root root 4096 Feb 26 03:46 ./
drwxrwxr-x 16 root syslog 4096 Mar 7 06:32 ../

I am not sure whether I should update some entries in the Filebeat config file on the client host?

Thanks for your help.

No, path => "/var/log/filebeat" goes in a separate plugin instance. I realize I made a typo in my last post; I've corrected it now. To be clear, you need this:

output {
  elasticsearch {
     ...
  }
  file {
    path => "/var/log/filebeat"
  }
}
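If the goal is to keep indexing the syslog events in Elasticsearch while writing only the logtest events to disk, the two outputs can be combined with conditionals. This is a sketch based on the config shown earlier in this thread (the type values and the date-suffixed filename are assumptions to adapt):

```
output {
  if [type] == "syslog" {
    elasticsearch {
      hosts => ["localhost:9200"]
      sniffing => true
      manage_template => false
      index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
      document_type => "%{[@metadata][type]}"
    }
  }
  if [type] == "logtest" {
    # Write logtest events to disk; the path supports %{} and date references.
    file {
      path => "/var/log/filebeat/logtest-%{+YYYY.MM.dd}.log"
    }
  }
}
```

Also make sure the target directory is writable by the user the Logstash service runs as; the directory listing above shows /var/log/filebeat is owned by root.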