Shipping log files

This is my first post. If it is not the right place, kindly redirect me to the correct place.

I want to first ship the log files from the remote server to the central server "as is", without any formatting, using Logstash. Later on I will look into Elasticsearch and generating dashboards with Kibana.

To achieve that I made the following changes. The config below copies all the logs matching /var/log/*.log to /home/logstash/logs/test.log.

logstash-forwarder.conf is as follows.

"servers": [ "192.168.0.100:5043" ],
"ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt",
"timeout": 15
},

The files section is configured as follows:

"files": [
{
"paths": [
"/var/log/*.log"
],
"fields": { "type": "syslog" }
}

and logstash.conf

input {
  lumberjack {
    port => 5043
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

filter {}

output {
  file {
    message_format => " %{[message]}"
    path => "/home/logstash/logs/test.log"
  }
}

I want all the log files matching /var/log/*.log to end up under /home/logstash/logs/ with their original names, instead of everything going into test.log, but it is not working. For that I changed the output section as below. Kindly correct me: how do I achieve this?

output {
  file {
    message_format => " %{[message]}"
    path => "/home/logstash/logs/*.log"
  }
}

Add a grok filter to capture the filename in a field, something like:

filter {
  grok {
    match => ["path", "/var/log/%{DATA:filename}.log"]
  }
}

Then use that field as the dynamic string for the output:

output {
  file {
    message_format => " %{[message]}"
    path => "/home/logstash/logs/%{filename}.log"
  }
}

V.

After the change, I got the following output under /home/logstash/logs/:

%{filename}.log

At this stage my config is as below.

logstash-forwarder.conf is as follows.

"servers": [ "192.168.0.100:5043" ],
"ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt",
"timeout": 15
},

The files section is configured as follows:

"files": [
{
"paths": [
"/var/log/*.log"
],
"fields": { "type": "syslog" }
}

and logstash.conf is:

input {
  lumberjack {
    port => 5043
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

filter {
  grok {
    match => ["path", "/var/log/%{DATA:filename}.log"]
  }
}

output {
  file {
    message_format => " %{[message]}"
    path => "/home/logstash/logs/%{filename}.log"
  }
}

Following is the output of the logstash-forwarder.log file:

2015/11/22 02:11:28.082132 harvest: "/var/log/dnf.log" position:3177754 (offset snapshot:3177754)
2015/11/22 02:11:28.082190 Registrar will re-save state for /var/log/dnf.rpm.log
2015/11/22 02:11:28.082203 All prospectors initialised with 3 states to persist
2015/11/22 02:11:28.082236 harvest: "/var/log/dnf.rpm.log" position:217854 (offset snapshot:217854)
2015/11/22 02:11:28.082286 harvest: "/var/log/test.log" (offset snapshot:0)
2015/11/22 02:11:28.082624 Setting trusted CA from file: /etc/pki/tls/certs/logstash-forwarder.crt
2015/11/22 02:11:28.082895 Connecting to [192.168.0.100]:5043 (192.168.0.100)
2015/11/22 02:11:28.215599 Connected to 192.168.0.100

Syntax error. I'm surprised it didn't throw an error when you ran it.

  filter {
          grok {
              match => ["path" => "/var/log/%{DATA:filename}.log"]
      }

https://www.elastic.co/guide/en/logstash/current/plugins-filters-grok.html

Add this to your logstash output config and run it again.

 stdout { codec => rubydebug}

I want to see the fields of this document after the grok filter.
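
For example, while debugging it can sit alongside (or temporarily replace) the existing file output; a minimal sketch reusing the paths from the configs above:

output {
  # print every event, with all of its fields, to stdout
  stdout { codec => rubydebug }
  file {
    message_format => " %{[message]}"
    path => "/home/logstash/logs/%{filename}.log"
  }
}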

Following is the output file and its content:

tail -f %{filename}.log

[697874.511] (II) intel(0): EDID vendor "LGD", prod id 952
[697874.511] (II) intel(0): Printing DDC gathered Modelines:
[697874.511] (II) intel(0): Modeline "1366x768"x0.0 69.30 1366 1398 1430 1470 768 771 776 786 -hsync -vsync (47.1 kHz eP)
[698888.139] (II) intel(0): EDID vendor "LGD", prod id 952
[698888.182] (II) intel(0): Printing DDC gathered Modelines:
[698888.182] (II) intel(0): Modeline "1366x768"x0.0 69.30 1366 1398 1430 1470 768 771 776 786 -hsync -vsync (47.1 kHz eP)

I meant the stdout output, not the file output.

 match => ["path" => "/var/log/%{DATA:filename}.log"]

No... that's not a documented syntax. It might work but

match => ["path", "/var/log/%{DATA:filename}.log"]

or

match => { "path" => "/var/log/%{DATA:filename}.log" }

are the two accepted forms.

I changed both of the options in the filter, but I am still getting the output file %{filename}.log instead of all the files specified in logstash-forwarder.conf ("/var/log/*.log").

My output section is as below.

output {
  file {
    message_format => " %{[message]}"
    path => "/home/logstash/logs/%{filename}.log"
  }
}

OK.

To repeat: I need all the log files specified in logstash-forwarder.conf ("/var/log/*.log") to end up under the output directory /home/logstash/logs/, but I am still only getting the single file %{filename}.log.

The name of the field containing the filename is file in the logstash-forwarder case (see below). It's Logstash's own file input that uses path. Hence, the grok filter should parse the file field instead.
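
In other words, keep the same pattern but match against the file field; roughly:

filter {
  grok {
    # logstash-forwarder puts the source path in the "file" field, not "path"
    match => { "file" => "/var/log/%{DATA:filename}.log" }
  }
}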

I am new to Logstash and not able to understand the code, sorry for that.

Could you please help me with where the change is required to get the desired output?

And let me know whether it is possible at all.

In the grok filter that extracts the filename field from the path field, change "path" to "file".

I tried that; it is not working.

We can try to help if you post additional details, including your configuration, the messages you do get, and what you expected to get. Just posting "it didn't work" is not the best way to get quick help.

Apologies for that.

Here is my initial configuration. The goal is to ship all the logs matching /var/log/*.log to the /home/logstash/logs/ directory as-is, without any formatting.

logstash-forwarder.conf

"servers": [ "192.168.0.100:5043" ],
"ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt",
"timeout": 15
},

The files section is configured as follows:

"files": [
{
"paths": [
"/var/log/*.log"
],
"fields": { "type": "syslog" }
}

and logstash.conf

input {
  lumberjack {
    port => 5043
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

filter {}

output {
  file {
    message_format => " %{[message]}"
    path => "/home/logstash/logs/test.log"
  }
}

Following the advice, I removed the filename field name from the pattern and changed "path" to "file". I am still getting the output file %{filename}.log under /home/logstash/logs/ instead of the individual *.log files; e.g. the file /var/log/test.log on the client server should arrive as-is as /home/logstash/logs/test.log on the central server.

input {
  lumberjack {
    port => 5043
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

filter {
  grok {
    # it was: match => { "path" => "/var/log/%{DATA:filename}.log" }
    match => { "file" => "/var/log/%{DATA:}.log" }
  }
}

output {
  file {
    message_format => " %{[message]}"
    path => "/home/logstash/logs/%{filename}.log"
  }
}

Replace the file output with a stdout { codec => rubydebug } output and report what you get.

After the change my output config is like this:

output {
  stdout { codec => rubydebug }
}

and I got the following logs under /var/log/logstash/logstash.stdout:

{
"message" => "Jul 14 09:45:19 Installed: putty-0.64-1.fc20.x86_64",
"@version" => "1",
"@timestamp" => "2015-11-25T18:27:06.468Z",
"type" => "syslog",
"file" => "/var/log/bak.log",
"host" => "localhost.localdomain",
"offset" => "23729",
"tags" => [
[0] "_grokparsefailure"
]
}
{
"message" => "Oct 28 23:18:26 Updated: clamav-data-empty-0.98.7-1.fc20.noarch",
"@version" => "1",
"@timestamp" => "2015-11-25T18:27:06.484Z",
"type" => "syslog",
"file" => "/var/log/bak.log",
"host" => "localhost.localdomain",
"offset" => "23781",
"tags" => [
[0] "_grokparsefailure"
]
}

Change

match => { "file" => "/var/log/%{DATA:}.log" }

to

match => { "file" => "/var/log/%{DATA:filename}\.log" }

I don't believe we suggested that you change %{DATA:filename} to %{DATA:}.
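
For reference, the filter and output sections should then end up roughly like this (a sketch using the same paths as above):

filter {
  grok {
    # extract e.g. "dnf" from "/var/log/dnf.log" into the filename field
    match => { "file" => "/var/log/%{DATA:filename}\.log" }
  }
}

output {
  file {
    message_format => " %{[message]}"
    # one output file per original log, e.g. /home/logstash/logs/dnf.log
    path => "/home/logstash/logs/%{filename}.log"
  }
}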

That was it.
It works!!
Thanks for your help.