File o/p plugin not working

Hi Team,

Is anyone using the file output plugin to just ship logs from a remote location to a central location?

Elasticsearch needs a lot of storage, so that's not a viable option for me; I need to ship logs and store them as plain files only. Is that a solution that is not recommended at all? I mean, can anyone foresee and point out any pitfalls with it?

Also, I'm facing an issue with the output plugin.

This is my Logstash output configuration:

output {
  file {
    path => "/tmp/logstash_out.log"
    codec => line {
      format => "message: %{message}"
    }
  }
  stdout {}
}

I can see in the Logstash logs that it has opened the file, but the file doesn't exist on my filesystem, so I don't have any logs.

Sep  6 16:14:44 rabbit2 0fedf398c594[2187]: [2017-09-06T10:44:44,133][INFO ][logstash.inputs.beats    ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
Sep  6 16:14:44 rabbit2 0fedf398c594[2187]: [2017-09-06T10:44:44,153][INFO ][logstash.pipeline        ] Pipeline main started
Sep  6 16:14:44 rabbit2 0fedf398c594[2187]: [2017-09-06T10:44:44,203][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
Sep  6 16:14:58 rabbit2 0fedf398c594[2187]: [2017-09-06T10:44:58,361][INFO ][logstash.outputs.file    ] Opening file {:path=>"/tmp/logstash_out.log"}

Can someone please help here?

How do you know that logs are reaching Logstash? Is the stdout output working?

Yes, I can see the logs in stdout (it's a Docker image, so docker logs shows me the events), and when I replace the file output plugin with Elasticsearch, it works.

I'm not sure, as I'm a complete noob at this, but I would try to define in the output section of my Logstash config where the files need to go.

Try something like this :slight_smile:

output {
  stdout { codec => plain }
  file {
    path => "/tmp/logstash_out.log"
  }
  elasticsearch {
    hosts => ["ES_IP_ADDRESS:PORT"]
    document_type => "text"
    index => "what_ever!"
    user => "elastic"
    password => "changeme"
  }
}

Also, I think your "format" needs to be under your filter {}.

@Ferrow, thanks for the help. I tried the configuration you provided. The logs show up in my stdout, but still no file /tmp/logstash_out.log exists on my system :frowning:

Now I don't even see the output plugin in my logs, which is strange.

This is my new conf:

output {
  stdout { codec => plain }
  file {
    path => "/tmp/logstash_out.log"
  }
}

[2017-09-11T08:57:02,540][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-09-11T08:57:02,806][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2017-09-11T08:57:02,825][INFO ][logstash.pipeline ] Pipeline main started
[2017-09-11T08:57:02,868][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
2017-08-11T11:33:07.584Z aaalogs1 Sending Code=5, Id=53 to 172.16.110.79:55188
2017-08-11T11:33:07.584Z aaalogs1 Sending Code=5, Id=57 to 172.16.110.79:44846

Are your log files in plain text? Otherwise you would need to change your codec.

In my own path setting I create directories as well, and then add the date and time to the file name. So something like

path => "/tmp/archive/logstash_out/%{type}/%{+YYYY-MM}/%{type}-%{+YYYY-MM-dd}.log"

would be my best guess. Hope this helps you out; stay in there, it will work before you know it :slight_smile:
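Spelled out as a full output block, that suggestion might look like the sketch below (the archive directory name and the use of a `type` field are just examples; the leading `+` in the date patterns tells Logstash to interpolate the event's timestamp rather than treat the text literally):

```conf
output {
  file {
    # one directory per event type and month, one file per day
    path => "/tmp/archive/%{type}/%{+YYYY-MM}/%{type}-%{+YYYY-MM-dd}.log"
  }
  stdout {}
}
```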

The thing that comes to mind on Linux is whether the logstash user has write privileges on the /tmp directory:

ls -la /tmp

@Shaoranlaos, yes it does. Initially I had a different directory, and from the logs I could see that Logstash couldn't open the file because of permission issues, so I moved it to /tmp just so I could finish my tests sooner.

Anyway, the permissions on my /tmp allow anyone to write to it:
drwxrwxrwt. 9 root root 4096 Sep 11 15:09 tmp

@Ferrow, thanks. I think I made some progress. :smiley:

The strange thing now is that my Logstash refuses to start, saying the file already exists.

2017-08-11T11:38:15.405Z aaalogs1 Error in Calling Master Script. Status [2] (null) [2017-09-11T09:46:55,136][INFO ][logstash.outputs.file ] Opening file {:path=>"/tmp/logstash_out.log/%{YYYY-MM}/log-2017-08-11.log"}
[2017-09-11T09:46:55,137][INFO ][logstash.outputs.file ] Creating directory {:directory=>"/tmp/logstash_out.log/%{YYYY-MM}"}
[2017-09-11T09:46:55,141][INFO ][logstash.outputs.file ] Opening file {:path=>"/tmp/logstash_out.log/%{YYYY-MM}/log-2017-08-11.log"}
[2017-09-11T09:46:55,142][INFO ][logstash.outputs.file ] Creating directory {:directory=>"/tmp/logstash_out.log/%{YYYY-MM}"}
[2017-09-11T09:46:55,154][INFO ][logstash.outputs.file ] Opening file {:path=>"/tmp/logstash_out.log/%{YYYY-MM}/log-2017-08-11.log"}
[2017-09-11T09:46:55,157][INFO ][logstash.outputs.file ] Creating directory {:directory=>"/tmp/logstash_out.log/%{YYYY-MM}"}
[2017-09-11T09:46:55,158][INFO ][logstash.outputs.file ] Opening file {:path=>"/tmp/logstash_out.log/%{YYYY-MM}/log-2017-08-11.log"}
[2017-09-11T09:46:55,159][INFO ][logstash.outputs.file ] Creating directory {:directory=>"/tmp/logstash_out.log/%{YYYY-MM}"}
[2017-09-11T09:46:55,281][FATAL][logstash.runner ] An unexpected error occurred! {:error=>#<Errno::EEXIST: File exists - /tmp/logstash_out.log>, :backtrace=>["org/jruby/RubyDir.java:458:in `mkdir'", "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/fileutils.rb:247:in `fu_mkdir'", "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/fileutils.rb:221:in `mkdir_p'", "org/jruby/RubyArray.java:1693:in `reverse_each'", "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/fileutils.rb:219:in `mkdir_p'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/vendor/jruby/lib/ruby/1.9/fileutils.rb:205:in `mkdir_p'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-file-4.0.2/lib/logstash/outputs/file.rb:268:in `open'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-file-4.0.2/lib/logstash/outputs/file.rb:132:in `multi_receive_encoded'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-file-4.0.2/lib/logstash/outputs/file.rb:131:in `multi_receive_encoded'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-file-4.0.2/lib/logstash/outputs/file.rb:130:in `multi_receive_encoded'", "/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:90:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/shared.rb:13:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:47:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:420:in `output_batch'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:419:in `output_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:365:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:330:in `start_workers'"]}

The strange thing now is that my Logstash refuses to start, saying the file already exists.

It seems you have configured Logstash to write logs to /tmp/logstash_out.log/%{YYYY-MM}/log-2017-08-11.log, but /tmp/logstash_out.log already exists as a file when it needs to be a directory.


@magnusbaeck No, I don't have any such file inside my /tmp/

[root@rabbit2 tmp]# ls -a /tmp/logs*
/tmp/logstash.conf

My guess is that Logstash opened that file from my previous configuration (which I could see in the logs; see my second post in this thread), but somehow it wasn't closed or flushed to the filesystem, something like that. I'm a noob when it comes to filesystem knowledge :frowning:

Try changing the index name and path and running the conf again; that has worked for me in some cases.

@Ferrow, I did change the directory name, which resolved the "file exists" error.

Now I can see Logstash opening the new directory and log file, but I still can't see the directory or the file.

[2017-09-11T10:53:42,102][INFO ][logstash.outputs.file ] Opening file {:path=>"/tmp/logstash1_out.log/2017-08/log-2017-08-11.log"}
[2017-09-11T10:53:42,103][INFO ][logstash.outputs.file ] Creating directory {:directory=>"/tmp/logstash1_out.log/2017-08"}

Conf file:

output {
  stdout { codec => plain }
  file {
    path => "/tmp/logstash1_out.%{type}/%{+YYYY-MM}/%{type}-%{+YYYY-MM-dd}.log"
  }
}

Damn! Found the issue. Thanks for your help @Ferrow and @magnusbaeck.

Since I'm using Docker, the file was being created inside my container while I was looking for it on the host machine. I feel so stupid now. :disappointed_relieved:
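For anyone else hitting this: bind-mounting a host directory over the container path makes the output visible on the host. A rough sketch of the idea (the host path, published port, and image tag below are just placeholders; adjust them for your setup):

```
# Map a host directory onto the container's /tmp so that
# /tmp/logstash_out.log written inside the container also
# appears on the host under /var/log/logstash-out/.
docker run -d \
  -p 5044:5044 \
  -v /var/log/logstash-out:/tmp \
  docker.elastic.co/logstash/logstash:5.6.3
```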

So the issue seems resolved, but could you provide some input on my first question?

Thanks,
Ashish

I'm not sure I understand your problem.

I can't see why it should be a problem to transfer files from LS to ES for storage in the manner you're trying. Can you phrase your question another way? Maybe I just don't understand it :slight_smile:

@Ferrow, I meant: would using Logstash here be a good solution, instead of something like scp or rsync, since I only need the files in the same format and can live without any enrichment of the logs?

Also, I tried to use some variables in my path so I can have a different directory for each application, since one log file for all applications would be a disaster.

So I tried something like this:
path => "/tmp/logstash5_out%{fields.Log_type}/%{+YYYY-MM}/%{type}-%{+YYYY-MM-dd}.log"

but the variables %{fields.Log_type} and %{type} are not getting replaced, even though these fields exist; I can see them in my Elasticsearch output.
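For what it's worth, Logstash's sprintf references use bracket notation for nested fields rather than dots, so a path along these lines (assuming the events really do carry a [fields][Log_type] subfield, e.g. set under fields in the Beats config) should interpolate:

```conf
output {
  file {
    # Nested fields are referenced as %{[outer][inner]}, not %{outer.inner};
    # %{fields.Log_type} is treated as a single (nonexistent) field name.
    path => "/tmp/logstash5_out/%{[fields][Log_type]}/%{+YYYY-MM}/%{type}-%{+YYYY-MM-dd}.log"
  }
}
```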

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.