Logstash file plugin offline

Hi,
I am unable to install the Logstash file input plugin.
On checking the output I found that I need to provide a proxy with access to the internet.
I do not have an org proxy to download through.
Is it possible to download the file input plugin as a zip and then install it locally?

Another point I missed mentioning: I cannot install JRuby to download this plugin's source from Git and build it myself.
Do we have a compiled and built version of the plugin available for download?

Why do you want to install the file input plugin? It's included in Logstash.
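
If you want to double-check, listing the installed plugins from your Logstash directory should show logstash-input-file (script name as in 5.x on Windows; adjust the path to your install):

bin\logstash-plugin.bat list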

Do we have a compiled and built version of the plugin available for download?

All plugins can be found on http://rubygems.org.

Have you read what's said about offline plugin packs on Offline Plugin Management | Logstash Reference [8.11] | Elastic?
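
Roughly, the workflow described there is: run prepare-offline-pack on a machine that does have internet access, copy the resulting zip to the offline machine, and install it from a file:// URI. Something like the following, with the file names and paths only as illustrations:

bin\logstash-plugin.bat prepare-offline-pack --output C:\tmp\logstash-offline-plugins-5.4.1.zip logstash-input-file
bin\logstash-plugin.bat install file:///C:/tmp/logstash-offline-plugins-5.4.1.zip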

Hi,
Thanks for the reply.
The reason I tried to do this is as follows.
I have installed a local version of Elasticsearch 5.4, and it is up and running successfully.
I have also installed Logstash 5.4.1.
It is up and running successfully, but I am unable to stash my first event from a local file.
I have the following init.conf:

input {
  file {
    path => "C:\Shashi\Tools\LogRepository\wmbflowslog*.log"
    type => "log"
  }
}

output {
  elasticsearch { host => ["localhost:9200"] }
}

However, when I try to run the following command -
logstash.bat -f C:\Shashi\Tools\LogStash\logstash-5.4.1\RunConfig\init.conf

I get the following error -
[2017-06-23T13:18:19,263][ERROR][logstash.outputs.elasticsearch] Unknown setting 'host' for elasticsearch
[2017-06-23T13:18:19,271][ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Something is wrong with your configuration."}

Hence I thought Logstash did not have the file plugin, so I was trying to get the file input plugin.
Can you tell me what is wrong with my setup? Please let me know if I need to provide any additional information.

The host option disappeared in Logstash 2.0. Since then the option is named hosts. The documentation for your version of Logstash contains all valid options for each plugin.
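
In other words, the output block should look something like this (host and port taken from your own configuration):

output {
  elasticsearch { hosts => ["localhost:9200"] }
}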

Hi Magnus, thanks for the pointer.
I did go through the documentation and resolved that problem.
Now my Logstash is running fine, but it does not stash the data into Elasticsearch.
I think there is some issue with my conf file.
When I run this conf file -
input {
  file {
    path => "C:\Shashi\Tools\LogRepository\wmbflowslog\wmbflows2.log"
    type => "log"
  }
}

filter {
  grok {
    match => { "message" => "[%{LOGLEVEL:log-level}] %{TIMESTAMP_ISO8601:timestamp}|%{DATA:owner}|%{DATA:eg}|%{DATA:msgFlow}|%{DATA:msgFlow}|%{DATA:userid}|" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
}

the data does NOT go into Elasticsearch.
However, I tried another option via the stdin input/output:
input { stdin { } }

filter {
  grok {
    match => { "message" => "[%{LOGLEVEL:log-level}] %{TIMESTAMP_ISO8601:timestamp}|%{DATA:owner}|%{DATA:eg}|%{DATA:msgFlow}|%{DATA:msgFlow}|%{DATA:userid}|" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

When I do this and enter the log data via stdin, the data does make it into Elasticsearch.
What am I missing in the original configuration?

Logstash is tailing the file and waiting for more lines to be added. Read the file input's documentation and past posts about the file input's sincedb_path and start_position options.
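
As a rough sketch (not tested against your setup), something like this makes the file input re-read the file from the beginning on each run; the "NUL" sincedb_path is a common Windows workaround, and the forward slashes in the path are an assumption on my part:

input {
  file {
    path => "C:/Shashi/Tools/LogRepository/wmbflowslog/wmbflows2.log"
    type => "log"
    start_position => "beginning"   # read existing lines instead of only new ones
    sincedb_path => "NUL"           # don't persist read positions between runs (Windows)
  }
}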

Thanks a lot... now I got it!
