export JAVA_HOME=/usr/java/jdk1.8.0_73/
sudo /usr/share/logstash/bin/system-install /etc/logstash/startup.options
Using provided startup.options file: /etc/logstash/startup.options
Successfully created system startup script for Logstash
Logstash does start, but when I start it with sudo systemctl start logstash it doesn't appear to read the conf.d directory.
The logstash-plain.log shows:
[2017-02-21T14:11:30,181][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-02-21T14:11:30,190][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#]}
[2017-02-21T14:11:30,194][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250}
[2017-02-21T14:11:30,195][INFO ][logstash.pipeline ] Pipeline main started
[2017-02-21T14:11:30,228][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
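One difference I can think of between the two runs is the environment: the export JAVA_HOME=... above only applies to my interactive shell, not to the systemd unit. If that turns out to matter, I assume a systemd drop-in like the following would pass it through (the drop-in filename is just my guess; the path is from my shell export):

```ini
# /etc/systemd/system/logstash.service.d/java.conf  (hypothetical drop-in name)
[Service]
Environment="JAVA_HOME=/usr/java/jdk1.8.0_73/"
```

followed by sudo systemctl daemon-reload && sudo systemctl restart logstash.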
I am starting it with /usr/share/logstash/bin/logstash --debug -f /etc/logstash/conf.d/logstash.conf to see if there are any new errors.
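Before digging further I also wanted to parse-check the config without starting a pipeline. This is a sketch assuming the default package install path; --config.test_and_exit (-t for short) is the 5.x flag:

```shell
# Parse-check the config without starting a pipeline.
# Path and binary location are the package defaults from my install.
LOGSTASH=/usr/share/logstash/bin/logstash
if [ -x "$LOGSTASH" ]; then
  "$LOGSTASH" --config.test_and_exit -f /etc/logstash/conf.d/logstash.conf
else
  echo "logstash binary not found at $LOGSTASH"
fi
```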
The contents of my conf file are:
input {
  file {
    path => "/home/tdesroch/test_url/*"
    type => "urllog"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => " "
    columns => ["timestamp","src_ip","src_port","dst_ip","dst_port","method","host","uri","referrer","user_agent"]
  }
  date {
    match => [ "timestamp", "ISO8601" ]
  }
}
output {
  if [type] == "urllog" {
    elasticsearch {
      hosts => [ "host1" ]
      index => "%{type}-%{+YYYY.MM.dd}"
    }
  }
  # stdout { codec => rubydebug }
  # }
}
Still confused. It works when started from the command line, just not as a service.
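Since the command-line run was under my own account and the service presumably runs as the logstash user, I suspect a permissions difference. A sketch of the check I plan to run on the host (paths are the ones from my config; the logstash service account name is the package default):

```shell
# Check whether the service account can read the config and the input files.
# Run on the affected host; degrades gracefully where the user doesn't exist.
for p in /etc/logstash/conf.d/logstash.conf /home/tdesroch/test_url; do
  if id logstash >/dev/null 2>&1; then
    if sudo -u logstash test -r "$p"; then
      echo "logstash can read $p"
    else
      echo "logstash CANNOT read $p"
    fi
  else
    echo "no 'logstash' user on this host; skipping $p"
  fi
done
```

Note that for the file input, the logstash user also needs execute permission on every parent directory (e.g. /home/tdesroch), not just read permission on the files themselves.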