Logstash not indexing logs when run as a service

Hi,

I have configured the ELK stack. When I check the Logstash configuration it says "Configuration OK", and when I run Logstash from the command line with the command below, it indexes logs into Elasticsearch and I can see them in Kibana.

/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/sample.conf

The problem is that when I run Logstash as a service, it does not send logs to Elasticsearch and I cannot see them in Kibana, even though the Logstash service is up and running. The following is what appears in the logstash-plain.log file:

"[2018-11-06T13:42:09,430][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.2"}
[2018-11-06T13:42:11,937][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-11-06T13:42:12,413][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://172.25.37.214:9200/]}}
[2018-11-06T13:42:12,423][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://172.25.37.214:9200/, :path=>"/"}
[2018-11-06T13:42:12,599][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://172.25.37.214:9200/"}
[2018-11-06T13:42:12,653][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-11-06T13:42:12,656][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-11-06T13:42:12,677][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::Elasticsearch", :hosts=>["http://172.25.37.214:9200"]}
[2018-11-06T13:42:12,699][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-11-06T13:42:12,723][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-11-06T13:42:18,209][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_fa3bc1f5b266ecfd40c72246b23bc8e6", :path=>["/var/log/syslog/network2.log"]}
[2018-11-06T13:42:18,310][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x230619ba run>"}
[2018-11-06T13:42:18,364][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-11-06T13:42:18,367][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2018-11-06T13:42:18,697][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
"

Logstash config file:

input {
  file {
    path => "/var/log/syslog/network2.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => {
      "message" => [
        "%{MONTH:Month}%{SPACE}%{NUMBER:Day} %{TIME:Time} %{IP:Host} %{NUMBER:SEQ_NO}: %{DATA:Month_From_Device}%{SPACE}%{NUMBER:Day_From_Device} %{TIME:Time_From_Device}: %%{DATA:Facility}-%{DATA:Severity}-%{DATA:Event}: Login Success [user: %{NUMBER:Service_Number}] [Source: %{IP:Log_in_Source}] %{GREEDYDATA:Information}",
        "%{MONTH:Month}%{SPACE}%{NUMBER:Day} %{TIME:Time} %{IP:Host} %{NUMBER:SEQ_NO}: %{DATA:Month_From_Device}%{SPACE}%{NUMBER:Day_From_Device} %{TIME:Time_From_Device}: %%{DATA:Facility}-%{DATA:Severity}-%{DATA:Event}: Login failed [user: %{NUMBER:Service_Number}] [Source: %{IP:Log_in_Source}] %{GREEDYDATA:Information}",
        "%{MONTH:Month}%{SPACE}%{NUMBER:Day} %{TIME:Time} %{IP:Host} %{NUMBER:SEQ_NO}: %{DATA:Month_From_Device}%{SPACE}%{NUMBER:Day_From_Device} %{TIME:Time_From_Device}: %%{DATA:Facility}-%{DATA:Severity}-%{DATA:Event}: %{GREEDYDATA:Information}",
        "%{MONTH:Month}%{SPACE}%{NUMBER:Day} %{TIME:Time} %{IP:Host} %{NUMBER:SEQ_NO}: %{GREEDYDATA:Information}",
        "%{MONTH:Month}%{SPACE}%{NUMBER:Day} %{TIME:Time} %{IP:Host} %{DATA:Time_From_device} level=%{DATA:Log_Level} vd=%{DATA:VDOM_Name} srcip=%{IP:VPN_Source_IP} %{DATA:Misc_info} dstip=%{IP:Destination_IP} dstport=%{NUMBER:Destination_Port} %{DATA:Misc_info2} %{DATA:Misc_info3} proto=%{NUMBER:Protocol_Number} action=%{DATA:Action} user=%{DATA:Username} group=%{DATA:Group} policyid=%{NUMBER:Policy_ID} %{GREEDYDATA:Misc_info4}",
        "%{MONTH:Month}%{SPACE}%{NUMBER:Day} %{TIME:Time} %{IP:Host} %{DATA:Time_From_device} level=%{DATA:Log_Level} vd=%{DATA:VDOM_Name} srcip=%{IP:VPN_Source_IP} %{DATA:Misc_info} dstip=%{IP:Destination_IP} dstport=%{NUMBER:Destination_Port} %{DATA:Misc_info2} %{DATA:Misc_info3} proto=%{NUMBER:Protocol_Number} action=%{DATA:Action} user=%{DATA:Username} policyid=%{NUMBER:Policy_ID} %{GREEDYDATA:Misc_info4}",
        "%{MONTH:Month}%{SPACE}%{NUMBER:Day} %{TIME:Time} %{IP:Host} %{DATA:Time_From_device} level=%{DATA:Log_Level} vd=%{DATA:VDOM_Name} srcip=%{IP:VPN_Source_IP} %{DATA:Misc_info} dstip=%{IP:Destination_IP} dstport=%{NUMBER:Destination_Port} %{DATA:Misc_info2} %{DATA:Misc_info3} proto=%{NUMBER:Protocol_Number} action=%{DATA:Action} policyid=%{NUMBER:Policy_ID} %{GREEDYDATA:Misc_info4}",
        "%{MONTH:Month}%{SPACE}%{NUMBER:Day} %{TIME:Time} %{IP:Host} %{DATA:Time_From_device} %{DATA:Misc_Info1} subtype=%{DATA:Event_Type} %{DATA:Misc_Info2} vd=%{DATA:VDOM_Name} logdesc=%{DATA:Log_Description} action=%{DATA:Log_Type} %{DATA:Misc_Info2} remip=%{IP:Remote_IP} tunnelip=%{IP:Tunel_IP} user=%{DATA:User_Name} group=%{DATA:Group_Name} %{GREEDYDATA:Misc_Info3}"
      ]
    }
  }
}

output {
  elasticsearch {
    hosts => "http://[server IP]:9200"
    index => "logstash"
  }
}
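For reference, the file input also accepts an explicit sincedb_path; this is only a sketch of a possible variant (the path shown is hypothetical), not what I am currently running:

input {
  file {
    path => "/var/log/syslog/network2.log"
    # start_position only applies to files the input has never seen before;
    # once an offset is stored in the sincedb, reading resumes from that offset
    start_position => "beginning"
    sincedb_path => "/var/lib/logstash/network2.sincedb"   # hypothetical path, must be writable by the service user
  }
}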

File permissions of the log file:

-rwxrwxrwx. 1 root root 274624283 Nov 6 14:01 network2.log
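Assuming the service runs as the logstash user (the default for a package install), read access along the whole path can be double-checked like this (namei is part of util-linux):

sudo -u logstash head -n 1 /var/log/syslog/network2.log   # can the service user read the file?
namei -l /var/log/syslog/network2.log                     # permissions of every directory component in the path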

Please help me sort out this issue, as I am getting nowhere in finding a solution.

Thanks
Billz1026
