Logstash-forwarder connects to the logstash server IP but events are never received


(Soniaeli) #1

I installed Elasticsearch, Logstash, Kibana, nginx, and logstash-forwarder on the same server to centralize logs. The log file (allapp.json) contains JSON log entries like this:

"{\"timestamp\":\"2015-08-30 19:42:26.724\",\"MAC_Address\":\"A8:7C:01:CB:2D:09\",\"DeviceID\":\"96f389972de989d1\",\"RunningApp\":\"null{com.tools.app_logs\\/com.tools.app_logs.Main}{com.gtp.nextlauncher\\/com.gtp.nextlauncher.LauncherActivity}{com.android.settings\\/com.android.settings.Settings$WifiSettingsActivity}{com.android.incallui\\/com.android.incallui.InCallActivity}{com.tools.app_logs\\/com.tools.app_logs.Main}{com.gtp.nextlauncher\\/com.gtp.nextlauncher.LauncherActivity}{com.android.settings\\/com.android.settings.Settings$WifiSettingsActivity}{com.android.incallui\\/com.android.incallui.InCallActivity}\",\"PhoneName\":\"samsung\",\"IP\":\"192.168.1.101\"}"
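Note that, as pasted, each line is an outer JSON string (with escaped inner quotes) whose value is itself a JSON document, so a single JSON parse yields a string rather than an object. A minimal Python sketch illustrating this, assuming the lines are stored exactly as shown (the sample below is shortened to three of the fields above):

```python
import json

# One line of allapp.json as shown above: a JSON *string* whose value
# is itself a JSON object (note the escaped quotes).
line = r'"{\"timestamp\":\"2015-08-30 19:42:26.724\",\"PhoneName\":\"samsung\",\"IP\":\"192.168.1.101\"}"'

inner = json.loads(line)    # first pass: returns the inner JSON text as a str
record = json.loads(inner)  # second pass: returns the actual dict

print(record["IP"])  # → 192.168.1.101
```

If the file really is double-encoded like this, one pass of Logstash's json filter would recover only the inner string, not the fields.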

my logstash.conf is:

    input {
      lumberjack {
        port => 5002
        type => "logs"
        ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
        ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
      }
      udp {
        type => "json"
        port => 5001
      }
    }
    filter {
      json {
        "source" => "message"
      }
    }
    output {
      elasticsearch { host => localhost }
      stdout { codec => rubydebug }
    }

my logstash-forwarder.conf (on the same system where Logstash is installed) is:

    {
      "network": {
        "servers": [ "192.168.1.102:5002" ],
        "timeout": 15,
        "ssl ca": "/etc/pki/tls/certs/logstash-forwarder.crt"
      },
      "files": [
        {
          "paths": [ "/var/log/app-log/allapp.json" ],
          "fields": { "type": "json" }
        }
      ]
    }
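One related knob worth checking: logstash-forwarder skips files whose last modification falls outside its "dead time" window (24h by default), and the option can be set per files entry. A sketch of the entry above with the value made explicit (the 72h figure is just an example, not a recommendation):

    "files": [
        {
            "paths": [ "/var/log/app-log/allapp.json" ],
            "fields": { "type": "json" },
            "dead time": "72h"
        }
    ]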

my elasticsearch.yml is:

network.host: localhost

when I run tail -f /var/log/logstash-forwarder/logstash-forwarder.err in a terminal, I get this:

2015/09/04 11:33:05.282495 Waiting for 1 prospectors to initialise
2015/09/04 11:33:05.282544 Launching harvester on new file: /var/log/app-log/allapp.json
2015/09/04 11:33:05.282591 harvest: "/var/log/app-log/allapp.json" (offset snapshot:0)
2015/09/04 11:33:05.283709 All prospectors initialised with 0 states to persist
2015/09/04 11:33:05.283806 Setting trusted CA from file: /etc/pki/tls/certs/logstash-forwarder.crt
2015/09/04 11:33:05.284254 Connecting to [192.168.1.102]:5002 (192.168.1.102) 
2015/09/04 11:33:05.417174 Connected to 192.168.1.102

the allapp.json file is updated frequently and new log entries are added to it, but in the output above I never see lines like:

Registrar received 1 events

Registrar received 23 events ...

In addition, I have another client with logstash-forwarder sending its logs to Kibana. The logstash-forwarder on that client works correctly and its logs are shown in Kibana, but this one doesn't work.
All results in Kibana look like this:

     Time                               file     
September 4th 2015, 06:14:00.942     /var/log/suricata/eve.json
September 4th 2015, 06:14:00.942     /var/log/suricata/eve.json
September 4th 2015, 06:14:00.942     /var/log/suricata/eve.json
September 4th 2015, 06:14:00.942     /var/log/suricata/eve.json 

I want to see logs from /var/log/app-log/allapp.json in Kibana too. What is the problem? Why aren't they shown in Kibana? Why does one client work correctly while the logstash-forwarder on the same system as Logstash doesn't?


(Mark Walkom) #2

What do the LS logs show?


(Soniaeli) #3

I don't understand what you mean, please explain more...


(Mark Walkom) #4

You've shown us your LSF logs, but what is happening with Logstash itself?


(Soniaeli) #5

in logstash.log:

{:timestamp=>"2015-09-04T00:39:34.250000+0430", :message=>"SIGTERM received. Shutting down the pipeline.", :level=>:warn}
{:timestamp=>"2015-09-04T00:40:35.538000+0430", :message=>"SIGTERM received. Shutting down the pipeline.", :level=>:warn}
{:timestamp=>"2015-09-04T11:31:55.563000+0430", :message=>"SIGTERM received. Shutting down the pipeline.", :level=>:warn}
{:timestamp=>"2015-09-07T11:19:58.811000+0430", :message=>"SIGTERM received. Shutting down the pipeline.", :level=>:warn}

in logstash.err:

Sep 07, 2015 11:20:28 AM org.elasticsearch.node.internal.InternalNode <init>
INFO: [logstash-ubuntu-3991-11624] version[1.7.0], pid[3991], build[929b973/2015-07-16T14:31:07Z]
Sep 07, 2015 11:20:28 AM org.elasticsearch.node.internal.InternalNode <init>
INFO: [logstash-ubuntu-3991-11624] initializing ...
Sep 07, 2015 11:20:28 AM org.elasticsearch.plugins.PluginsService <init>
INFO: [logstash-ubuntu-3991-11624] loaded [], sites []
Sep 07, 2015 11:20:31 AM org.elasticsearch.bootstrap.Natives <clinit>
WARNING: JNA not found. native methods will be disabled.
Sep 07, 2015 11:20:33 AM org.elasticsearch.node.internal.InternalNode <init>
INFO: [logstash-ubuntu-3991-11624] initialized
Sep 07, 2015 11:20:33 AM org.elasticsearch.node.internal.InternalNode start
INFO: [logstash-ubuntu-3991-11624] starting ...
Sep 07, 2015 11:20:33 AM org.elasticsearch.transport.TransportService doStart
INFO: [logstash-ubuntu-3991-11624] bound_address {inet[/0:0:0:0:0:0:0:0:9301]}, publish_address {inet[/192.168.1.101:9301]}
Sep 07, 2015 11:20:33 AM org.elasticsearch.discovery.DiscoveryService doStart
INFO: [logstash-ubuntu-3991-11624] elasticsearch/45uWWit4QGKWVLAL-k37lg
Sep 07, 2015 11:20:36 AM org.elasticsearch.cluster.service.InternalClusterService$UpdateTask run
INFO: [logstash-ubuntu-3991-11624] detected_master [Puff Adder][PVDxbetbRNGFZ_Fdo75BYA][ubuntu][inet[/127.0.0.1:9300]], added {[Puff Adder][PVDxbetbRNGFZ_Fdo75BYA][ubuntu][inet[/127.0.0.1:9300]],}, reason: zen-disco-receive(from master [[Puff Adder][PVDxbetbRNGFZ_Fdo75BYA][ubuntu][inet[/127.0.0.1:9300]]])
Sep 07, 2015 11:20:36 AM org.elasticsearch.node.internal.InternalNode start
INFO: [logstash-ubuntu-3991-11624] started

So what should I do now?


(Mark Walkom) #6

Is there new data being added to /var/log/app-log/allapp.json?


(Soniaeli) #7

Yes, I have updated that file many times and added new data. It's not an old file that logstash-forwarder would skip.


#8

I'm not positive, but one issue I've seen is with multiple types, like you have here:

"fields": { "type": "json" }

But you are then sending it to 5002, which you have as:
type => "logs"

It shouldn't matter for anything except filter activation, etc., but I would try putting only one "type" on the entry. Also, is it possible you are sending the output to another index that you aren't looking at in Kibana?
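For what it's worth, the usual way to make that type distinction matter is to guard the filter with a conditional — a sketch in Logstash config syntax, assuming the raw line lands in the message field:

    filter {
      if [type] == "json" {
        json { source => "message" }
      }
    }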


(Soniaeli) #9

I changed type => "logs" to type => "json" but nothing changed. Also, I have just one index in Kibana ( logstash-* ).


(Mark Walkom) #10

Are you seeing these "missing" events in the stdout output?

