Looking for a little assistance regarding ELK Stack and BRO logs


(B Iitzkrieg) #1

I think this fits more with Logstash (maybe Filebeat as well), so I'll place it here for now. I am currently collecting logs with the Bro IDS and sending them from Filebeat on the IDS system to another system running Logstash and Kibana. I am fine with the logs being in the standard readable format (how Bro puts them out by default), but I also need to send these logs elsewhere in JSON. Is there any way to convert them somewhere along the line in Logstash? Or is what I'm proposing not possible within Logstash? I can output Bro logs as JSON, but I would like both. Maybe this is just a question that can be solved on Bro's end?

Thank you.


(Magnus Bäck) #2

Is there any way to convert these somewhere along the line in logstash?

Sure, just use a json or json_lines codec in whatever output plugin you want to use. Without knowing specifics I can't be more specific.
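For example, a minimal file output with the json_lines codec might look like this (the path is just a placeholder, adjust for your setup):

```
output {
  file {
    path  => "/var/tmp/bro-events.json"  # hypothetical path; pick any directory Logstash can write to
    codec => json_lines                  # writes one JSON document per line
  }
}
```

The json codec would instead serialize the whole batch as a single JSON structure, so json_lines is usually what you want for line-oriented log files.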


(B Iitzkrieg) #3

So in my /etc/logstash/conf.d/ directory I have "02-filebeat-input.yml", and I see the following:

########### Output Statement ##################

output {
  elasticsearch {
    hosts => ["localhost:9200"]
#   user => elastic
#   password => changeme
  }
# stdout { codec => rubydebug }
}

Would I place it in here? What would that look like? Or does it need to be in my logstash.yml file? Let me know if I need to provide any more information.


(Magnus Bäck) #4

Yes, outputs go in an output section in one of the pipeline configuration files (/etc/logstash/conf.d).

I still don't know exactly what you want to do. Produce files with the log events as JSON, one per line? Then use a file output and set its codec option to json_lines.


(B Iitzkrieg) #5

Magnus,

Forgive me as I am just learning how to use the whole ELK stack. Yes, let's say I wanted to write the events (being sent from the BRO system to the logstash system via filebeat in BRO log format) to a file, in JSON on the logstash system. This is now in "/etc/logstash/conf.d/02-filebeat-input.yml"

output {
  elasticsearch {
    hosts => ["localhost:9200"]
#   user => elastic
#   password => changeme
  }
# stdout { codec => rubydebug }
# stdout { codec => json }
  file {
    path => "/home/testout/test.log"
    codec => json_lines
  }
}

Now the error in /var/log/logstash/logstash-plain.log:

[2017-08-21T13:58:16,251][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>48, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>6000}
[2017-08-21T13:58:17,047][INFO ][logstash.inputs.beats    ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2017-08-21T13:58:17,137][INFO ][logstash.pipeline        ] Pipeline main started
[2017-08-21T13:58:17,318][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-08-21T13:58:29,515][INFO ][logstash.outputs.file    ] Opening file {:path=>"/home/testout/test.log"}
[2017-08-21T13:58:29,524][INFO ][logstash.outputs.file    ] Opening file {:path=>"/home/testout/test.log"}
[2017-08-21T13:58:29,531][INFO ][logstash.outputs.file    ] Opening file {:path=>"/home/testout/test.log"}
[2017-08-21T13:58:29,534][INFO ][logstash.outputs.file    ] Opening file {:path=>"/home/testout/test.log"}
[2017-08-21T13:58:29,541][INFO ][logstash.outputs.file    ] Opening file {:path=>"/home/testout/test.log"}
[2017-08-21T13:58:29,539][ERROR][logstash.filters.geoip   ] IP Field contained invalid IP address or hostname {:exception=>java.net.UnknownHostException: -: Name or service not known, :field=>"id.orig_h", :$
[2017-08-21T13:58:29,697][INFO ][logstash.outputs.file    ] Opening file {:path=>"/home/testout/test.log"}
[2017-08-21T13:58:29,699][FATAL][logstash.runner          ] An unexpected error occurred! {:error=>#<Errno::EACCES: Permission denied - /home/testout/test.log>, :backtrace=>["org/jruby/RubyFile.java:370:in $
[2017-08-21T13:58:29,707][INFO ][logstash.outputs.file    ] Opening file {:path=>"/home/testout/test.log"}

For some reason it can't write to that directory? I am logged in as root. I see that this error has happened to a few others, but no real resolution was posted.

Update: It ended up being a permissions issue. I had to chown -R logstash:logstash the directory I wanted to store the JSON output in. The file is writing now. Leaving this here for anyone who may come across the same issue.
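For reference, the fix looks like this (assuming the default systemd setup, where the Logstash service runs as the logstash user rather than root, and using the same output directory as above):

```shell
# The Logstash service runs as the "logstash" user, not root, so the
# output directory must exist and be writable by that user.
mkdir -p /home/testout
chown -R logstash:logstash /home/testout
```

Being logged in as root doesn't help here, since it's the service's own user that performs the write.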


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.