I'm trying to get Bro output processed with Logstash into Elasticsearch, but for some reason I can't get it to work. Am I missing something?
Below is some sample output that I'm trying to process. I generated it from a pcap file with Bro, after changing some settings so that it writes its logs in this JSON format.
{"ts":"2017-03-23T09:07:09.539617Z","uid":"C9jTJj4ooYZ32LlG1k","id.orig_h":"xxx.xxx.xxx.xx","id.orig_p":52440,"id.resp_h":"xx.xxx.xxx.xxx","id.resp_p":80,"trans_depth":1,"method":"GET","host":"www.test.com","uri":"/","version":"1.1","user_agent":"Mozilla/5.0 (Windows NT 6.1; WOW64; rv:52.0) Gecko/20100101 Firefox/52.0","request_body_len":0,"response_body_len":142565,"status_code":200,"status_msg":"OK","tags":[],"resp_fuids":["FoB6Hh2xeEtKGfOZSh"],"resp_mime_types":["text/html"]}
{"ts":"2017-03-23T09:07:10.613164Z","uid":"CLN68d2HXrKwfY2D7a","id.orig_h":"xxx.xxx.xxx.xx","id.orig_p":52441,"id.resp_h":"xxx.xxx.xxx.xxx","id.resp_p":80,"trans_depth":1,"method":"GET","host":"cdn2.test.com","uri":"/images/default/logo.svg","referrer":"http://www.test.com/","version":"1.1","user_agent":"Mozilla/5.0 (Windows NT 6.1; WOW64; rv:52.0) Gecko/20100101 Firefox/52.0","request_body_len":0,"response_body_len":12179,"status_code":200,"status_msg":"OK","tags":[],"resp_fuids":["FO3xZlIqETcEaboTc"],"resp_mime_types":["text/plain"]}
{"ts":"2017-03-23T09:07:10.629316Z","uid":"CLN68d2HXrKwfY2D7a","id.orig_h":"xxx.xxx.xxx.xx","id.orig_p":52441,"id.resp_h":"xxx.xxx.xxx.xxx","id.resp_p":80,"trans_depth":2,"method":"GET","host":"cdn2.test.com","uri":"/images/images_45/s4/5811445_s21.jpg?v=1","referrer":"http://www.test.com/","version":"1.1","user_agent":"Mozilla/5.0 (Windows NT 6.1; WOW64; rv:52.0) Gecko/20100101 Firefox/52.0","request_body_len":0,"response_body_len":14450,"status_code":200,"status_msg":"OK","tags":[],"resp_fuids":["FaZT3o1MBcRyFO7Pi3"],"resp_mime_types":["image/jpeg"]}
{"ts":"2017-03-23T09:07:10.650839Z","uid":"C7nYia4yiQcO5WVAwg","id.orig_h":"xxx.xxx.xxx.xx","id.orig_p":52442,"id.resp_h":"xxx.xxx.xxx.xxx","id.resp_p":80,"trans_depth":1,"method":"GET","host":"cdn2.test.com","uri":"/images/default/fixture-lg.png","referrer":"http://www.test.com/","version":"1.1","user_agent":"Mozilla/5.0 (Windows NT 6.1; WOW64; rv:52.0) Gecko/20100101 Firefox/52.0","request_body_len":0,"response_body_len":3533,"status_code":200,"status_msg":"OK","tags":[],"resp_fuids":["F4bKlC4HWJ0augTjAj"],"resp_mime_types":["image/png"]}
I'm using the config file below. I also tried it without the codec option, with no success either. Is there a way for the config to detect the field names from the JSON lines automatically, instead of naming each column in the csv filter by hand? (To illustrate what I mean, see the sketch after the config.)
input {
  file {
    path => "/home/user/bro-logs/testdata-01-json/http.log"
    start_position => "beginning"
    codec => "json"
  }
}
filter {
  csv {
    columns => ["ts","uid","id.orig_h","id.orig_p","id.resp_h","id.resp_p","trans_depth","method","host","uri","referrer","version","user_agent","request_body_len","response_body_len","status_code","status_msg","info_code","info_msg","tags","username","password","proxied","orig_fuids","orig_filenames","orig_mime_types","resp_fuids","resp_filenames","resp_mime_types"]
  }
}
output {
  elasticsearch {
    hosts => ["http://127.0.0.1:9200"]
    index => "bro-http-%{+YYYY.MM.dd.HH.mm.ss}"
  }
  stdout { codec => rubydebug }
}
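To show what I mean by auto-detecting the fields: this is the kind of config I was hoping would work, with no csv filter at all and the json codec supplying the field names straight from each log line (an untested sketch of my intent, not something I've gotten working):

input {
  file {
    path => "/home/user/bro-logs/testdata-01-json/http.log"
    start_position => "beginning"
    codec => "json"   # each line is one complete JSON object, so the codec should yield one field per key
  }
}
output {
  elasticsearch {
    hosts => ["http://127.0.0.1:9200"]
    index => "bro-http-%{+YYYY.MM.dd.HH.mm.ss}"
  }
  stdout { codec => rubydebug }
}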
I want to output the data to Elasticsearch and create a new index every time I run the above config.
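On that note, I realize %{+YYYY.MM.dd.HH.mm.ss} is formatted from each event's @timestamp rather than from the time of the run, so I assume a per-run index would need something like an environment variable instead (untested sketch; RUN_TS is just a name I made up, exported before starting Logstash):

output {
  elasticsearch {
    hosts => ["http://127.0.0.1:9200"]
    # before starting Logstash: export RUN_TS=$(date +%Y.%m.%d.%H.%M.%S)
    index => "bro-http-${RUN_TS}"
  }
}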
When I run the config, nothing happens: I just see a blinking cursor after the output shown below.
user@ubuntu:~$ sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash/ -f /etc/logstash/conf.d/bro-http.conf --config.reload.automatic
Sending Logstash's logs to /var/log/logstash which is now configured via log4j2.properties
The logstash-plain.log file doesn't show any errors, and it looks fine to me. Even with --debug I can't see any FATAL errors; it just keeps repeating 'pushing flush onto pipeline'.
[2017-04-01T09:59:15,148][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-04-01T09:59:15,153][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-04-01T09:59:15,294][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x739b533e URL:http://localhost:9200/>}
[2017-04-01T09:59:15,295][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-04-01T09:59:15,393][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-04-01T09:59:15,397][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::Generic:0x34da530b URL://localhost>]}
[2017-04-01T09:59:15,400][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-04-01T09:59:15,631][INFO ][logstash.pipeline ] Pipeline main started
[2017-04-01T09:59:15,788][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9614}
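One thing I'm starting to wonder about: I've run this config several times against the same file, so could the file input's sincedb have recorded the file as already read, meaning start_position => "beginning" no longer applies? If that's the cause, I assume an input like this would force a full re-read on every run (an untested guess on my part):

input {
  file {
    path => "/home/user/bro-logs/testdata-01-json/http.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"   # assumption: discard the saved read position so the file is re-read each run
    codec => "json"
  }
}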
I've been reading through the documentation, but I can't figure out what I'm doing wrong. Is my config file wrong? Do I need to install or enable something else? Do I need to change settings in the ELK .yml files?
Any help is much appreciated, thanks!