I've created the following grok filter: %{TIMESTAMP_ISO8601:Date} %{SYSLOG5424SD:Client_IP} %{WORD:Log_Level} %{JAVACLASS:Java_Class} %{GREEDYDATA:Freeform_Text}
It appears to be working according to the grok debugger; all the data I want is being pulled out correctly. Here is an example (sanitized) log line: 2018-04-13 09:32:40,356 [Server.ClientConnection:192.168.55.1:28943] DEBUG uk.co.domain.pos.customercontroller.CustomerDbEngine - engine_getData 94339
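For context, the pattern sits inside a filter block along these lines (a sketch; the grok wrapper and matching against the message field are assumed to be the usual setup, not copied from my actual config):
output of the pattern above, embedded:
filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:Date} %{SYSLOG5424SD:Client_IP} %{WORD:Log_Level} %{JAVACLASS:Java_Class} %{GREEDYDATA:Freeform_Text}" }
  }
}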
Sorry, I'm not 100% sure what you mean by the resulting document - do you mean the entries that appear in Kibana?
I'm also not sure how I'll know whether or not the fields are being extracted correctly; from looking at the grok debugger output, it appears they are. I'm very new to this and to Linux in general, trying to pick it up as I go, so sorry if I'm being stupid.
Sorry, I'm not 100% sure what you mean by the resulting document - do you mean the entries that appear in Kibana?
Yes.
I'm also not sure how I'll know whether or not the fields are being extracted correctly; from looking at the grok debugger output, it appears they are.
Yes, and that's a good sign, but what ultimately matters is what Logstash produces. Inspecting the results in Kibana is one way, but during debugging I always recommend taking Elasticsearch out of the equation and using a stdout { codec => rubydebug } output to dump the raw events to the Logstash log.
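In other words, something along these lines (a minimal sketch; swap it in for, or add it alongside, your existing elasticsearch output):
output {
  stdout { codec => rubydebug }
}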
Here are the entries as they appear in Kibana. Screenshot
Setting the Logstash output to the line above prevents Logstash from starting. Logstash is also not writing anything to its log since changing that line, so I'm not sure how to find out what's causing the crash.
Edit: Never mind, I missed a closing bracket in my output. One second while it loads, then I'll post the data.
When using the stdout output, is the data written to a log file or does it only appear in the console? I can't see the correct log file in the output I've got, and that output a lot of stuff to the console. I need to grep the source somehow, otherwise I'm going to be looking through this output for a long time.
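I'm guessing a conditional on the source field might narrow it down to one file, something like this (an untested sketch; the path is a placeholder):
output {
  if [source] == "/path/to/interesting.log" {
    stdout { codec => rubydebug }
  }
}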
Logstash only seems to be writing out to its log file when initially starting up:
[2018-04-13T15:53:38,758][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-04-13T15:53:38,763][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-04-13T15:53:38,767][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-04-13T15:53:38,778][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//172.19.32.154:9200"]}
[2018-04-13T15:53:39,455][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2018-04-13T15:53:39,531][INFO ][logstash.pipeline ] Pipeline started succesfully {:pipeline_id=>"main2", :thread=>"#<Thread:0x73b67d6c@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 run>"}
[2018-04-13T15:53:39,545][INFO ][logstash.agent ] Pipelines running {:count=>2, :pipelines=>[".monitoring-logstash", "main2"]}
[2018-04-13T15:53:39,558][INFO ][logstash.inputs.metrics ] Monitoring License OK
[2018-04-13T15:53:39,580][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2018-04-13T20:03:31,459][WARN ][logstash.runner ] SIGHUP received.
This is /var/log/logstash/logstash-plain.log - if there is another log file I should be looking at, could you let me know the path please? I've also checked logstash-stderr and stdout.
Logstash doesn't log that much by default. You can always crank up the log level if you're unsure whether the logging works. Assuming that it does work, what you posted above indicates that everything is working fine. How do you know that Logstash's beats input is receiving any messages?
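(To raise the log level, a sketch assuming a standard 6.x install; the install paths may differ on your system:)
# in /etc/logstash/logstash.yml
log.level: debug
# or at startup
bin/logstash --log.level=debug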
I really want to remove the Elasticsearch dependency and verify the Filebeat and Logstash setup first. It's highly suspect that the stdout output didn't appear to produce anything. Try using a file output instead. Are events flowing into that file?
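For example (a minimal sketch; the path is just an example, pick any file Logstash can write to):
output {
  file {
    path => "/tmp/logstash-debug.log"
  }
}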
Logstash gives me this error: ConfigurationError", :message=>"Expected one of #, \", ', -, [, { at line 17, column 13
That corresponds to the "/" at the start of the path to the file. My syntax looks correct compared to the example in the documentation, so I'm not sure what's causing the crash.
Sorted that now - I didn't realise it needed to be in square brackets. I don't think that's mentioned in the file output documentation page.
{"@version":"1","beat":{"hostname":"eposc02app.e4bh.internal","version":"6.2.3","name":"eposc02app.e4bh.internal"},"prospector":{"type":"log"},"tags":["beats_input_c odec_plain_applied","_grokparsefailure"],"host":"eposc02app.e4bh.internal","message":"2016-01-09 11:01:49,398 [TASK_ENGINE_7] INFO Engine.checkEngineParameters(185) - TASK_ENGINE_7 : Before Sleep : N : 2 : 1000","offset":81543218,"source":"/home/eposc02/ers/log/TASK_ENGINE_7.log","@timestamp":"2016-01-09T12:56:02.609Z"}
{"@version":"1","beat":{"hostname":"eposc02app.e4bh.internal","version":"6.2.3","name":"eposc02app.e4bh.internal"},"prospector":{"type":"log"},"tags":["beats_input_c odec_plain_applied","_grokparsefailure"],"host":"eposc02app.e4bh.internal","message":"2016-01-09 11:01:31,198 [TASK_ENGINE_9] INFO Engine.checkEngineParameters(185) - TASK_ENGINE_9 : After Sleep : N : 20 : 500","source":"/home/eposc02/ers/log/TASK_ENGINE_9.log","offset":56623837,"@timestamp":"2016-01-09T12:56:02.609Z"}
{"@version":"1","beat":{"hostname":"eposc02app.e4bh.internal","version":"6.2.3","name":"eposc02app.e4bh.internal"},"prospector":{"type":"log"},"tags":["beats_input_c odec_plain_applied","_grokparsefailure"],"host":"eposc02app.e4bh.internal","message":"2016-01-09 11:01:36,259 [MonitorSessCons] INFO ConnectionManager.closeFreeConn ections(445) - Closed 0 connections for pool repwfm","source":"/home/eposc02/ers/log/DFR_ENGINE.log","offset":65014757,"@timestamp":"2016-01-09T12:56:02.609Z"}
{"@version":"1","beat":{"hostname":"eposc02app.e4bh.internal","version":"6.2.3","name":"eposc02app.e4bh.internal"},"prospector":{"type":"log"},"tags":["beats_input_c odec_plain_applied","_grokparsefailure"],"host":"eposc02app.e4bh.internal","message":"2016-01-09 11:01:35,922 [ROUTING_ENGINE] DEBUG RouteFilesEngine.processRun(406) - Done reading some files for this loop.","offset":82403066,"source":"/home/eposc02/ers/log/ROUTING_ENGINE.log","@timestamp":"2016-01-09T12:56:02.609Z"}
{"@version":"1","beat":{"hostname":"eposc02app.e4bh.internal","version":"6.2.3","name":"eposc02app.e4bh.internal"},"prospector":{"type":"log"},"tags":["beats_input_c odec_plain_applied","_grokparsefailure"],"host":"eposc02app.e4bh.internal","message":"2016-01-09 11:01:58,984 [COMPLETION_ENGINE] INFO Engine.checkEngineParameters( 185) - COMPLETION_ENGINE : Before Sleep : N : 40 : 3000","source":"/home/eposc02/ers/log/COMPLETION_ENGINE.log","offset":9282710,"@timestamp":"2016-01-09T12:56:02.60 9Z"}
{"@version":"1","beat":{"hostname":"eposc02app.e4bh.internal","version":"6.2.3","name":"eposc02app.e4bh.internal"},"prospector":{"type":"log"},"tags":["beats_input_c odec_plain_applied","_grokparsefailure"],"host":"eposc02app.e4bh.internal","message":"2016-01-09 11:01:59,788 [REFERRAL_ENGINE] INFO Engine.checkEngineParameters(18 5) - REFERRAL_ENGINE : After Sleep : N : 10 : 2000","offset":8422731,"source":"/home/eposc02/ers/log/REFERRAL_ENGINE.log","@timestamp":"2016-01-09T12:56:02.609Z"}
{"@version":"1","beat":{"hostname":"eposc02app.e4bh.internal","version":"6.2.3","name":"eposc02app.e4bh.internal"},"prospector":{"type":"log"},"tags":["beats_input_c odec_plain_applied","_grokparsefailure"],"host":"eposc02app.e4bh.internal","message":"2016-01-09 11:01:42,694 [FAKE_ACK_ENGINE] DEBUG AutoAckEngineTestOnly.processRu n(236) - Done reading some files for this loop.","source":"/home/eposc02/ers/log/FAKE_ACK_ENGINE.log","offset":1359969,"@timestamp":"2016-01-09T12:56:02.609Z"}
{"@version":"1","beat":{"hostname":"eposc02app.e4bh.internal","version":"6.2.3","name":"eposc02app.e4bh.internal"},"prospector":{"type":"log"},"tags":["beats_input_c odec_plain_applied","_grokparsefailure"],"host":"eposc02app.e4bh.internal","message":"2016-01-09 11:02:34,090 [TASK_ENGINE_5] INFO Engine.checkEngineParameters(185) - TASK_ENGINE_5 : Before Sleep : N : 2 : 10000","source":"/home/eposc02/ers/log/TASK_ENGINE_5.log","offset":9302560,"@timestamp":"2016-01-09T12:56:02.609Z"}
{"@version":"1","beat":{"hostname":"eposc02app.e4bh.internal","version":"6.2.3","name":"eposc02app.e4bh.internal"},"prospector":{"type":"log"},"tags":["beats_input_c odec_plain_applied","_grokparsefailure"],"host":"eposc02app.e4bh.internal","message":"2016-01-09 11:04:33,773 [DATAFLOW_ENGINE] INFO Engine.checkEngineParameters(18 5) - DATAFLOW_ENGINE : After Sleep : N : 2 : 10000","offset":8411106,"source":"/home/eposc02/ers/log/DATAFLOW_ENGINE.log","@timestamp":"2016-01-09T12:56:02.609Z"}
{"@version":"1","beat":{"hostname":"eposc02app.e4bh.internal","version":"6.2.3","name":"eposc02app.e4bh.internal"},"prospector":{"type":"log"},"tags":["beats_input_c odec_plain_applied","_grokparsefailure"],"host":"eposc02app.e4bh.internal","message":"2016-01-09 11:04:33,772 [BACS_ENGINE] INFO Engine.checkEngineParameters(185) - BACS_ENGINE : After Sleep : N : 1 : 10000","source":"/home/eposc02/ers/log/BACS_ENGINE.log","offset":7895490,"@timestamp":"2016-01-09T12:56:02.609Z"}
I can see I'm getting a grok parse failure for the files which are moving into Logstash - I'm not sure if there is anything wrong with the grok I posted above, as the grok constructor thinks it's alright. Any suggestions?
Also, Logstash now crashes after a short time with the error "Error writing to stream /var/log/logstash/logstash-plain.log". This doesn't happen when using the normal output (which writes to logstash-plain.log).
That corresponds to the "/" at the start of the path to the file. My syntax looks correct compared to the example in the documentation, so I'm not sure what's causing the crash.
Which example in the documentation gave you the impression that the path wouldn't have to be quoted?
Description
This output writes events to files on disk. You can use fields from the event as parts of the filename and/or path.
By default, this output writes one event per line in json format. You can customise the line format using the line codec like
output {
  file {
    path => ...
    codec => line { format => "custom format: %{message}"}
  }
}
I had just glanced at this for the syntax. The example further down the page does have the path in quotes, so this is more my mistake, though I do think the snippet could be made a little more user-friendly by putting the ellipsis in quotes to give it more visibility.
I did try wrapping the path in quotes, which didn't resolve the issue; I had to put it in square brackets and quotes to get it working.