Hi,
I am trying to pull the relevant fields out of a log file message and pass them into Elasticsearch. I am still testing, and I have managed to use the Grok Debugger to pull out the fields I need. An example of the log message is:
[09/10/2020 12:11:39] hostname - HTTP Connections Spiking Ok 2.00 Perf Counter test (Current Connections) 4871
[09/10/2020 12:15:05] hostname - HTTP Connections Spiking Bad 6.00 Perf Counter test (Current Connections) 4481
[09/10/2020 12:15:23] hostname - HTTP Connections Spiking Bad 4.00 Perf Counter test (Current Connections) 4857
The two grok patterns I have been able to create are:
\[%{DATESTAMP:Time}\]%{SPACE}%{HOSTNAME:hostname} - (?<result>\w+ \w+ \w+)%{SPACE}%{WORD:status}%{SPACE}%{NUMBER:reply}%{SPACE}%{GREEDYDATA:message}
%{SYSLOG5424SD:Time}%{SPACE}%{HOSTNAME:hostname} - (?<result>\w+ \w+ \w+)%{SPACE}%{WORD:status}%{SPACE}%{NUMBER:reply}%{SPACE}%{GREEDYDATA:message}
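To show what I expect the pattern to extract, here is a rough plain-regex equivalent of the first grok pattern in Python (the grok DATESTAMP/HOSTNAME/NUMBER patterns are simplified to basic character classes here, so this is only an approximation of what grok does):

```python
import re

# Approximation of the grok pattern above; DATESTAMP, HOSTNAME and
# NUMBER are simplified to generic character classes.
LOG_RE = re.compile(
    r"\[(?P<Time>[^\]]+)\]\s+"          # [09/10/2020 12:11:39]
    r"(?P<hostname>\S+) - "             # hostname
    r"(?P<result>\w+ \w+ \w+)\s+"       # HTTP Connections Spiking
    r"(?P<status>\w+)\s+"               # Ok / Bad
    r"(?P<reply>\d+\.?\d*)\s+"          # 2.00
    r"(?P<message>.*)"                  # Perf Counter test (...) 4871
)

line = ("[09/10/2020 12:11:39] hostname - HTTP Connections Spiking "
        "Ok 2.00 Perf Counter test (Current Connections) 4871")
m = LOG_RE.match(line)
print(m.group("result"))  # HTTP Connections Spiking
print(m.group("status"))  # Ok
print(m.group("reply"))   # 2.00
```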
I am running Logstash on Windows Server 2012 R2 and have not even got as far as running it as a service; I am simply running a PowerShell script that launches the batch file with the config file I have created, outputting to stdout rather than passing anything to Elasticsearch yet. It reads the file content without issue when I leave the filter section out; however, as soon as I add the filter section with my pattern, it fails.
input {
	file {
		path => "C:/Temp/http2.txt"
		type => "log"
		start_position => "beginning"
		sincedb_path => "NULL"
	}
}
filter {
	grok {
		match => { "message" => \[%{DATESTAMP:Time}\]%{SPACE}%{HOSTNAME:hostname} - (?<result>\w+ \w+ \w+)%{SPACE}%{WORD:status}%{SPACE}%{NUMBER:reply}%{SPACE}%{GREEDYDATA:message} }
	}
}
output {
	stdout {}
}
I have tried putting " and ' around the pattern after "message =>", but neither works. I have also stripped the pattern down to simply try to extract [%{DATESTAMP:Time}] or %{SYSLOG5424SD:Time}, and I still get the same error:
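To make that attempt concrete, the double-quoted variant of the match line I tried (which still did not work for me) looked like this:

```
match => { "message" => "\[%{DATESTAMP:Time}\]%{SPACE}%{HOSTNAME:hostname} - (?<result>\w+ \w+ \w+)%{SPACE}%{WORD:status}%{SPACE}%{NUMBER:reply}%{SPACE}%{GREEDYDATA:message}" }
```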
[2020-10-15T15:24:39,021][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \\t\\r\\n], \"#\", [A-Za-z0-9_-], '\"', \"'\", [A-Za-z_], \"-\", [0-9], \"[\", \"{\" at line 11, column 27 (byte 178) after filter {\r\n\tgrok {\r\n\t\tmatch => { \"message\" => ", :backtrace=>["C:/Logstash/logstash-core/lib/logstash/compiler.rb:32:in `compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:183:in `initialize'", "org/logstash/execution/JavaBasePipelineExt.java:69:in `initialize'", "C:/Logstash/logstash-core/lib/logstash/java_pipeline.rb:44:in `initialize'", "C:/Logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52:in `execute'", "C:/Logstash/logstash-core/lib/logstash/agent.rb:357:in `block in converge_state'"]}
Any help with this would be appreciated, as I'm tearing my hair out!
Thanks,
Ian