Grok issues / Fingerprint issues: Value not imported into ES after 6.x → 7.x update

Hello,

I am new to the ELK stack, especially filtering / grok in Logstash.

We are having issues importing the DATA:servicename value into ES after moving from 6.3.x to 7.14.

The grok below is part of our application log filter.conf and previously worked without issue. The missing DATA:servicename value comes from the filename of the log itself.

     grok {
            add_tag => [ "valid", "elastic" ]
            match => [ "message", "%{DATESTAMP:log_date} \[%{DATA:value1}\]\[%{DATA:value2}\] %{LOGLEVEL:level}\s+%{NOTSPACE:logger_name} %{GREEDYDATA:message}",
                       "source", ".+\\Application.log.%{DATA:servicename}\..+\.log" ]
            break_on_match => false
            overwrite => [ "message" ]
     }

If anyone is able to advise or point us in the right direction, that would be greatly appreciated.

Provide us with a few lines from the log.

I would be more than happy to, but the value we are trying to extract comes from the filename of the log rather than something contained within it.

The file name format goes like this; we are looking to extract the servicename part of the filename and capture it in ES:

Company.Application.servicename.exe.log

Hope that makes sense?

I don't think the grok pattern is OK; however, this is one option for extracting only the service name:
%{DATA}\.%{WORD:servicename}\.exe\.log$
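As a sketch, that pattern could be wired into a grok filter against whichever field holds the file path (here `source`, as in the filter above; adjust the field name to your setup):

```
grok {
  # %{DATA} swallows "Company.Application", %{WORD} captures "servicename"
  match => { "source" => "%{DATA}\.%{WORD:servicename}\.exe\.log$" }
}
```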

How are you sending the data? Are you using Filebeat? If yes, did you upgrade it from version 6.x to version 7 as well?

Yes, the whole ELK and Filebeat stack is on 7.14.

Filebeat 7 no longer sends the source field, and your grok uses that field; source was deprecated in 6.7 and removed in 7.0.

You need to use [log][file][path] instead of source.
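A sketch of the original filter with that one substitution made (everything else kept as posted; the path regex still assumes Windows backslash separators as in the original):

```
grok {
  add_tag => [ "valid", "elastic" ]
  match => [ "message", "%{DATESTAMP:log_date} \[%{DATA:value1}\]\[%{DATA:value2}\] %{LOGLEVEL:level}\s+%{NOTSPACE:logger_name} %{GREEDYDATA:message}",
             "[log][file][path]", ".+\\Application.log.%{DATA:servicename}\..+\.log" ]
  break_on_match => false
  overwrite => [ "message" ]
}
```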


Thank you all

Maybe a dumb question, but will the grok work if we use Filebeat 6.7 while the rest of the stack is on 7.14, and if so, is this detrimental? I have read that it is recommended the entire stack be on the same version.

Is it as simple as changing out the source value under match, or is the syntax completely different in 7.14?

grok {
  add_tag => [ "valid", "elastic" ]
  match => [ "message", "%{DATESTAMP:log_date} \[%{DATA:value1}\]\[%{DATA:value2}\] %{LOGLEVEL:level}\s+%{NOTSPACE:logger_name} %{GREEDYDATA:message}",
             "[log][file][path]", ".+\\Application.log.%{DATA:servicename}\..+\.log" ]
  break_on_match => false
  overwrite => [ "message" ]
}

Thanks in advance

Still having some issues here; this is what I have come up with so far, but I'm not having any luck.

I'm trying to start off small and work from there.

This is my sample config for an incoming log file, trying to extract data from the log file path.

filter {
  if [log_type] == "Software_Service_log_files" {

    # first filter
    grok {
      add_tag => [ "valid", "elastic" ]
      match => { "message" => "%{DATESTAMP:log_date} \[%{DATA:di_version}\]\[%{DATA:tool_name}\] %{LOGLEVEL:level}\s+%{NOTSPACE:logger_name} %{GREEDYDATA:message}" }
      break_on_match => false
      overwrite => [ "message" ]
    }

    # log timestamp
    date {
      match => ["log_date", "yy-MM-dd HH:mm:ss.SSS"]
    }

    # service name extraction
    grok {
      match => { "[log][file][path]" => ".+\\Company.Software.Service.%{DATA}\.%{WORD:Software_servicename}\..+\.log" }
      tag_on_failure =>
    }

  }
}

Here is the error I am currently getting:

[2023-10-16T01:43:20,869][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \t\r\n], "#", "else", "if", [A-Za-z0-9_-], '"', "'", "}" at line 18, column 1 (byte 648) after filter {\r\nif [log_type] == "Software_Service_log_files" {\r\n\t\t\r\n\t\t# first filter\r\n grok {\r\n add_tag => [ "valid", "elastic" ]\r\n match => [ "message", "%{DATESTAMP:log_date} \[%{DATA:di_version}\]\[%{DATA:tool_name}\] %{LOGLEVEL:level}\s+%{NOTSPACE:logger_name} %{GREEDYDATA:message}",\r\n "[log][file][path]", ".+\\Company.Software.Service.%{DATA}\.%{WORD:Software_servicename}\..+\.log" ]\r\n break_on_match => false\r\n overwrite => [ "message" ]\r\n }\r\n date {\r\n match => ["log_date", "yy-MM-dd HH:mm:ss.SSS"]\r\n }\r\n\t\t \r\n}\r\n\r\n", :backtrace=>["C:/logstash/logstash-core/lib/logstash/compiler.rb:32:in compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:187:in initialize'", "org/logstash/execution/JavaBasePipelineExt.java:72:in initialize'", "C:/logstash/logstash-core/lib/logstash/java_pipeline.rb:47:in initialize'", "C:/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52:in execute'", "C:/logstash/logstash-core/lib/logstash/agent.rb:391:in block in converge_state'"]}
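Reading the error: the parser reaches the end of the config shown in the message while still expecting a closing `}`, i.e. the `filter {` block is never closed (only the `if` block is). Separately, in the config posted above, `tag_on_failure =>` is left without a value, and in Logstash that option takes an array; that would also fail to parse. A hedged sketch of the service-name grok with both issues addressed (the failure tag name is an assumption):

```
filter {
  if [log_type] == "Software_Service_log_files" {
    grok {
      match => { "[log][file][path]" => ".+\\Company.Software.Service.%{DATA}\.%{WORD:Software_servicename}\..+\.log" }
      # tag_on_failure takes an array; the tag name here is illustrative
      tag_on_failure => [ "_servicename_parse_failure" ]
    }
  }   # closes the if block
}     # closes the filter block
```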

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.