Logstash output doesn't have grok fields, only event.original

I am using Logstash 8.3 and I have provided my config below. When Logstash ingests the data into Elasticsearch, it ingests each whole log line as text instead of the individual values that I defined in my grok pattern.

My Logstash config file:

input {
        file {
                path => "/path-to-logs*/access_log*"
                #type => "webserver-logs"
                sincedb_path => "/dev/null"
                start_position => "beginning"
        }
        file {
                path => "/path-to-logs*/SystemOut_20.01.28*"
                #type => "appserver-logs"
                sincedb_path => "/dev/null"
                start_position => "beginning"
                codec => multiline {
                                pattern => "^\[%{DATESTAMP}"
                                negate => true
                                what => "previous"
                }
        }
}
filter {
        if [path] =~ "access" {
                grok {
                        match => {"message" => "%{COMBINEDAPACHELOG} %{QS:jsessionid} Time=\s*%{NUMBER:time_taken} TrueIP=%{IPORHOST:true_ip} Status=%{QS:connection_status} %{IPORHOST:appserver_hostname}:%{NUMBER:appserver_port}"}
                        #overwrite => [ "message"]
                        add_field => [ "received_at", "%{@timestamp}" ]
                        add_field => [ "received_from", "%{host}" ]
                        add_field => [ "log_source", "adhoc-webserver-logs" ]
                }
        } else if [path] =~ "SystemOut" {
                grok {
                        match => {"message" => "(?m)\[%{DATESTAMP:timestamp} %{TZ:timezone}\] %{WORD:thread_id} %{WORD:partial_class_name} \s*%{WORD:log_level} %{GREEDYDATA:log_message}"}
                        #overwrite => [ "message"]
                        add_field => [ "received_at", "%{@timestamp}" ]
                        add_field => [ "received_from", "%{host}" ]
                        add_field => [ "log_source", "adhoc-appserver-logs" ]
                }
        }
        mutate {
                remove_field => ["message"]
        }
}
output {
        if "_grokparsefailure" not in [tags] {
                elasticsearch {
                        hosts => ["myelastichost:9200"]
                        sniffing => true
                        manage_template => false
                        #index => "%{log_source}-%{+YYYY.MM.dd}"
                        index => "adhoc-webserver-logs-%{+YYYY.MM.dd}"
                }
                stdout { codec => rubydebug }
        }
}

Output:

{
         "event" => {
        "original" => "127.0.0.1 - - [18/May/2021:08:51:22 +0000] \"POST /url-path-from-my-application/ HTTP/1.1\" 200 4851 \"https://mywebsite-url.com\" \"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/1.1.1.1 Safari/537.36\" \"someaplanumbericjessionid\" Time= 0 TrueIP=1.1.1.1 Status=\"+\" myserverhost:port"
    },
    "@timestamp" => 2022-08-02T11:24:33.617139Z,
      "@version" => "1",
           "log" => {
        "file" => {
            "path" => "/pathtologs/access_log"
        }
    },
          "host" => {
        "name" => "mylogstash-node1"
    }
}

If you look at the event you will see that it does not have a [path] field; it has a [log][file][path] field. The field was renamed with the introduction of ECS compatibility. Change your conditionals to reference the actual field name.
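Alternatively, if you would rather keep the pre-ECS field names throughout, you should be able to disable ECS compatibility for the pipeline instead. A minimal sketch, assuming you set it globally in logstash.yml:

# logstash.yml -- revert plugins to their classic (pre-ECS) field names
pipeline.ecs_compatibility: disabled

With that setting the file input goes back to emitting [path] and [host] as plain top-level fields, so the existing conditionals and %{host} references would work unchanged.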

Hi Badger,

I did not quite get you.

My conditionals were: if the path has the word access, one index is created, and if it has SystemOut, a different index is created.

My issue is that instead of the individual fields being ingested, the whole log entry is being ingested into Elasticsearch as a single event.original value.

You do not have a field called [path] that Logstash can test. It is called [log][file][path].

So should I change the path in my conditionals from [path] to [log][file][path]?

Yes, you should.
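In other words, something like this (a sketch showing only the changed conditionals; the grok filter bodies stay exactly as they are):

filter {
        if [log][file][path] =~ "access" {
                # same grok as before
        } else if [log][file][path] =~ "SystemOut" {
                # same grok as before
        }
}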

This worked, woohoo!!!

One last thing: when I check in Kibana, it is filtering time on @timestamp, which is the time the logs got ingested.

Do you know how I can make the timestamp from the logs the primary time field for data visualization in Kibana? Also, my log timestamp is ingested as text.

You would have to use a date filter to parse a field from the event. What do your log lines look like?
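For the access log line visible in your rubydebug output above, something along these lines might work (a sketch, assuming your grok pattern captures the Apache timestamp into a field called timestamp):

filter {
        date {
                # Parse e.g. "18/May/2021:08:51:22 +0000" into @timestamp
                match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
                target => "@timestamp"
        }
}

Once the date filter succeeds, @timestamp holds the time from the log line (as a proper date, not text), and Kibana will filter on it.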
