Only title/first row gets inserted into Elasticsearch

Hello team,
I am sending data to Elasticsearch. I have .log files which contain the data below, but only the title/first row gets inserted into Elasticsearch.

Sample log:

Server Started	Timestamp	SeverityID	EventID	Severity	Message
20161121T183241.000+0400	20161121T183241.000+0400	4	700	Information	NUMA: QVS configured to get adapt to NUMA environment

Logstash:

filter {
    if [type] == "facebook" {
        grok {
            match => {
                message => [
                    '%{NUMBER}\s*%{NUMBER}\s*%{WORD:level}\s*%{GREEDYDATA}\s*EBIL\\%{WORD:userid}',
                    '%{NUMBER}\s*%{NUMBER}\s*%{WORD:level}\s*%{GREEDYDATA:message1}',
                    '%{GREEDYDATA}\s*user\\%{WORD:userid}',
                    '%{GREEDYDATA:message1}'
                ]
            }
        }
        mutate {
            add_field => { "log" => "%{time} %{message}" }
        }
        date {
            match => ["time", "yyyyMMdd'T'HHmmss.SSS"]
            target => "@timestamp"
        }
        mutate {
            remove_field => ["message1", "message", "kafkatime"]
        }
    }
}
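Note that the `date` filter above refers to a `time` field that none of the grok patterns extract, so `@timestamp` cannot be populated from it. A sketch that captures the first timestamp of each sample line into `time` might look like the following (the grok pattern and the date format are assumptions based on the sample log at the top, not a tested config):

```
filter {
    if [type] == "facebook" {
        grok {
            # capture the first compact ISO8601 timestamp
            # (e.g. 20161121T183241.000+0400) into "time"
            match => { "message" => '^%{TIMESTAMP_ISO8601:time}\s+%{TIMESTAMP_ISO8601}\s+%{NUMBER}\s+%{NUMBER}\s+%{WORD:level}\s+%{GREEDYDATA:message1}' }
        }
        date {
            # trailing Z in the Joda format accounts for the +0400 offset
            match => ["time", "yyyyMMdd'T'HHmmss.SSSZ"]
            target => "@timestamp"
        }
    }
}
```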

Kibana data looks like:

input {
    beats {
        port => 5044
        codec => multiline {
            pattern => "(\d){4}-(\d){2}-(\d){2} (\d){2}:(\d){2}:(\d){2},(\d){3}"
            negate => true
            what => "previous"
        }
    }
}
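If the incoming lines actually begin with the compact timestamp shown in the sample log (e.g. `20161121T183241.000+0400`), the multiline pattern above will never match, and with `negate => true` every line gets folded into the previous event. A pattern matching that shape could look like this sketch (the regex is an assumption derived from the sample log, not a verified config):

```
input {
    beats {
        port => 5044
        codec => multiline {
            # lines starting with e.g. 20161121T183241.000 begin a new event
            pattern => "^\d{8}T\d{6}\.\d{3}"
            negate => true
            what => "previous"
        }
    }
}
```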
filter {
    if [type] == "catalina" {
        grok {
            match => [ "message", "(?<sourcestamp>(\d){4}-(\d){2}-(\d){2} (\d){2}:(\d){2}:(\d){2},(\d){3})" ]
        }
        date {
            match => [ "sourcestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
            timezone => "UTC"
        }
    }
}

I am sending data through a Fluentd agent to Logstash.

Your timestamps appear to be from 2016. Have you adjusted the time range picker in Kibana to include that, or are you just looking at recent events?

The same issue occurs with recent events as well.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.