Logstash/Kibana breaks my log line into multiple events/documents

It's strange: Logstash/Kibana is breaking my single log line into multiple events/documents, whereas it should be a single event; otherwise it defeats the purpose of putting these logs into ELK at all.

Any idea how this can be fixed?

How could we possibly help without knowing what your configuration and log entries look like?

Sorry about the incomplete info:

Here is the Logstash config:

input {
  beats {
    port => 9997
  }
}

filter {
  # Route on the per-prospector field set in filebeat.yml ([fields][source]).
  if [fields][source] == "monic_tomcat_perf" {
    grok {
      match => { "message" => "%{MONTHDAY} %{MONTH} %{YEAR} %{TIME},%{NUMBER:duration} %{WORD:loglevel}  %{WORD:Activity}   \[\{%{DATA:foo1}\}\]:(.*) execution time: %{NUMBER:executionTime:float} ms" }
    }
    # Split the key=value pairs captured into foo1 out into individual fields.
    kv {
      source      => "foo1"
      field_split => ", "
    }
    mutate {
      remove_field => [ "foo1" ]
    }
  }
  else if [fields][source] == "monic_web_log" {
    grok {
      match => { "message" => "(%{COMMONAPACHELOG})? Client-Correlation-Id=%{NOTSPACE:id} ResponseSecs=%{NUMBER:responsesecs} ResponseMicros=%{NUMBER:responseMicros} (\"%{URI:url}\")? %{GREEDYDATA:device}" }
    }
  }
  else if [fields][source] == "monic_tomcat_app" {
    grok {
      match => { "message" => "%{MONTHDAY} %{MONTH} %{YEAR} %{TIME},%{NUMBER:duration} %{WORD:loglevel}%{SPACE}%{WORD:Activity} \[\{(%{DATA:foo1})?\}\]: %{GREEDYDATA:foo2} User=\"Associate\(%{DATA:foo3}\)\"" }
    }
    kv {
      source      => "foo1"
      field_split => ", "
    }
    kv {
      source      => "foo2"
      field_split => " "
    }
    kv {
      source      => "foo3"
      field_split => ", "
    }
  }
}

output {
  elasticsearch {
    hosts    => "localhost:9200"
    #manage_template => false
    index    => "harmonic_dev"
    user     => "elastic"
    password => "elasticpassword"
  }
  stdout { codec => rubydebug }
}
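As an aside, each grok pattern can be exercised against a single pasted line without involving Filebeat at all, via a stdin harness (a minimal sketch; substitute whichever pattern is under test):

input { stdin {} }

filter {
  grok {
    # the monic_tomcat_perf pattern from the config above
    match => { "message" => "%{MONTHDAY} %{MONTH} %{YEAR} %{TIME},%{NUMBER:duration} %{WORD:loglevel}  %{WORD:Activity}   \[\{%{DATA:foo1}\}\]:(.*) execution time: %{NUMBER:executionTime:float} ms" }
  }
}

output { stdout { codec => rubydebug } }

Run it with bin/logstash -f test.conf, paste a log line, and the rubydebug output either shows the extracted fields or tags the event with _grokparsefailure.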

This is the log line I ingested to reproduce the issue; the output I see in Kibana is truncated. In the screenshot I'm attaching, I'm referring to the first event, the one with the latest timestamp.

[%t] 08 Aug 2017 18:55:38,203 INFO HomeBaseApiConsumer [{applicationSystemCode=monicapp-app, clientIP=10.218.87.153, clusterId=Cluster-Id-NA, containerId=Container-Id-NA, correlationId=205c2806-2f97-f42f-00f5-9a43aafb9eb3, domainName=defaultDomain, hostName=ip-202-100-x.domain.com, messageId=10.202.100.34-4041d41d-75f3-4282-9aab-dd1ab17ecdf3, userId=ANONYMOUS, webAnalyticsCorrelationId=B347BC083EB9DCE4ED5005506F1F1E63|}]: KpiMetric="Cta" TransactionName="ApplicationDetail" TransactionStatus="Success" User="Associate(firstName=mike, lastName=henry, role=Consultant, email=mike@domain.com, electronicId=M422)".

Also, I don't see the fields being parsed by Logstash; it should have matched the condition else if [fields][source] == "monic_tomcat_app".

Not sure if my conditional statements are wrong. To summarise the issues:

1. Kibana truncates the output, whereas Logstash shows the full message on stdout.
2. The fields are not being parsed, and there is no grok failure error.
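One cheap way to check the second issue is to drop a throwaway tag into each branch (a sketch; the tag name is invented). An event that arrives with neither the tag nor a _grokparsefailure means no conditional matched, which points at the [fields][source] value rather than at the grok patterns:

filter {
  if [fields][source] == "monic_tomcat_app" {
    # hypothetical debug tag; remove once the routing is confirmed
    mutate { add_tag => [ "hit_tomcat_app" ] }
  }
}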

Don't post screenshots. Expand an event, switch to the JSON tab, and copy the raw text from there instead.

What does your Filebeat configuration look like? Make sure you post it as preformatted text so it doesn't get mangled.

This is the filebeat.yml:

- input_type: log
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /prod/xxx/logs/chassis_logs/xxx_chassis_tomcat_01/monic_app*.log
      fields:
      index: monic_dev
      source: monic_tomcat_app

I am only adding a few test lines into this file.

Basically, Logstash is not looking at my grok filters at all.
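For what it's worth, the conditionals test [fields][source], and that field only exists on the event when fields is a sibling of paths in the prospector (with fields_under_root left at its default of false). A corrected sketch of the prospector above:

- input_type: log
  paths:
    - /prod/xxx/logs/chassis_logs/xxx_chassis_tomcat_01/monic_app*.log
  fields:
    index: monic_dev
    source: monic_tomcat_app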

I was able to change the truncate:maxHeight setting in Kibana to 0, and I fixed the filebeat.yml file. Everything looks perfect now.
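(For anyone else landing here: truncate:maxHeight lives under Management > Advanced Settings in Kibana, and 0 disables the row-height cap so long messages render in full.)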

