Multiline plugin for logstash not working for gelf input


(Sam Flint) #1

I have configured my logstash inputs as:

input {
  stdin { }
  gelf {
    host => "0.0.0.0"
    port => 12201
    codec => multiline {
      pattern => "^\s"
      what => "previous"
    }
  }

  udp {
    codec => json
    port => 5001
  }

  tcp {
    port => 5000
    codec => json
  }

  beats {
    port => 5044
  }

  http {
    port => 8000
    type => "elb-healthcheck"
  }

}

I can see in the logs that the plugin is loading:

[2018-10-17T15:32:31,548][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@pattern = "^\\s"
[2018-10-17T15:32:31,549][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@what = "previous"
[2018-10-17T15:32:31,549][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@id = "7d130818-b874-458a-ae2b-95b4ed9b803e"
[2018-10-17T15:32:31,549][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@enable_metric = true
[2018-10-17T15:32:31,549][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@negate = false
[2018-10-17T15:32:31,550][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@patterns_dir = []
[2018-10-17T15:32:31,550][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@charset = "UTF-8"
[2018-10-17T15:32:31,550][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@multiline_tag = "multiline"
[2018-10-17T15:32:31,551][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@max_lines = 500
[2018-10-17T15:32:31,551][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@max_bytes = 10485760
....
....
....
[2018-10-17T15:32:31,786][DEBUG][logstash.codecs.multiline] Grok loading patterns from file {:path=>"/opt/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-patterns-core-4.1.2/patterns/maven"}
[2018-10-17T15:32:31,786][DEBUG][logstash.codecs.multiline] Grok loading patterns from file {:path=>"/opt/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-patterns-core-4.1.2/patterns/squid"}
[2018-10-17T15:32:31,804][TRACE][logstash.codecs.multiline] Registered multiline plugin {:type=>nil, :config=>{"pattern"=>"^\\s", "what"=>"previous", "id"=>"7d130818-b874-458a-ae2b-95b4ed9b803e", "enable_metric"=>true, "negate"=>false, "patterns_dir"=>[], "charset"=>"UTF-8", "multiline_tag"=>"multiline", "max_lines"=>500, "max_bytes"=>10485760}}

But I am still not getting multiline combining with Java logs that follow this pattern. See the Logstash logs below:

[2018-10-17T15:34:20,357][DEBUG][logstash.pipeline        ] output received {"event"=>{"version"=>"1.1", "message"=>"2018-10-17 19:34:20.170  INFO 1 --- [           main] o.apache.catalina.core.StandardService   : Starting service [Tomcat]", "container_name"=>"ecs-sand-accounting-gl-service-18-sand-accounting-gl-service-ea8dd4c1be9ad2a7dd01", "image_name"=>"038131160342.dkr.ecr.us-east-1.amazonaws.com/accounting-gl-service:efffa2b850182bea02e4b4a2b80c55818245ed98", "level"=>6, "container_id"=>"f4c7500a7920ba28e09c0050557c559cd53f424279db92c3e81b99218dc5918e", "tag"=>"sand-accounting-gl-service", "@version"=>"1", "created"=>"2018-10-17T19:33:51.747006428Z", "command"=>"java -jar accounting-gl-api.jar", "host"=>"ip-10-90-66-54", "@timestamp"=>2018-10-17T19:34:20.171Z, "image_id"=>"sha256:37659d83db2c18a0e51052036340456c41432e513b0f3463b0a4624af6a3455d", "source_host"=>"10.90.66.54"}}
[2018-10-17T15:34:20,357][DEBUG][logstash.pipeline        ] output received {"event"=>{"version"=>"1.1", "container_name"=>"ecs-sand-multitenant-microservice-1-sand-multitenant-microservice-92e4f092d9eef8996200", "image_name"=>"038131160342.dkr.ecr.us-east-1.amazonaws.com/multitenant-microservice:12584d13310d3ff836ff4f37842d1556ea2a98a6", "level"=>6, "short_message"=>"", "container_id"=>"4c0be547f8a7724ccab95ab2f64cf09a6ab3f75ce5f93adedadba3119b45ac72", "tag"=>"sand-multitenant-microservice", "@version"=>"1", "created"=>"2018-10-17T19:30:10.123095574Z", "command"=>"java -jar multitenant-microservice.jar", "host"=>"ip-10-90-66-168", "@timestamp"=>2018-10-17T19:34:20.243Z, "image_id"=>"sha256:12f59bfb89234141d9b0b1e583489563c8644e63b145f0a266a9476a3ddb1f90", "source_host"=>"10.90.66.168"}}
[2018-10-17T15:34:20,358][DEBUG][logstash.pipeline        ] output received {"event"=>{"version"=>"1.1", "message"=>" /\\\\ / ___'_ __ _ _(_)_ __  __ _ \\ \\ \\ \\", "container_name"=>"ecs-sand-multitenant-microservice-1-sand-multitenant-microservice-92e4f092d9eef8996200", "image_name"=>"038131160342.dkr.ecr.us-east-1.amazonaws.com/multitenant-microservice:12584d13310d3ff836ff4f37842d1556ea2a98a6", "level"=>6, "container_id"=>"4c0be547f8a7724ccab95ab2f64cf09a6ab3f75ce5f93adedadba3119b45ac72", "tag"=>"sand-multitenant-microservice", "@version"=>"1", "created"=>"2018-10-17T19:30:10.123095574Z", "command"=>"java -jar multitenant-microservice.jar", "host"=>"ip-10-90-66-168", "@timestamp"=>2018-10-17T19:34:20.244Z, "image_id"=>"sha256:12f59bfb89234141d9b0b1e583489563c8644e63b145f0a266a9476a3ddb1f90", "source_host"=>"10.90.66.168"}}
[2018-10-17T15:34:20,358][DEBUG][logstash.pipeline        ] output received {"event"=>{"version"=>"1.1", "message"=>" \\\\/  ___)| |_)| | | | | || (_| |  ) ) ) )", "container_name"=>"ecs-sand-multitenant-microservice-1-sand-multitenant-microservice-92e4f092d9eef8996200", "image_name"=>"038131160342.dkr.ecr.us-east-1.amazonaws.com/multitenant-microservice:12584d13310d3ff836ff4f37842d1556ea2a98a6", "level"=>6, "container_id"=>"4c0be547f8a7724ccab95ab2f64cf09a6ab3f75ce5f93adedadba3119b45ac72", "tag"=>"sand-multitenant-microservice", "@version"=>"1", "created"=>"2018-10-17T19:30:10.123095574Z", "command"=>"java -jar multitenant-microservice.jar", "host"=>"ip-10-90-66-168", "@timestamp"=>2018-10-17T19:34:20.246Z, "image_id"=>"sha256:12f59bfb89234141d9b0b1e583489563c8644e63b145f0a266a9476a3ddb1f90", "source_host"=>"10.90.66.168"}}
[2018-10-17T15:34:20,358][DEBUG][logstash.pipeline        ] output received {"event"=>{"version"=>"1.1", "message"=>" =========|_|==============|___/=/_/_/_/", "container_name"=>"ecs-sand-multitenant-microservice-1-sand-multitenant-microservice-92e4f092d9eef8996200", "image_name"=>"038131160342.dkr.ecr.us-east-1.amazonaws.com/multitenant-microservice:12584d13310d3ff836ff4f37842d1556ea2a98a6", "level"=>6, "container_id"=>"4c0be547f8a7724ccab95ab2f64cf09a6ab3f75ce5f93adedadba3119b45ac72", "tag"=>"sand-multitenant-microservice", "@version"=>"1", "created"=>"2018-10-17T19:30:10.123095574Z", "command"=>"java -jar multitenant-microservice.jar", "host"=>"ip-10-90-66-168", "@timestamp"=>2018-10-17T19:34:20.246Z, "image_id"=>"sha256:12f59bfb89234141d9b0b1e583489563c8644e63b145f0a266a9476a3ddb1f90", "source_host"=>"10.90.66.168"}}
[2018-10-17T15:34:20,390][DEBUG][logstash.pipeline        ] filter received {"event"=>{"version"=>"1.1", "message"=>"2018-10-17 19:34:20.130  INFO 1 --- [

I believe the above should be combined into one INFO event in the output.

Thanks for the help!!!!


(Sam Flint) #2

@Christian_Dahlqvist, do you have any idea on this? I assume I have configured it correctly. I would love to take advantage of the multiline plugin with our gelf logs.


(Christian Dahlqvist) #3

What do the logs you are receiving look like? What does Logstash generate?


(Sam Flint) #4

Those are posted in the thread, or at least part of them. You can see the output and message in the "logstash logs" above. These are basic Java logs on STDOUT and STDERR from the application, pushed to Logstash as gelf.

[2018-10-17T15:34:20,357][DEBUG][logstash.pipeline        ] output received {"event"=>{"version"=>"1.1", "container_name"=>"ecs-sand-multitenant-microservice-1-sand-multitenant-microservice-92e4f092d9eef8996200", "image_name"=>"038131160342.dkr.ecr.us-east-1.amazonaws.com/multitenant-microservice:12584d13310d3ff836ff4f37842d1556ea2a98a6", "level"=>6, "short_message"=>"", "container_id"=>"4c0be547f8a7724ccab95ab2f64cf09a6ab3f75ce5f93adedadba3119b45ac72", "tag"=>"sand-multitenant-microservice", "@version"=>"1", "created"=>"2018-10-17T19:30:10.123095574Z", "command"=>"java -jar multitenant-microservice.jar", "host"=>"ip-10-90-66-168", "@timestamp"=>2018-10-17T19:34:20.243Z, "image_id"=>"sha256:12f59bfb89234141d9b0b1e583489563c8644e63b145f0a266a9476a3ddb1f90", "source_host"=>"10.90.66.168"}}
[2018-10-17T15:34:20,358][DEBUG][logstash.pipeline        ] output received {"event"=>{"version"=>"1.1", "message"=>" /\\\\ / ___'_ __ _ _(_)_ __  __ _ \\ \\ \\ \\", "container_name"=>"ecs-sand-multitenant-microservice-1-sand-multitenant-microservice-92e4f092d9eef8996200", "image_name"=>"038131160342.dkr.ecr.us-east-1.amazonaws.com/multitenant-microservice:12584d13310d3ff836ff4f37842d1556ea2a98a6", "level"=>6, "container_id"=>"4c0be547f8a7724ccab95ab2f64cf09a6ab3f75ce5f93adedadba3119b45ac72", "tag"=>"sand-multitenant-microservice", "@version"=>"1", "created"=>"2018-10-17T19:30:10.123095574Z", "command"=>"java -jar multitenant-microservice.jar", "host"=>"ip-10-90-66-168", "@timestamp"=>2018-10-17T19:34:20.244Z, "image_id"=>"sha256:12f59bfb89234141d9b0b1e583489563c8644e63b145f0a266a9476a3ddb1f90", "source_host"=>"10.90.66.168"}}
[2018-10-17T15:34:20,358][DEBUG][logstash.pipeline        ] output received {"event"=>{"version"=>"1.1", "message"=>" \\\\/  ___)| |_)| | | | | || (_| |  ) ) ) )", "container_name"=>"ecs-sand-multitenant-microservice-1-sand-multitenant-microservice-92e4f092d9eef8996200", "image_name"=>"038131160342.dkr.ecr.us-east-1.amazonaws.com/multitenant-microservice:12584d13310d3ff836ff4f37842d1556ea2a98a6", "level"=>6, "container_id"=>"4c0be547f8a7724ccab95ab2f64cf09a6ab3f75ce5f93adedadba3119b45ac72", "tag"=>"sand-multitenant-microservice", "@version"=>"1", "created"=>"2018-10-17T19:30:10.123095574Z", "command"=>"java -jar multitenant-microservice.jar", "host"=>"ip-10-90-66-168", "@timestamp"=>2018-10-17T19:34:20.246Z, "image_id"=>"sha256:12f59bfb89234141d9b0b1e583489563c8644e63b145f0a266a9476a3ddb1f90", "source_host"=>"10.90.66.168"}}
[2018-10-17T15:34:20,358][DEBUG][logstash.pipeline        ] output received {"event"=>{"version"=>"1.1", "message"=>" =========|_|==============|___/=/_/_/_/", "container_name"=>"ecs-sand-multitenant-microservice-1-sand-multitenant-microservice-92e4f092d9eef8996200", "image_name"=>"038131160342.dkr.ecr.us-east-1.amazonaws.com/multitenant-microservice:12584d13310d3ff836ff4f37842d1556ea2a98a6", "level"=>6, "container_id"=>"4c0be547f8a7724ccab95ab2f64cf09a6ab3f75ce5f93adedadba3119b45ac72", "tag"=>"sand-multitenant-microservice", "@version"=>"1", "created"=>"2018-10-17T19:30:10.123095574Z", "command"=>"java -jar multitenant-microservice.jar", "host"=>"ip-10-90-66-168", "@timestamp"=>2018-10-17T19:34:20.246Z, "image_id"=>"sha256:12f59bfb89234141d9b0b1e583489563c8644e63b145f0a266a9476a3ddb1f90", "source_host"=>"10.90.66.168"}}
[2018-10-17T15:34:20,390][DEBUG][logstash.pipeline        ] filter received {"event"=>{"version"=>"1.1", "message"=>"2018-10-17 19:34:20.130  INFO 1 --- [


I am assuming these should be combined, but Logstash is emitting them as separate outputs?

(Christian Dahlqvist) #5

I have not used the gelf input plugin, so it would help to see what it generates, e.g. by sending it directly to a stdout output plugin with a rubydebug codec. If it actually generates a full event it may be that it will not work properly with the multiline codec. Do you know if data is sent to it via UDP or TCP?
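A minimal way to do that (a sketch, using the standard stdout output plugin) is to add an output section like this, which prints each event as a Ruby-style hash so you can see exactly what the input produces:

```
output {
  stdout { codec => rubydebug }
}
```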


(Sam Flint) #6

GELF is over UDP. I did what you said and have looked at the output. The multiline codec is actually working, but the issue is that the pattern only matches certain use cases.

Here is one use case, with "\t":

{
           "command" => "java -jar hello-world-api.jar",
           "message" => "2018-10-18 16:50:19.162  INFO 1 --- [           main] com.zaxxer.hikari.HikariDataSource       : HikariPool-1 - Starting...",
          "@version" => "1",
       "source_host" => "10.90.66.26",
    "container_name" => "ecs-sand-hello-world-service-16-sand-hello-world-service-f8d0d082d5d9ccc55400",
          "image_id" => "sha256:78fd2409069aee1892205596f15b2ab383eec8fdc9922722c06d46425986b609",
              ....
}
{
           "command" => "java -jar hello-world-api.jar",
           "message" => "\tat org.springframework.jdbc.datasource.DataSourceUtils.doGetConnection(DataSourceUtils.java:115) [spring-jdbc-5.0.4.RELEASE.jar!/:5.0.4.RELEASE]",
        ...
}
{
           "command" => "java -jar hello-world-api.jar",
           "message" => "\tat org.springframework.jdbc.support.JdbcUtils.extractDatabaseMetaData(JdbcUtils.java:318) [spring-jdbc-5.0.4.RELEASE.jar!/:5.0.4.RELEASE]",
          "@version" => "1",
       "source_host" => "10.90.66.26",
    "container_name" => "ecs-sand-hello-world-service-16-sand-hello-world-service-f8d0d082d5d9ccc55400",
          "image_id" => "sha256:78fd2409069aee1892205596f15b2ab383eec8fdc9922722c06d46425986b609",
              "host" => "ip-10-90-66-26",
           "version" => "1.1",
             "level" => 6,
        "@timestamp" => 2018-10-18T16:50:19.296Z,
           "created" => "2018-10-18T16:50:02.686667785Z",
      "container_id" => "1e1f6e91bd588833f3c28d1e450f57f40e8e079790678ee396469dbb5cd4a11e",
               "tag" => "sand-hello-world-service",
        "image_name" => "038131160342.dkr.ecr.us-east-1.amazonaws.com/hello-world-api:fe34084dd9b01d0ef2f7dcac5e16dbe6eb6dacaf"
}

Second use case is

{
           "command" => "java -jar hello-world-api.jar",
           "message" => "2018-10-18 16:50:18.904  INFO 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean   : Mapping filter: 'requestContextFilter' to: [/*]",
          "@version" => "1",
       "source_host" => "10.90.66.26",
    "container_name" => "ecs-sand-hello-world-service-16-sand-hello-world-service-f8d0d082d5d9ccc55400",
          "image_id" => "sha256:78fd2409069aee1892205596f15b2ab383eec8fdc9922722c06d46425986b609",
              "host" => "ip-10-90-66-26",
           "version" => "1.1",
             "level" => 6,
        "@timestamp" => 2018-10-18T16:50:18.904Z,
           "created" => "2018-10-18T16:50:02.686667785Z",
      "container_id" => "1e1f6e91bd588833f3c28d1e450f57f40e8e079790678ee396469dbb5cd4a11e",
               "tag" => "sand-hello-world-service",
        "image_name" => "038131160342.dkr.ecr.us-east-1.amazonaws.com/hello-world-api:fe34084dd9b01d0ef2f7dcac5e16dbe6eb6dacaf"
}
{
           "command" => "java -jar hello-world-api.jar",
           "message" => "2018-10-18 16:50:18.904  INFO 1 --- [ost-startStop-1] o.s.b.w.servlet.FilterRegistrationBean   : Mapping filter: 'webMvcMetricsFilter' to: [/*]",
          "@version" => "1",
       "source_host" => "10.90.66.26",
    "container_name" => "ecs-sand-hello-world-service-16-sand-hello-world-service-f8d0d082d5d9ccc55400",
          "image_id" => "sha256:78fd2409069aee1892205596f15b2ab383eec8fdc9922722c06d46425986b609",
              "host" => "ip-10-90-66-26",
           "version" => "1.1",
             "level" => 6,
        "@timestamp" => 2018-10-18T16:50:18.933Z,
           "created" => "2018-10-18T16:50:02.686667785Z",
      "container_id" => "1e1f6e91bd588833f3c28d1e450f57f40e8e079790678ee396469dbb5cd4a11e",
               "tag" => "sand-hello-world-service",
        "image_name" => "038131160342.dkr.ecr.us-east-1.amazonaws.com/hello-world-api:fe34084dd9b01d0ef2f7dcac5e16dbe6eb6dacaf"
}

The last use case is what the multiline pattern is catching, I believe:

{
           "command" => "java -jar hello-world-api.jar",
           "message" => "2018-10-18 16:50:14.872  INFO 1 --- [           main] o.s.b.f.s.DefaultListableBeanFactory     : Overriding bean definition for bean 'environmentWebEndpointExtension' with a different definition: replacing [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=org.springframework.boot.actuate.autoconfigure.env.EnvironmentEndpointAutoConfiguration; factoryMethodName=environmentWebEndpointExtension; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/springframework/boot/actuate/autoconfigure/env/EnvironmentEndpointAutoConfiguration.class]] with [Root bean: class [null]; scope=; abstract=false; lazyInit=false; autowireMode=3; dependencyCheck=0; autowireCandidate=true; primary=false; factoryBeanName=org.springframework.cloud.autoconfigure.LifecycleMvcEndpointAutoConfiguration$EndpointConfiguration; factoryMethodName=environmentWebEndpointExtension; initMethodName=null; destroyMethodName=(inferred); defined in class path resource [org/springframework/cloud/autoconfigure/LifecycleMvcEndpointAutoConfiguration$EndpointConfiguration.class]]",
          "@version" => "1",
       "source_host" => "10.90.66.26",
    "container_name" => "ecs-sand-hello-world-service-16-sand-hello-world-service-f8d0d082d5d9ccc55400",
          "image_id" => "sha256:78fd2409069aee1892205596f15b2ab383eec8fdc9922722c06d46425986b609",
              "host" => "ip-10-90-66-26",
           "version" => "1.1",
             "level" => 6,
        "@timestamp" => 2018-10-18T16:50:14.872Z,
           "created" => "2018-10-18T16:50:02.686667785Z",
      "container_id" => "1e1f6e91bd588833f3c28d1e450f57f40e8e079790678ee396469dbb5cd4a11e",
               "tag" => "sand-hello-world-service",
        "image_name" => "038131160342.dkr.ecr.us-east-1.amazonaws.com/hello-world-api:fe34084dd9b01d0ef2f7dcac5e16dbe6eb6dacaf"
}

(Sam Flint) #7

Going to remove the multiline codec and see if the output changes. If I have multiple patterns, is there a way to accommodate all of them in the same pipeline with the multiline codec? Can you have multiple patterns?
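One option I have seen suggested for Java logs is to invert the match instead of listing every continuation shape: treat any line that does not start with a timestamp as belonging to the previous event. A sketch (assuming the multiline codec's grok-pattern support, e.g. %{TIMESTAMP_ISO8601}):

```
codec => multiline {
  pattern => "^%{TIMESTAMP_ISO8601}"
  negate => true
  what => "previous"
}
```

That would cover the \t stack-trace lines, the banner lines, and the long single-line messages with one pattern, since only timestamp-prefixed lines start a new event.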


(Christian Dahlqvist) #8

Multiline for data coming over UDP does not really make sense, as UDP does not guarantee ordering or even delivery. I would recommend trying to get messages assembled correctly at the source before they reach Logstash.


(Sam Flint) #9

Thanks. I completely agree on getting it assembled before it reaches Logstash, but like most companies, ours puts everything on the DevOps team so the developers don't have to actually work. I will try putting in the \t for the pattern match, which I think will get us 70% there. I will let the developers know they need to pick up some of this at the source.

Thanks so much for your time and help.


(Sam Flint) #10

@Christian_Dahlqvist
Okay, I updated the pattern and it seems not to be working?

input {
  stdin { }
  gelf {
    host => "0.0.0.0"
    port => 12201
    codec => multiline {
      pattern => "^\t"
      what => "previous"
    }
  }
....
....
output {
  stdout { codec => rubydebug }
}

This is what I am seeing in the output. Shouldn't these be combined into one event? They all start with \t.

{
           "version" => "1.1",
        "@timestamp" => 2018-10-18T17:33:53.767Z,
           "created" => "2018-10-18T17:32:54.813785142Z",
       "source_host" => "10.90.66.151",
           "message" => "java.net.UnknownHostException: practitioner.db.sand.corvesta.net",
             "level" => 6,
          "@version" => "1",
      "container_id" => "f928a2d6801e7e71bdf230ed3fb59b681253256de6fd47f8af72c7e91fa8fa2e",
              "host" => "ip-10-90-66-151",
           "command" => "java -jar practitioner-svc.jar",
               "tag" => "sand-practitioner-service",
    "container_name" => "ecs-sand-practitioner-service-22-sand-practitioner-service-d288e59bd8b997863000",
        "image_name" => "038131160342.dkr.ecr.us-east-1.amazonaws.com/practitioner-service:f4ea6543653e99cac9b896406962801321cb5385",
          "image_id" => "sha256:5d771b6d82b518f9de3555793fe67c9fc714e74f09b290121259b4b086682b16"
}
{
           "version" => "1.1",
        "@timestamp" => 2018-10-18T17:33:53.767Z,
           "created" => "2018-10-18T17:32:54.813785142Z",
       "source_host" => "10.90.66.151",
           "message" => "\tat java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392) ~[na:1.8.0_111-internal]",
             "level" => 6,
          "@version" => "1",
      "container_id" => "f928a2d6801e7e71bdf230ed3fb59b681253256de6fd47f8af72c7e91fa8fa2e",
              "host" => "ip-10-90-66-151",
           "command" => "java -jar practitioner-svc.jar",
               "tag" => "sand-practitioner-service",
    "container_name" => "ecs-sand-practitioner-service-22-sand-practitioner-service-d288e59bd8b997863000",
        "image_name" => "038131160342.dkr.ecr.us-east-1.amazonaws.com/practitioner-service:f4ea6543653e99cac9b896406962801321cb5385",
          "image_id" => "sha256:5d771b6d82b518f9de3555793fe67c9fc714e74f09b290121259b4b086682b16"
}
{
           "version" => "1.1",
        "@timestamp" => 2018-10-18T17:33:53.769Z,
           "created" => "2018-10-18T17:32:54.813785142Z",
       "source_host" => "10.90.66.151",
           "message" => "\tat org.postgresql.core.PGStream.<init>(PGStream.java:69) ~[postgresql-42.2.1.jar!/:42.2.1]",
             "level" => 6,
          "@version" => "1",
      "container_id" => "f928a2d6801e7e71bdf230ed3fb59b681253256de6fd47f8af72c7e91fa8fa2e",
              "host" => "ip-10-90-66-151",
           "command" => "java -jar practitioner-svc.jar",
               "tag" => "sand-practitioner-service",
    "container_name" => "ecs-sand-practitioner-service-22-sand-practitioner-service-d288e59bd8b997863000",
        "image_name" => "038131160342.dkr.ecr.us-east-1.amazonaws.com/practitioner-service:f4ea6543653e99cac9b896406962801321cb5385",
          "image_id" => "sha256:5d771b6d82b518f9de3555793fe67c9fc714e74f09b290121259b4b086682b16"
}
{
           "version" => "1.1",
        "@timestamp" => 2018-10-18T17:33:53.769Z,
           "created" => "2018-10-18T17:32:54.813785142Z",
       "source_host" => "10.90.66.151",
           "message" => "\tat org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:49) [postgresql-42.2.1.jar!/:42.2.1]",
             "level" => 6,
          "@version" => "1",
      "container_id" => "f928a2d6801e7e71bdf230ed3fb59b681253256de6fd47f8af72c7e91fa8fa2e",
              "host" => "ip-10-90-66-151",
           "command" => "java -jar practitioner-svc.jar",
               "tag" => "sand-practitioner-service",
    "container_name" => "ecs-sand-practitioner-service-22-sand-practitioner-service-d288e59bd8b997863000",
        "image_name" => "038131160342.dkr.ecr.us-east-1.amazonaws.com/practitioner-service:f4ea6543653e99cac9b896406962801321cb5385",
          "image_id" => "sha256:5d771b6d82b518f9de3555793fe67c9fc714e74f09b290121259b4b086682b16"
}
{
           "version" => "1.1",
        "@timestamp" => 2018-10-18T17:33:53.769Z,
           "created" => "2018-10-18T17:32:54.813785142Z",
       "source_host" => "10.90.66.151",
           "message" => "\tat org.postgresql.Driver.makeConnection(Driver.java:452) [postgresql-42.2.1.jar!/:42.2.1]",
             "level" => 6,
          "@version" => "1",
      "container_id" => "f928a2d6801e7e71bdf230ed3fb59b681253256de6fd47f8af72c7e91fa8fa2e",
              "host" => "ip-10-90-66-151",
           "command" => "java -jar practitioner-svc.jar",
               "tag" => "sand-practitioner-service",
    "container_name" => "ecs-sand-practitioner-service-22-sand-practitioner-service-d288e59bd8b997863000",
        "image_name" => "038131160342.dkr.ecr.us-east-1.amazonaws.com/practitioner-service:f4ea6543653e99cac9b896406962801321cb5385",
          "image_id" => "sha256:5d771b6d82b518f9de3555793fe67c9fc714e74f09b290121259b4b086682b16"
}

(Christian Dahlqvist) #11

I do not know the internals of the multiline codec, but I do not see how you can possibly get it to work correctly with the gelf input over UDP, where there are a lot of additional fields and no streams or ordering. I would recommend either fixing this at the source or selecting a different delivery mechanism. If you have more than one process sending data, the messages will be interleaved, potentially causing all kinds of incorrect combinations.


(Sam Flint) #12

I have found something that I think may be a root cause or bug.

[2018-10-18T16:35:21,174][TRACE][logstash.codecs.multiline] Registered multiline plugin {:type=>nil, :config=>{"pattern"=>"^\\t", "what"=>"previous", "id"=>"27e36f0b-9aad-44ff-b3de-4da18659b951", "enable_metric"=>true, "negate"=>false, "patterns_dir"=>[], "charset"=>"UTF-8", "multiline_tag"=>"multiline", "max_lines"=>500, "max_bytes"=>10485760}}
[2018-10-18T16:35:21,189][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"gelf", :type=>"input", :class=>LogStash::Inputs::Gelf}
[2018-10-18T16:35:21,207][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@pattern = "^\\t"

This pattern ^\t will not match a tab. This is not what I have in the configuration.

  gelf {
    host => "0.0.0.0"
    port => 12201
    codec => multiline {
      pattern => "^\t"
      what => "previous"
    }
  }

When I remove the \ and try ^t, the pattern shows up that way:

[2018-10-18T16:29:20,653][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"multiline", :type=>"codec", :class=>LogStash::Codecs::Multiline}
[2018-10-18T16:29:20,671][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@pattern = "^t"

Is there a reason why there is an extra \ being added to the pattern?
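(Side note: the extra \ may just be an artifact of how the value is logged rather than a change to the pattern. The DEBUG line appears to print the config value with Ruby's String#inspect, which doubles backslashes, so a pattern stored as the two characters \ and t displays as "^\\t" yet still compiles to a regex matching a tab. A quick sketch in Ruby, with a made-up stack-trace line:)

```ruby
# A single-quoted Ruby string '^\t' holds three characters: ^, backslash, t --
# the same thing Logstash stores for pattern => "^\t".
pattern = '^\t'

# String#inspect escapes the backslash, producing the doubled form
# that shows up in the DEBUG log.
puts pattern.inspect            # prints "^\\t"

# The regex engine interprets \t as a literal tab, so continuation
# lines that begin with a tab still match.
re = Regexp.new(pattern)
puts re.match?("\tat org.example.Foo.bar(Foo.java:42)")   # prints true
puts re.match?("2018-10-18 16:50:19.162  INFO 1 --- ...") # prints false
```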


(system) #13

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.