The error message stack trace is being printed in separate rows instead of a single row

Hi Elastic Discuss Community,

In Kibana, when an error log message comes in, each line of the error message is printed as a separate document. I want the whole stack trace of the error message to end up in a single field. Can anyone help me with this? Below is my configuration; the flow is Logstash > Elasticsearch > Kibana.

This is my error log message:

```
[${sys:hostname}] [2024-05-02 10:37:05,071] [ERROR] [215644, 56545, ghhf-hsdhh-dhdtgh-hdhfdhfh-554] cfgcdgdfgjfi.djsvjdvsj [com.fdgdrg.dsf.dsd.mgmt.hgtt.imphhl.dhdrthrehh.verify(dhdedhdf.java:281)] - Unknown message
com.dgdgd.dhf.hdfhdh.sdrthtrh.gdhnyhbhrt.jnffngg: fghgfhhhgfh nothtt sdhh tht hgstht dhdh : dfhd , hdddfhdfhhhftionId : 44546 in edit copy table
        at com.dhhfhfd.hdhd.hdfhdf.fhdfh.jjfgf.impl.gjgfjgfjgjgj.ydyuyjuuyku(fjgfdjytkjtdtyktyytdkytdkytkyytjytjdyj.java:124) ~[jyjdyjyj-15461-dyjydjydjyd.jar!/:1.41556-yjdyjydrjyj]
        at com.ihdhfhsg.djdjdgj.fjfgjfgjfgj.fjjfjfgjgf.fjffgjgfjgf.impl.fjfjfgjfgjkfgdgtgdjg.jgfjgfjfgjfgjgf(jdfjdfgjsjgjgfsd.java:255) [fjfgjfgdjgfmt-1454.2-jgfjdfjgfj.jar!/:155.gfjfdjgf]
        at com.dhdh.htht.sjhrdyrj.jtjwttjngvfg.shtrstrjhthst.htfhdffhdf(dhdhdfhshthrtstre.java:191) [rtrhstrhtht-565.2-dhdhd.jar!/:14865415-SNAPSHOT]
        at sun.jyyyjntf.shdhdfdf.invoke0(hsdhfdds Method) ~[?:1.56256fh.java:62) ~[?:1746542]
        at sun.tsyttyytgh.dhdshsdrhtherts.sdyhthtsstytyt(thusthstytgfsff.java:43) ~[?:154545442]
        at java.dyytuj.yduyyj.htrtshtrhtrs.tthths(thsthytrht.java:498) ~[?:1745564642]
        at org.dyjyhfxgxhdrtj.rdjdtrdtrj.fgjtrdjdrttr.sujdyrjrdtjdtrpport.drjddrjdrjdttr.fdddthbvr(strusrgred.java:205) [.18.ja18]
        at org.srtyhedrehb.btitity.nfuyitif.brftyintyi.bvtyiyibhtiyt.bytiittityi(bdirybrirdgfjyjd.java:150) [ytejdtyjtj-web-855498549845.3.18]
```
This is my abc.conf file that I have created in Logstash:
```
input {
  file {
    path => "/home/test/abc.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  dissect {
    mapping => {
      "message" => "[%{[system][hostname]}] [%{[log][date]} %{[log][time]}] [%{[log][level]}] [%{[log][id]}, %{[system][id]}, %{[system][tname]}] %{[log][package][notation]} %{[log][package][name]} %{[log][msg]}"
    }
  }

  mutate {
    add_field => { "service" => "abc" }
    add_field => { "application" => "def" }
    add_field => { "environment" => "ghi" }
    add_field => { "server" => "klm" }
  }
}

output {
  elasticsearch {
    user => "vdsgfsdsdg"
    password => "gsagergergrargtg"
    hosts => ["https://0.0.0.0:9200"]
    data_stream => "true"
    data_stream_type => "logs"
    data_stream_dataset => "testing"
    data_stream_namespace => "test"
  }
}
```

NOTE EDITED BY MOD : Please format your code by putting ``` before and after the code block

Looks like you need a multiline input codec; see here.

You will probably need it to match a pattern like:

```
[${sys:hostname}] [2024-05-02 10:37:05,071]
```
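A minimal sketch of what that could look like, assuming the log file path from your config and that every new event starts with a `[host] [timestamp]` prefix (the bracket-matching pattern here is illustrative, not tested against your real logs):

```
input {
  file {
    path => "/home/test/abc.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => multiline {
      # Lines that do NOT start with the "[host] [timestamp]" prefix are
      # treated as continuations (stack trace lines) of the previous event.
      pattern => "^\[[^\]]+\] \[\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}\]"
      negate => true
      what => "previous"
    }
  }
}
```

Matching the hostname generically with `\[[^\]]+\]` also sidesteps having to escape whatever literal characters appear in that first bracket.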

Hi @stephenb,

As you suggested, I have tried the multiline input codec, but the error message stack trace is still being printed in separate rows instead of a single row. Below are the code and its output.

Can you help me out with this issue?

```
input {
  file {
    path => "/home/test/abc.log"
    start_position => "beginning"
    codec => multiline {
      pattern => "^ \[${sys:hostname}\] \[\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}\]"
      negate => true
      what => "previous"
    }
  }
}

filter {
  dissect {
    mapping => {
      "message" => "[%{[system][hostname]}] [%{[log][date]} %{[log][time]}] [%{[log][level]}] [%{[log][id]}, %{[system][id]}, %{[system][tname]}] %{[log][package][notation]} %{[log][package][name]} %{[log][msg]}"
    }
  }

  mutate {
    add_field => { "service" => "abc" }
    add_field => { "application" => "def" }
    add_field => { "environment" => "ghi" }
    add_field => { "server" => "klm" }
  }
}

output {
  elasticsearch {
    user => "vdsgfsdsdg"
    password => "gsagergergrargtg"
    hosts => ["https://0.0.0.0:9200"]
    data_stream => "true"
    data_stream_type => "logs"
    data_stream_dataset => "testing"
    data_stream_namespace => "test"
  }
}
```


Well, you will need to provide a better example.

```
[${sys:hostname}]
```

Is it really the hostname, or did you sanitize it? It is pattern matching...

So I think you need to do a little regexing...

Is it really `$`, etc.?
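To illustrate the point: `$`, `{`, and `}` are regex metacharacters, so a literal `${sys:hostname}` prefix has to be escaped (or matched generically) before the multiline pattern can work. A small Python sketch with made-up sample lines (the class names are hypothetical):

```python
import re

# Hypothetical first line of an event and a stack-trace continuation line.
event_start = "[${sys:hostname}] [2024-05-02 10:37:05,071] [ERROR] ..."
continuation = "        at com.example.Foo.bar(Foo.java:124)"

# Unescaped: "$" anchors end-of-string, so this never matches the prefix.
naive = re.compile(r"^\[${sys:hostname}\]")
assert naive.match(event_start) is None

# Escaped literal prefix (re.escape handles $, {, and } for us).
escaped = re.compile(r"^\[" + re.escape("${sys:hostname}") + r"\]")
assert escaped.match(event_start)
assert escaped.match(continuation) is None

# Generic alternative: match any "[something] [timestamp]" prefix.
generic = re.compile(r"^\[[^\]]+\] \[\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}\]")
assert generic.match(event_start)
assert generic.match(continuation) is None
```

Note also that in a Logstash config string, `${...}` triggers environment-variable substitution, which is another reason to escape or avoid that literal in the pattern.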

@stephenb Thank you for the help, it worked. I had sanitized the hostname, and after fixing the regex it's working fine.
