Parsing logs with Logstash

I have the same problem, please kindly help me. Here is my scenario:
I am currently learning Logstash and Beats, and my first lesson is ingesting a CSV file with Filebeat into Logstash and visualizing it in Kibana on Elasticsearch. It works; the config file I use is logstash.conf.
Now I am trying to ingest a file with a grok filter, using the same config file by editing the previous config I used for the CSV ingestion, and this problem appears.
My questions are:

  1. How does the Logstash pipeline work? I mean, I have deleted the previous config, but according to the error it seems there is still a running pipeline?
  2. How do I fix this error?

Please kindly help me with my silly questions. Thank you so much.

Yes, you can remove the spaces. The function is called strip; read about what it does:

mutate { strip => [ "field_name"] }

Actually, it would be better if you opened another thread.

A pipeline is what you use if you want a config to run all the time on a schedule; the command line is what you use if you want to test it.
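For example, to check a config file for syntax errors from the command line before running it for real (the path here is illustrative):

    # Validate the pipeline config and exit without processing any events
    bin/logstash -f /etc/logstash/conf.d/test.conf --config.test_and_exit

Once the config validates, you can run it once with just -f, or hand it over to the daemon via pipelines.yml.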

You have not got the fundamentals down yet.

When you start the Logstash daemon, it reads the /etc/logstash/pipelines.yml file and runs all the configs listed in that pipelines file, and it keeps running. Basically, the Logstash daemon stays in memory and runs with all the parameters you have in pipelines.yml.

If you have that daemon running, you won't be able to run another Logstash command on the same machine.

Once you start the Logstash daemon, it reads all the configs and all the settings and puts them in memory. That means if you change your config file after starting the daemon, your changes will not take effect; you have to restart the daemon.
There is an option in logstash.yml (config.reload.automatic) that lets it pick up changes automatically, but I have not tried it.
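For reference, a minimal pipelines.yml entry might look like this (the pipeline id and path here are placeholders, not from this thread):

    # /etc/logstash/pipelines.yml
    - pipeline.id: main
      path.config: "/etc/logstash/conf.d/*.conf"

and the auto-reload setting in logstash.yml, which makes the daemon pick up config changes without a restart:

    # /etc/logstash/logstash.yml
    config.reload.automatic: true
    config.reload.interval: 3s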

I tried this, but the spaces are still there. Result below:

{
       "message" => [
        [0] "myapp.myproject.notice.student.request-time = 2019-12-13 12:37:01.4 ",
        [1] " myapp.myproject.notice.student.response-time = 2019-12-13 12:37:19.276"
    ],
         "partA" => "myapp.myproject.notice.student.request-time ",
         "partD" => " 2019-12-13 12:37:19.276",
    "@timestamp" => 2019-12-19T16:45:18.527Z,
      "@version" => "1",
         "part1" => [
        [0] "myapp.myproject.notice.student.request-time ",
        [1] " 2019-12-13 12:37:01.4 "
    ],
         "partC" => " myapp.myproject.notice.student.response-time ",
         "part2" => [
        [0] " myapp.myproject.notice.student.response-time ",
        [1] " 2019-12-13 12:37:19.276"
    ],
         "partB" => " 2019-12-13 12:37:01.4 ",
         "host" => "ip-xx-0-0-xx"
}

I added strip in the conf file below.

conf file

mutate {
    split => ["message","#"]
    add_field => {"part1" => "%{[message][0]}"}
    add_field => {"part2" => "%{[message][1]}"}
    strip => [ "part1" ]
    strip => [ "part2" ]
}

mutate {
    split => ["part1","="]
    add_field => {"partA" => "%{[part1][0]}"}
    add_field => {"partB" => "%{[part1][1]}"}
    strip => [ "partA" ]
    strip => [ "partB" ]
}

mutate {
    split => ["part2","="]
    add_field => {"partC" => "%{[part2][0]}"}
    add_field => {"partD" => "%{[part2][1]}"}
    strip => [ "partC" ]
    strip => [ "partD" ]
}

Is there anything missing? Why are the spaces still there?

This seems way too complicated to me. Why not simply:

filter {

  kv {
    source => "message"
    value_split => "="
    field_split_pattern => "\s*#\s*"
  }
  
  date {
    match => [ "myapp.myproject.notice.student.request-time", "ISO8601" ]
    target => "myapp.myproject.notice.student.request-time"
  }
  
  date {
    match => [ "myapp.myproject.notice.student.response-time", "ISO8601" ]
    target => "myapp.myproject.notice.student.response-time"
  }
  
}

Output :

{
  "@timestamp": "2019-12-19T17:37:42.646Z",
  "@version": "1",
  "host": "localhost",
  "message": "myapp.myproject.notice.student.request-time = 2019-12-13 12:37:01.4 # myapp.myproject.notice.student.response-time = 2019-12-13 12:37:19.276",
  "myapp.myproject.notice.student.request-time": "2019-12-13T12:37:01.400Z",
  "myapp.myproject.notice.student.response-time": "2019-12-13T12:37:19.276Z"
}

I'm sorry, but I thought I had pretty much the same problem, since it shows the same error. So how do we restart the daemon?
And is the daemon still running even if I run another config with the -f flag? I mean, I once ran logstash.conf and it successfully sent logs to my Elasticsearch; then I wanted to send a different log, so I edited the whole logstash.conf file and ran it with the -f flag.
Thank you for the response.

@Transrian
This is not working. I get this error in the Logstash log:

][main] Exception while parsing KV {:exception=>"Invalid FieldReference: `2019-12-21 15:33:50.055

Is it an issue with the ISO date format used in the date match?

How do I fix this?

@elasticforme

Is there any way to remove the spaces? strip did not work; result posted.

The filter for removing spaces is strip:

https://www.elastic.co/guide/en/logstash/current/plugins-filters-mutate.html

@elasticforme
Yes, I checked it. I did the same thing, but it did not work.

Is there anything missing? Could you please check at your end?

my conf file

mutate {
    split => ["message","#"]
    add_field => {"part1" => "%{[message][0]}"}
    add_field => {"part2" => "%{[message][1]}"}
    strip => [ "part1" ]
    strip => [ "part2" ]
}

mutate {
    split => ["part1","="]
    add_field => {"partA" => "%{[part1][0]}"}
    add_field => {"partB" => "%{[part1][1]}"}
    strip => [ "partA" ]
    strip => [ "partB" ]
}

mutate {
    split => ["part2","="]
    add_field => {"partC" => "%{[part2][0]}"}
    add_field => {"partD" => "%{[part2][1]}"}
    strip => [ "partC" ]
    strip => [ "partD" ]
}

Result: it did not remove the spaces.

{
       "message" => [
        [0] "myapp.myproject.notice.student.request-time = 2019-12-13 12:37:01.4 ",
        [1] " myapp.myproject.notice.student.response-time = 2019-12-13 12:37:19.276"
    ],
         "partA" => "myapp.myproject.notice.student.request-time ",
         "partD" => " 2019-12-13 12:37:19.276",
    "@timestamp" => 2019-12-19T16:45:18.527Z,
      "@version" => "1",
         "part1" => [
        [0] "myapp.myproject.notice.student.request-time ",
        [1] " 2019-12-13 12:37:01.4 "
    ],
         "partC" => " myapp.myproject.notice.student.response-time ",
         "part2" => [
        [0] " myapp.myproject.notice.student.response-time ",
        [1] " 2019-12-13 12:37:19.276"
    ],
         "partB" => " 2019-12-13 12:37:01.4 ",
          "host" => "ip-xx-0-0-xx"
}
Is there any solution to this error?

mutate processes its options in a fixed order: strip comes before add_field, so when it tries to do the strip the fields do not exist yet and strip is a no-op. Split each of those blocks into two mutate filters.

I'm confused now. I already have a split there, and my issue is with strip, not split, right?

Please clarify.

You have to use more than one mutate to make things happen in the right order

mutate {
    split => ["part1","="]
    add_field => {"partA" =>"%{[part1][0]}" "partB" =>"%{[part1][1]}" }
}
mutate { strip => [ "partA", "partB" ] }

@Badger @elasticforme
I have tested this now. It works okay in Logstash, but Kibana is not allowing the array types, i.e. part1 and part2 as used here.

How do you fix this?

The end goal is to show the data in a Kibana visualization, but Kibana does not support array types.
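One possible workaround, sketched here as an assumption rather than something confirmed in this thread: once partA through partD hold the final values, drop the intermediate array fields with another mutate, so only scalar fields reach Kibana:

    mutate {
        # hypothetical cleanup step: remove the arrays once the
        # scalar partA..partD fields have been populated
        remove_field => [ "message", "part1", "part2" ]
    }

This goes after the last mutate that builds the part fields; remove_field is applied at the end of the filter, so the add_field interpolations above it still see the arrays.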

Kibana is showing this

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.