Parse string data with ; separator in logstash


(Charlotte Dupont) #1

Hi,

I'm trying to parse data that looks like this:

“timestamp1;value1; timestamp2;value2;… timestampN;valueN;”

I'm trying the csv filter (the data originally comes from CSV), but it is not working well, as it doesn't understand that there are only 2 fields.

I know that the grok filter can do a lot of things, but I don't know how to configure it so that Logstash understands there are only 2 repeated fields.

Any help?

Thanks


(Sylfaen) #2

Hi

You can try mutate split. It'll create an array [timestamp1,value1,timestamp2,...]; then you can remove the even indexes, which correspond to the timestampX entries, and keep [value1,value2,...]. But what about your output?
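A minimal sketch of that idea (the ruby filter and the "values" field name are assumptions for illustration, not part of the original suggestion):

 filter {
   mutate {
     # Turns "timestamp1;value1;timestamp2;value2;..." into an array
     split => { "message" => ";" }
   }
   ruby {
     # Keep only every second element of the array (the values)
     code => 'event.set("values", event.get("message").each_slice(2).map(&:last))'
   }
 }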


(Charlotte Dupont) #3

Thanks,

I can change the input format to “timestamp1,value1; timestamp2,value2;… timestampN,valueN;”

Then I tried to split on ";", but the csv filter doesn't recognize the resulting array.

So I tried inserting a newline instead of ";" with the "gsub" filter. Here is my config:

 filter {
   mutate {
     gsub => [ "message", ";", "^M" ]
   }

   csv {
     separator => ","
     columns => ["Time","Temp"]
     convert => { "Temp" => "float" }
   }
 }

So I get the following message before the csv filter:
“timestamp1,value1
timestamp2,value2

timestampN,valueN;”

That's what I wanted: it looks like a CSV file...

But after the csv filter, only the first line gets parsed.
How can I parse all the lines?


(Sylfaen) #4

It's already like a CSV file. If you create a .csv file with that data, you'll have a valid CSV file with , as the separator, no? So you don't need the csv filter plugin, except if you want to add the column names.

What you want is to transform a string with a pattern into a CSV file, right? So your mutate clause should be sufficient to do what you want, or am I missing something ^^'


(Charlotte Dupont) #5

I don't want to create a csv file, I just want to parse the data as if it were a csv file, because I have 2 repeated fields.
If there is another way than csv, I would be happy to try it.

But the output goes into Elasticsearch, and I need this to work for streaming events coming into Logstash.


(Charlotte Dupont) #6

I succeeded in getting what I wanted by adding

  split {
     terminator => "^M"
  }

after mutate {} and before csv {}.
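For reference, the complete filter section would then look something like this (a sketch assembled from the snippets in this thread; "^M" stands for a literal carriage-return character, typically entered in an editor as Ctrl-V Ctrl-M):

 filter {
   mutate {
     # Replace every ";" with a line terminator
     gsub => [ "message", ";", "^M" ]
   }

   # Emit one event per "timestamp,value" line
   split {
     terminator => "^M"
   }

   csv {
     separator => ","
     columns => ["Time","Temp"]
     convert => { "Temp" => "float" }
   }
 }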


(system) #7

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.