CSV output plugin

Hi,

I am trying to write the output of a parsed input file into a CSV file. I am using Logstash version 5.1.1.

Here is my output configuration:

output {
  csv {
    path => "/root/output/logstash/csvdata.csv"
    fields => ["Date", "ErrorCode", "ErrorMessage"]
  }

  stdout { codec => "json" }
}

It seems that the input file has been parsed properly, as I can see the JSON output on the console, but the output CSV file is not getting generated.

I searched on the net and it seems that there is some issue with Logstash version 5, whereas Logstash 2.3 gives the proper result.

If so, what is the workaround for Logstash version 5.1.1, or should I use Logstash 2.3 to generate the CSV file?
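
One workaround I have seen suggested is to skip the csv output and build the line myself with the file output and a line codec. A minimal sketch with my three fields, in case that is the right direction:

output {
  # hypothetical fallback: format the CSV line manually with a line codec;
  # assumes Date, ErrorCode and ErrorMessage exist on every event
  file {
    path => "/root/output/logstash/csvdata.csv"
    codec => line { format => "%{Date},%{ErrorCode},%{ErrorMessage}" }
  }
}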

Please advise on the same.

Thanks

Hi,

The CSV file did get generated, but it took around 30 minutes.

Similarly, I parsed 4 log files of around 12 KB each, but the CSV file generation takes so much time that I have to kill the process partway through.

I have to parse 1000 files of around 12 KB each, and if it takes this much time for just 4 files, how will I be able to see the data for 1000 files?

Here is my filter:

filter {
  if "XYZ::BaseException" in [message] {
    grok {
    }
  }
}

output {
  csv {
    path => "/output/data.csv"
    codec => json
  }

  stdout { codec => "json" }
}

Please advise on the same.

Thanks

Hi,
How can we help you? 30 minutes is definitely a long time, but we have no idea about your configuration. Where are you running Logstash? How much memory? Which version? Can we have some debug logs and details on memory and host resources?
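
For the debug logs, raising the log level in logstash.yml should be enough (a sketch for 5.x; the paths depend on your install):

# logstash.yml -- make the pipeline report what it is doing
log.level: debug
path.logs: /var/log/logstash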

Thanks

pts0

Thanks pts0,

I am using CentOS 7 Linux.
filebeat-5.1.1, logstash-5.1.1, elasticsearch-5.1.1 and kibana-5.1.1-x86_64 are all installed on a single instance of CentOS.

But presently I am using only Filebeat and Logstash, and I want to save the output of Logstash in a CSV file.

CentOS details:

CPU op-mode(s): 32-bit, 64-bit
CPU(s): 2
CPU MHz: 2333.000

Output of free -h:

        total   used   free   shared   buff/cache
Mem:    3.4G    2.6G   687M   7.3M     219M
Swap:   3.6G    411M   3.2G

Sample text from logstash-plain.log:

starting server on port :5443
starting pipeline
pipeline main started
successfully started Logstash API
opening file{ path => "/out/csvdata.csv"}

I hope this much information is sufficient.

Please advise on the same.

Thanks

Difficult to track the bottleneck here.
I would try upgrading to the newest version, 5.4.1, if it is not too much work.
Then check routing, the connection to the port, and so on. I don't think it is slow due to server performance; it is more likely a settings problem.
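
If it is a settings problem, the pipeline settings in logstash.yml are the first place I would look (a sketch; the values shown are the 5.x defaults):

# logstash.yml -- pipeline throughput settings
pipeline.workers: 2        # defaults to the number of CPU cores
pipeline.batch.size: 125   # events each worker collects before filtering
pipeline.batch.delay: 5    # ms to wait before flushing an undersized batch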

Thanks pts0

The issue has been resolved; there was an error in the conf file.

Hi,

I am using the file output plugin to save data in a CSV file.

file {
  path => "D:/file.csv"
  codec => line { format => "%{field1}, %{field2}, %{field3}" }
}

This works fine if I am using a single grok filter.

In my scenario there are 3 grok filters, and each grok gives me one field.
E.g. I get field1 through the 1st grok, field2 through the 2nd and field3 through the 3rd. Now when I use the same file output to save the data in a CSV file, the data is not properly formatted and the literal %{field} text also appears multiple times.

So my question is: how can I save data from multiple groks into a CSV line file?
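
From what I understand, the literal %{field} text appears when that field is missing on the event, so one idea is to write a line only when all three fields exist. A sketch with my field names:

output {
  # only emit a CSV line when every field was actually extracted;
  # otherwise the line codec prints the %{...} placeholders literally
  if [field1] and [field2] and [field3] {
    file {
      path => "D:/file.csv"
      codec => line { format => "%{field1}, %{field2}, %{field3}" }
    }
  }
}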

Thanks
Richa Gautam

Hi,

I am using the csv filter after the grok filters to save the output in a CSV file, but it gives me a ClassCastException: StringBiValue cannot be cast to java.lang.String.

filter {
  grok {
  }
  grok {
  }

  csv {
    columns => ["path", "filename", "error", "errodesc"]
    separator => ","
  }
}

output {
  csv {
    path => "/file.csv"
    fields => ["path", "filename", "error", "errodesc"]
    csv_options => { "col_sep" => "," "row_sep" => "\n" }
  }
}

Can I use this csv filter after grok to save data in CSV format?
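
From the docs, the csv filter parses CSV input rather than formatting output, so it may be enough to drop it and let the csv output format the fields itself. A sketch with the same field names:

output {
  csv {
    # the csv output joins the listed fields on its own,
    # so no csv filter is needed in the filter block
    path => "/file.csv"
    fields => ["path", "filename", "error", "errodesc"]
  }
}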

Please advise.

Thanks
