File not closed with JDBC input and CSV output

Hi,

I'm using a JDBC input to fetch data from a specific table, and I want to write this data to a new file using the CSV output. What I'm seeing is that the file remains open. I need the file to be closed, because after the CSV file is created I want to copy it to a different endpoint (like a blob storage). Is there a way to do this? My conf looks like this:

input {
  jdbc {
    jdbc_driver_library => "/driver/mssql-jdbc-7.2.2.jre8.jar"
    jdbc_driver_class => "com.microsoft.sqlserver.jdbc.SQLServerDriver"
    jdbc_connection_string => "jdbc:sqlserver://xxxxx\xxxx;database=xxxx;user=xxxxx;password=xxxxxx"
    jdbc_user => nil
    schedule => "16 15 * * *"
    statement => "SELECT * from [dbo].[yyyyyy]"
  }
}

filter {
  mutate {
    add_field => { "filename" => "%{+YYYY-MM-dd}" }
  }
}

output {
  csv {
    path => "zzz-yyyy-%{filename}.csv"
    fields => ["@timestamp","seq_num","delete_session_id","create_user"]
  }
}

Thanks

This is a somewhat different requirement.
You can export data from MySQL straight to a CSV file. Why use Logstash?

Anyway, how do you know that this file remains open?

I know, but we want to keep this ingestion in Logstash rather than doing it in MSSQL. What I see is that if I run a separate script, the file is still locked.

I don't see any option for closing the file in the csv output section.

https://www.elastic.co/guide/en/logstash/current/plugins-outputs-csv.html#plugins-outputs-csv-csv_options
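
For what it's worth, the csv output is documented as being based on the file output plugin, so the file output's file-handling options may be accepted here too. A minimal sketch, assuming the inherited flush_interval and stale_cleanup_interval options are honored by the csv output (they are file output options; that the csv output accepts them is an assumption, not something the csv docs promise):

output {
  csv {
    path => "zzz-yyyy-%{filename}.csv"
    fields => ["@timestamp","seq_num","delete_session_id","create_user"]
    # file output option (assumed inherited): 0 flushes after every event
    flush_interval => 0
    # file output option (assumed inherited): a file that has received no
    # writes for this many seconds becomes eligible for closing
    stale_cleanup_interval => 60
  }
}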

Not exactly. What I do see now is that the file is closed based on the scheduler cron. So let's say you've set it to "* * * * *": the file is closed after each subsequent write. If you ran the cron once a day, that would mean the file stays open for an extremely long time.
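
If cleanup only runs while events are flowing, one possible workaround is to keep a trickle of events moving through the pipeline so the output's stale-file check keeps firing. A sketch using the heartbeat input, with the heartbeat events routed to a throwaway file on the same csv output (hypothetical wiring; the scratch file will accumulate heartbeat rows, and the exact cleanup behavior is an assumption):

input {
  heartbeat {
    # emit one event every 30 seconds
    interval => 30
    type => "heartbeat"
  }
}

filter {
  if [type] == "heartbeat" {
    # hypothetical scratch filename so heartbeats don't land in the real CSV
    mutate { add_field => { "filename" => "heartbeat-scratch" } }
  }
}

The existing csv output would then write heartbeats to zzz-yyyy-heartbeat-scratch.csv, and each heartbeat gives the output a chance to close the stale daily file.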
