I downloaded Logstash 5.0.0 today and set it up with the configuration file I had been using with 2.4.0.
Here's what the configuration looks like:
input {
  tcp {
    port => 24224
    host => "127.0.0.1"
    mode => "server"
    codec => json
    type => nodejs
  }
}
filter {
}
output {
  file { path => "/var/log/logstash/logstash.out.log" }
  csv {
    csv_options => {"col_sep" => "," "row_sep" => "\n"}
    fields => ["[username] [group]"]
    path => "/var/www/html/csv/out-%{+dd-MM-YYYY}.csv"
  }
}
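One thing I haven't ruled out is the `fields` setting: as written it is a one-element array containing the single string `"[username] [group]"`. Since the csv output's documented usage lists each event field as its own array element, I could try splitting it up to see whether 5.0 handles the combined form differently than 2.4.0 did:

```
csv {
  csv_options => {"col_sep" => "," "row_sep" => "\n"}
  fields => ["username", "group"]
  path => "/var/www/html/csv/out-%{+dd-MM-YYYY}.csv"
}
```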
When I post the following data using curl:
curl 'http://grabber.deltamktgresearch.com/logger/INFO' -H 'Accept-Encoding: gzip, deflate, br' -H 'Accept-Language: en-US,en;q=0.8,en-GB;q=0.6' -H 'Content-Type: application/json;charset=UTF-8' -H 'Accept: application/json, text/plain, */*' --data-binary $'{"message": "foo", "username": "somebody", "group": "somegroup"}' --compressed
I can see the corresponding entry in /var/log/logstash/logstash.out.log:
{"@timestamp":"2016-11-03T13:46:19.400Z","level":"info","port":22000,"@version":"1","host":"127.0.0.1","label":"frontend","message":"foo","type":"nodejs","username":"somebody", "group":"somegroup"}
But in the CSV file, the entry is written as
2016-11-03T13:46:19.400Z 127.0.0.1 foo
instead of the expected
"somebody", "somegroup"
The same configuration produces the expected output on Logstash 2.4.0.
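To take curl and the HTTP frontend out of the picture, I can feed a JSON event straight to the tcp input. A small sketch (the host and port match the tcp input above; the event fields are the same ones from the curl test):

```python
import json
import socket

def send_event(host, port, event):
    """Send one newline-delimited JSON event to a Logstash tcp input."""
    payload = (json.dumps(event) + "\n").encode("utf-8")
    with socket.create_connection((host, port)) as sock:
        sock.sendall(payload)

# Against the pipeline above (must be running):
# send_event("127.0.0.1", 24224,
#            {"message": "foo", "username": "somebody", "group": "somegroup"})
```

If the CSV row is still wrong with this direct input, the problem is confined to the output side of the pipeline.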
I set log.level to debug and also tried the --debug and --verbose flags, but none of them gave me any extra useful insight. Here's what I get:
[2016-11-03T13:46:06,039][DEBUG][logstash.outputs.csv ] config LogStash::Outputs::CSV/@workers = 1
[2016-11-03T13:46:06,039][DEBUG][logstash.outputs.csv ] config LogStash::Outputs::CSV/@flush_interval = 2
[2016-11-03T13:46:06,039][DEBUG][logstash.outputs.csv ] config LogStash::Outputs::CSV/@gzip = false
[2016-11-03T13:46:06,042][DEBUG][logstash.outputs.csv ] config LogStash::Outputs::CSV/@filename_failure = "_filepath_failures"
[2016-11-03T13:46:06,042][DEBUG][logstash.outputs.csv ] config LogStash::Outputs::CSV/@create_if_deleted = true
[2016-11-03T13:46:06,042][DEBUG][logstash.outputs.csv ] config LogStash::Outputs::CSV/@dir_mode = -1
[2016-11-03T13:46:06,042][DEBUG][logstash.outputs.csv ] config LogStash::Outputs::CSV/@file_mode = -1
[2016-11-03T13:46:06,043][DEBUG][logstash.outputs.csv ] config LogStash::Outputs::CSV/@spreadsheet_safe = true
and
[2016-11-03T13:46:19,721][DEBUG][logstash.pipeline ] filter received {"event"=>{"@timestamp"=>2016-11-03T13:46:19.400Z, "level"=>"info", "port"=>21483, "@version"=>"1", "host"=>"127.0.0.1", "label"=>"frontend", "message"=>"foo", "type"=>"nodejs", "username"=>"somebody", "group"=>"somegroup"}}
[2016-11-03T13:46:19,726][DEBUG][logstash.pipeline ] output received {"event"=>{"@timestamp"=>2016-11-03T13:46:19.400Z, "level"=>"info", "port"=>21483, "@version"=>"1", "host"=>"127.0.0.1", "label"=>"frontend", "message"=>"foo", "type"=>"nodejs", "username"=>"somebody", "group"=>"somegroup"}}
[2016-11-03T13:46:19,728][DEBUG][logstash.outputs.file ] File, writing event to file. {:filename=>"/var/log/logstash/logstash.out.log"}
[2016-11-03T13:46:19,728][DEBUG][logstash.outputs.file ] Starting stale files cleanup cycle {:files=>{"/var/log/logstash/logstash.out.log"=>#<IOWriter:0x381ffe57 @active=true, @io=#<File:/var/log/logstash/logstash.out.log>>}}
[2016-11-03T13:46:19,732][DEBUG][logstash.outputs.file ] 0 stale files found {:inactive_files=>{}}
[2016-11-03T13:46:19,733][DEBUG][logstash.outputs.csv ] File, writing event to file. {:filename=>"/var/www/html/csv/out-03-11-2016.csv"}
[2016-11-03T13:46:19,733][DEBUG][logstash.outputs.csv ] Starting stale files cleanup cycle {:files=>{"/var/www/html/csv/out-03-11-2016.csv"=>#<IOWriter:0x6e477afa @active=true, @io=#<File:/var/www/html/csv/out-03-11-2016.csv>>}}
[2016-11-03T13:46:19,733][DEBUG][logstash.outputs.csv ] 0 stale files found {:inactive_files=>{}}
How do I debug what's going wrong with the csv output plugin?
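One more thing I'm considering is adding a stdout output with the rubydebug codec alongside the existing outputs, so I can see the exact event structure each output receives on the console:

```
output {
  stdout { codec => rubydebug }
  # existing file and csv outputs unchanged
}
```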