Unable to output csv file from Logstash

I am unable to figure out how to generate *.csv files from Logstash. My goal is to generate CSV files from multiple inputs, i.e. winlogbeat, filebeat (from Mac syslogs), and syslog (from our router). I have these three inputs working and producing output to Elasticsearch, available to view in Kibana. Keeping the config file simple for troubleshooting, I have the following Logstash conf file working and producing output to stdout and Kibana. I just cannot seem to create a CSV file.

Thank you for any assistance and education you can provide.
Rick

input {
  beats {
    port => 5044
  }
}

filter {
  csv {}
}

output {
  file {
    path => "/home/user/winlogbeat_out.csv"
    codec => plain
  }
  csv {
    fields => ["1","2","3","4","5","6"]
    path => "/home/user/winlogbeat1_out.csv"
  }
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

Here is a sample of the stdout data:

{
"message" => "An account was successfully logged on.\n\nSubject:\n\tSecurity ID:\t\tS-1-0-0\n\tAccount Name:\t\t-\n\tAccount Domain:\t\t-\n\tLogon ID:\t\t0x0\n\nLogon Type:\t\t\t3\n\nImpersonation Level:\t\tDelegation\n\nNew Logon:\n\tSecurity ID:\t\tS-1-5-21-96102576-1364680283-2145283710-1121\n\tAccount Name:\t\tNEBBIOLO$\n\tAccount Domain:\t\tCTCI\n\tLogon ID:\t\t0x2770E0EF\n\tLogon GUID:\t\t{983A4897-CD3E-62D0-5626-02B60DF03A42}\n\nProcess Information:\n\tProcess ID:\t\t0x0\n\tProcess Name:\t\t-\n\nNetwork Information:\n\tWorkstation Name:\t\n\tSource Network Address:\t192.168.5.151\n\tSource Port:\t\t54675\n\nDetailed Authentication Information:\n\tLogon Process:\t\tKerberos\n\tAuthentication Package:\tKerberos\n\tTransited Services:\t-\n\tPackage Name (NTLM only):\t-\n\tKey Length:\t\t0\n\nThis event is generated when a logon session is created. It is generated on the computer that was accessed.\n\nThe subject fields indicate the account on the local system which requested the logon. This is most commonly a service such as the Server service, or a local process such as Winlogon.exe or Services.exe.\n\nThe logon type field indicates the kind of logon that occurred. The most common types are 2 (interactive) and 3 (network).\n\nThe New Logon fields indicate the account for whom the new logon was created, i.e. the account that was logged on.\n\nThe network fields indicate where a remote logon request originated. Workstation name is not always available and may be left blank in some cases.\n\nThe impersonation level field indicates the extent to which a process in the logon session can impersonate.\n\nThe authentication information fields provide detailed information about this specific logon request.\n\t- Logon GUID is a unique identifier that can be used to correlate this event with a KDC event.\n\t- Transited services indicate which intermediate services have participated in this logon request.\n\t- Package name indicates which sub-protocol was used among the NTLM protocols.\n\t- Key length indicates the length of the generated session key. This will be 0 if no session key was requested.",
"@version" => "1",
"@timestamp" => "2016-05-09T20:37:21.430Z",
"beat" => {
"hostname" => "ctci-01",
"name" => "ctci-01"
},
"category" => "Logon",
"computer_name" => "ctci-01.CTCI.local",
"count" => 1,
"event_id" => 4624,
"level" => "Information",
"log_name" => "Security",
"record_number" => "43672042",
"source_name" => "Microsoft-Windows-Security-Auditing",
"type" => "wineventlog",
"host" => "ctci-01",
"tags" => [
[0] "beats_input_codec_plain_applied"
],
"column1" => "An account was successfully logged on."
}

You do not seem to have any fields with these names in the event.
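
For illustration, a csv output whose `fields` list matches names that actually appear in the rubydebug sample above might look like the sketch below. The bracketed forms such as [beat][hostname] are Logstash's field-reference syntax for nested fields; the path is only an example, not a recommendation from this thread.

output {
  csv {
    # Field names taken from the sample event above; nested fields use
    # bracket notation. The path below is illustrative only.
    fields => ["@timestamp", "[beat][hostname]", "computer_name", "event_id", "level", "log_name"]
    path   => "/home/user/winlogbeat_out.csv"
  }
}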

Thank you for the education. I incorrectly assumed that the data in column1 would be put into field "1", and so on.

For future reference, is there a way to dynamically create these field names, since the content will change with the various event logs?
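
One option that sidesteps a fixed field list (a sketch, not something proposed in this thread): the plain file output's default codec is json_lines, which serializes whatever fields each event happens to carry, so no per-source field list has to be maintained. The trade-off is that it produces JSON lines rather than CSV. The path below is an example.

output {
  # Sketch: with no explicit codec, the file output uses json_lines and
  # writes every field present in the event, so the field set can vary
  # per input/event type. Output is JSON, not CSV.
  file {
    path => "/home/user/winlogbeat_out.json"
  }
}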

I have updated my conf file with the following field names.
fields => ["message","@version","@timestamp","beat","hostname","name","category","computer_name","count","event_id","level","log_name","record_number","source_name","type","host","tags","column1"]

However, a CSV file is still not being created.
Rick

I found the problem: Logstash did not have permission to write to /home/user.
Rick
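
For anyone hitting the same thing, a sketch of the fix: either point the csv output at a directory the Logstash service account can write to, or grant that account access to the existing one. The directory below is only an example of a location that is typically writable by the logstash user on package installs.

output {
  csv {
    fields => ["@timestamp", "[beat][hostname]", "event_id", "level"]
    # /var/log/logstash is normally owned by the logstash user on package
    # installs; /home/user usually is not writable by that account.
    path   => "/var/log/logstash/winlogbeat_out.csv"
  }
}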