Logstash CSV output: not all fields are written

Hello,
I am trying to get all the data into a .CSV file, but I am unable to get all the fields, including the nested ones, into the .CSV file. My config file is as below:
input {
  stdin {}
  beats {
    port => 5044
  }
}

output {
  csv {
    #fields => ["@timestamp","message","version","architecture"]
    fields => ["@timestamp","event.dataset","host.mac","host.hostname","host.os.name","host.ip","log.file.path","message","service.type"]
    path => "/tmp/excel/syslogout.csv"
  }
}

If I run this, I only get the fields @timestamp and message in my syslogout.csv output file:
2021-09-01T04:25:10.274Z,,,,,,,"type=USER_END msg=audit(1630470302.625:1690): pid=26515 uid=0 auid=987 ses=203 msg='op=PAM:session_close grantors=pam_loginuid,pam_keyinit,pam_limits,pam_systemd acct=""pcp"" exe=""/usr/sbin/crond"" hostname=? addr=? terminal=cron res=success'",

Versions used: Elasticsearch 7.14, Kibana 7.14, Logstash 7.14, Filebeat 7.14.

Hi,

In the csv output plugin documentation, you can read this about the fields option:
"If a field does not exist on the event, an empty string will be written. Supports field reference syntax e.g.: fields => ["field1", "[nested][field]"]."

So in your case, to take nested fields into account, your fields option should look like this:

fields => ["@timestamp","[event][dataset]","[host][mac]","[host][hostname]","[host][os][name]","[host][ip]","[log][file][path]","message","[service][type]"]
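Put together with the rest of your original config (same port and path), the full pipeline would then look like this:

```
input {
  stdin {}
  beats {
    port => 5044
  }
}

output {
  csv {
    # Field reference syntax: [parent][child] addresses nested fields
    fields => ["@timestamp","[event][dataset]","[host][mac]","[host][hostname]","[host][os][name]","[host][ip]","[log][file][path]","message","[service][type]"]
    path => "/tmp/excel/syslogout.csv"
  }
}
```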

If the result does not change, it means that those nested fields do not exist on your events.
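To check which fields actually exist on your events, you can temporarily add a stdout output with the rubydebug codec (a standard Logstash debugging technique) and inspect the printed event structure:

```
output {
  # Temporary: prints each event as a nested hash so you can
  # verify whether fields like [host][os][name] are present
  stdout {
    codec => rubydebug
  }
}
```

Any field that does not appear in that output will show up as an empty string in the CSV.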

Cad.

Hi Cad,
Thanks for the suggestion, I will check and revert.

Yogesh