j2man
March 2, 2017, 5:05pm
1
We need to stop letting this fall off due to inactivity.
Hi, I have the following Logstash config file:
input {
  elasticsearch {
    hosts => ["localhost"]
    index => "myindex"
    type => "mytype"
    query => '{"query" : { "match_all" : {} }}'
  }
}
output {
  csv {
    fields => [my_fields_array]
    path => "export.csv"
    csv_options => {"col_sep" => "\t" "row_sep" => "\r\n"}
  }
}
Opening export.csv, I don't get any of the mapped data fields, only a list of something like:
2016-11-02T07:41:15.917Z %{host} %{message}2016-11-02T07:41:16.708Z %{host}...
Any idea about …
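For anyone landing here with the same symptom: the unresolved %{host} and %{message} references look like the output writing a default line format instead of the configured fields. Before blaming the plugin, it can help to confirm which fields actually reach the output. A minimal sketch, using placeholder field names (field_a and field_b are not from the original config):
output {
  # Dump every event with all of its fields so you can see exactly
  # what the csv output has to work with.
  stdout { codec => rubydebug }

  csv {
    # fields must list real field names present on the events.
    fields => ["field_a", "field_b"]
    path => "export.csv"
    csv_options => {"col_sep" => "\t" "row_sep" => "\r\n"}
  }
}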
Also, a bug report has been opened here:
(GitHub issue, opened 1 Mar 2017, closed 17 Mar 2017, labeled "bug")
Seems that 5.2 broke the CSV output plugin.
This will generate a "csv" with … all the message fields, every time, NOT separated by commas but by spaces, even though "blah" does not exist as a field. This works correctly in 2.3 and is broken in 5.2.
To recreate, pass in a file with this line in it:
00:00:00.0 COMM_TURNED_ON YODA
Use this grok pattern:
EVENT_COMM_TURNED_ON %{TIME:event_time}%{SPACE}%{NOTSPACE:event_type}%{SPACE}%{NOTSPACE:name}
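For reference, run against the sample line above, that pattern should extract roughly:
event_time => "00:00:00.0"
event_type => "COMM_TURNED_ON"
name       => "YODA"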
input { stdin { } }
filter {
  grok {
    patterns_dir => ["C:/src/elk/broken"]
    match => ["message", "%{EVENT_COMM_TURNED_ON}"]
  }
}
output {
  if "_grokparsefailure" not in [tags] {
    elasticsearch {
      index => "raw-data-%{+YYYY.MM.dd}"
    }
    if "COMM_TURNED_ON" in [message] {
      csv {
        fields => ["blah"]
        csv_options => {"col_sep" => "," "row_sep" => "\r\n"}
        path => "C:/src/elk/comm_turned_on.csv"
      }
    }
  }
}
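Until the plugin fix lands, one possible workaround (a sketch only, not the fix from the issue) is to skip the csv output and assemble the row by hand with the file output and the line codec's format option, using the fields the grok pattern above produces:
output {
  if "_grokparsefailure" not in [tags] and "COMM_TURNED_ON" in [message] {
    file {
      path => "C:/src/elk/comm_turned_on.csv"
      # Build the comma-separated row ourselves instead of relying
      # on the csv output's fields handling.
      codec => line { format => "%{event_time},%{event_type},%{name}" }
    }
  }
}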
-j
warkolm
(Mark Walkom)
March 19, 2017, 5:25am
2
Looks like this has been sorted in that GitHub issue.
system
(system)
Closed
April 16, 2017, 5:25am
3
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.