Hi,
I'm running an ELK stack at 5.4.0 on a Debian machine; before that I had version 5.3.1 running.
My use case: read from RabbitMQ and, depending on a filter, either put events directly into an ES index or write specific events to a CSV file. Logstash is running as a service.
The ES output part seems to be fine, but the CSV output no longer looks as it should. With version 5.3.1 running, everything was fine.
Plugins used:
logstash-input-rabbitmq (5.2.2)
logstash-filter-grok (3.3.1)
logstash-filter-mutate (3.1.3)
logstash-output-csv (3.0.2)
logstash-output-elasticsearch (6.3.0)
logstash-output-stdout (3.1.0)
My config looks like:
input {
  rabbitmq {
    host => "xxx"
    # queue => "xxx"
    durable => false
    key => ""
    exchange => "traces"
    prefetch_count => 50
    port => 5672
    user => "xxx"
    password => "xxxx"
    type => "some_event"
    vhost => "symphony"
    exclusive => true
    auto_delete => true
    codec => "plain"
  }
}
filter {
  if [type] == "some_event" {
    mutate {
      # both substitutions in a single gsub array, so the second setting
      # does not override the first
      gsub => [
        "message", '"', "",
        "message", "/", "-"
      ]
    }
    grok {
      match => { "message" => "%{DATE:date}-%{TIME:time} {source:%{DATA:source},timestamp:%{NUMBER:unix_timestamp},name:%{DATA:name},toolname:%{DATA:toolname},type:%{WORD:event_type},key:%{DATA:key}} " }
      patterns_dir => ["/usr/share/logstash/patterns"]
    }
    mutate {
      add_field => ["date_time", "%{date} %{time}"]
      add_field => ["event_source", "raw_some_event"]
    }
    if [source] == "dafuq" {
      mutate {
        add_tag => ["dastool_install"]
      }
    } else {
      mutate {
        add_tag => ["dastool_other"]
      }
    }
  }
}
output {
  if [type] == "dastool_event" {
    if "dastool_install" in [tags] {
      csv {
        csv_options => { "col_sep" => ";" }
        fields => ["@version","@timestamp","date_time","type","source","unix_timestamp","name","toolname","event_type","key"]
        path => "/xxxx/logfilename_%{+yyyy.MM.dd}.log"
      }
    }
    elasticsearch {
      action => "index"
      hosts => ["localhost"]
      index => "tool_raw-%{+YYYY.MM.dd}"
    }
    stdout {
      codec => rubydebug
    }
  }
}
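To check whether the csv output itself regressed between 5.3.1 and 5.4.0, I can reproduce the problem without RabbitMQ at all. A minimal sketch (hypothetical output path, fields set by hand instead of grok) that should write one semicolon-separated line per event if the plugin behaves:

```
# minimal-csv-test.conf -- hypothetical standalone pipeline to isolate the csv output
input {
  generator {
    count   => 1
    message => "test"
  }
}
filter {
  mutate {
    add_field => { "date_time" => "2017-05-08 14:23:01" }
    add_field => { "source"    => "dafuq" }
  }
}
output {
  csv {
    csv_options => { "col_sep" => ";" }
    fields      => ["@version", "@timestamp", "date_time", "source"]
    path        => "/tmp/csv_test.log"   # hypothetical path
  }
}
```

If the resulting file contains the whole event instead of the four listed fields, the csv output plugin is the culprit rather than anything in my RabbitMQ/grok setup.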
The previous output looked like the following:
1;2017-05-08T10:55:32.202Z;2017-05-08 12:55:39;some_event;dafuq;1494240939.354620;xxxxx;einrichtezeit;post;xxxx
1;2017-05-08T10:55:40.578Z;2017-05-08 12:55:47;some_event;dafuq;1494240947.705;xxxx;einrichtezeit;pre;xxxx
1;2017-05-08T10:55:59.278Z;2017-05-08 12:56:06;some_event;dafuq;1494240966.335118;xxxx;einrichtezeit;post;xxxx
The output now looks like the following (the file has no line breaks, and it seems not to write my specific fields):
2017-05-08T11:06:11.871Z %{host} 2017-05-08-13:06:19 {source:dafuq,timestamp:1494241579.345582,xxxxx,toolname:einrichtezeit,type:post,key:xxxx}
2017-05-08T11:06:11.871Z %{host} 2017-05-08-13:06:19 {source:dafuq,timestamp:1494241579.411888,xxxx,toolname:einrichtezeit,type:post,key:xxxx}
2017-05-08T11:06:13.139Z %{host} 2017-05-08-13:06:20 {source:dafuq,timestamp:1494241580.418222,xxxxx,toolname:einrichtezeit,type:post,key:xxxx}
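For comparison, each row of the old output parses cleanly into exactly the ten fields listed in the csv output's `fields` option; the new output is just the raw event line. A quick sanity check (Python, row string copied from the 5.3.1 sample above):

```python
import csv
import io

# One row of the 5.3.1 output; column order matches the csv output's "fields":
# @version;@timestamp;date_time;type;source;unix_timestamp;name;toolname;event_type;key
old_row = ("1;2017-05-08T10:55:32.202Z;2017-05-08 12:55:39;some_event;dafuq;"
           "1494240939.354620;xxxxx;einrichtezeit;post;xxxx")

# Parse with the same separator configured via csv_options => {"col_sep" => ";"}
fields = next(csv.reader(io.StringIO(old_row), delimiter=";"))
print(len(fields))   # 10 -- matches the configured field list
print(fields[3])     # some_event
print(fields[8])     # post
```

The 5.4.0 lines contain no `;` separators at all, so whatever consumes these files downstream breaks immediately.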
When I look at the incoming event on stdout (rubydebug), it seems to be fine so far:
14:21:38.602 [[main]-pipeline-manager] INFO logstash.pipeline - Pipeline main started
14:21:38.695 [Api Webserver] INFO logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
14:21:38.825 [[main]<rabbitmq] INFO logstash.inputs.rabbitmq - Connected to RabbitMQ at
{
"date" => "2017-05-08",
"toolname" => "einrichtezeit",
"event_source" => "raw_some_event",
"source" => "dafuq",
"message" => "2017-05-08-14:23:01 {source:dafuq,timestamp:1494246181.760,name:xxxx,toolname:einrichtezeit,type:pre,key:xxxxx} ",
"type" => "some_event",
"tags" => [
[0] "dastool_install"
],
"@timestamp" => 2017-05-08T12:22:54.295Z,
"event_type" => "pre",
"date_time" => "2017-05-08 14:23:01",
"@version" => "1",
"name" => "xxxxx",
"time" => "14:23:01",
"unix_timestamp" => "1494246181.760",
"key" => "xxxxx"
}
So, does anybody have an idea what has changed since 5.3.1? I'm puzzled.