Logstash crashing when trying to read CSV with undefined method `tv_sec' error


#1

Hi,

I'm trying to read log files (written by my own app) in CSV format with Logstash and output them to an ES index.

I'm using:
logstash 2.4.1
logstash-filter-csv (2.1.3)
logstash-input-file (2.2.5)
logstash-output-elasticsearch (2.7.1)

My CSV input looks like the following:
2016-07-22T12:56:41,774|3|638613696|85807187|0|19,136

My config:

input {
  file {
    path => "/home/fuxxi/logstash/enriched_om_events-*.log"
    start_position => "beginning"
    type => "enriched_event"
  }
}

filter {
  if [type] == "enriched_event" {
    csv {
      columns => ["@timestamp","service_id","provisioning_id","contract_id","artikelnummer","featurenummer","duration"]
      separator => "|"
    }
  }
}

output {
  if [type] == "enriched_event" {
    elasticsearch {
      action => "index"
      hosts => ["localhost"]
      index => "om_enriched-%{+YYYY.MM.dd}"
    }
    stdout {
      codec => rubydebug
    }
  }
}

But after startup, I get this error:

/opt/logstash/bin/logstash -f /etc/logstash/conf.d/om_elastic.conf
Settings: Default pipeline workers: 8
Pipeline main started
NoMethodError: undefined method `tv_sec' for "2016-07-23T12:47:23,315":String
evaluate at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-event-2.4.1-java/lib/logstash/string_interpolation.rb:153
evaluate at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-event-2.4.1-java/lib/logstash/string_interpolation.rb:90
collect at org/jruby/RubyArray.java:2409
evaluate at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-event-2.4.1-java/lib/logstash/string_interpolation.rb:90
evaluate at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-event-2.4.1-java/lib/logstash/string_interpolation.rb:27
sprintf at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-event-2.4.1-java/lib/logstash/event.rb:202
event_action_params at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:131
event_action_tuple at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:35
multi_receive at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:29
map at org/jruby/RubyArray.java:2414
multi_receive at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:29
each_slice at org/jruby/RubyArray.java:1653
multi_receive at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:28
worker_multi_receive at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/output_delegator.rb:130
multi_receive at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/output_delegator.rb:114
output_batch at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/pipeline.rb:301
each at org/jruby/RubyHash.java:1342
output_batch at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/pipeline.rb:301
worker_loop at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/pipeline.rb:232
start_workers at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.1-java/lib/logstash/pipeline.rb:201

I tried to find a solution and a better understanding by googling this error, but I'm stuck and not able to figure out how to solve it :frowning:


#2

When I change the filter config like this:

if [type] == "enriched_event" {
  csv {
    columns => ["timestamp","service_id","provisioning_id","contract_id","artikelnummer","featurenummer","duration"]
    separator => "|"
  }
}

(removing the @ from timestamp), I'm able to forward my CSV log to ES, but a new @timestamp field is created with the current date/time. In Kibana I can still use my own timestamp field, but I'm still lacking knowledge about the @timestamp field :frowning:


(Rivaanbechan) #3

I'm no expert but @timestamp is an internal field, so your field is conflicting with it.
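To see why it crashed with `tv_sec`: the %{+YYYY.MM.dd} part of your index name needs @timestamp to be a real time object, but the csv filter stored a plain String there. A rough illustration in plain Ruby (not Logstash's actual internals, just the idea):

```ruby
require 'time'

raw = "2016-07-22T12:56:41,774"   # what the csv filter put into @timestamp

# A String has no tv_sec, which is what the index-name sprintf ends up calling:
raw.respond_to?(:tv_sec)          # => false, hence the NoMethodError

# A parsed time object does have it, and can be formatted into the index name:
parsed = Time.strptime(raw, "%Y-%m-%dT%H:%M:%S,%L")
parsed.respond_to?(:tv_sec)       # => true
puts parsed.strftime("om_enriched-%Y.%m.%d")   # => om_enriched-2016.07.22
```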

I would think you need to name the column timestamp as you've done. Then parse the date through the date filter after the csv filter.

date {
  match => [ "timestamp" , "yyyy-MM-dd'T'HH:mm:ss,SSS" ]
  target => "@timestamp"   
}

I haven't tested this, you might have to look up the match pattern syntax. Although this should put you on the right track :slight_smile:

Thereafter remove the timestamp field.

mutate {
  remove_field => [ "timestamp" ]
}

Hope this helps :slight_smile:


(Magnus Bäck) #4

Yes, @rivaanbechan is right, a date filter is needed. I suggest adding the remove_field option to the date filter instead; that way the field is only removed if the date parsing actually succeeds.

date {
  match => [ "timestamp" , "yyyy-MM-dd'T'HH:mm:ss,SSS" ]
  remove_field => [ "timestamp" ]
}
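
Putting the pieces together, the whole filter section from the original config would then look something like this (untested, same field names as before):

filter {
  if [type] == "enriched_event" {
    csv {
      columns => ["timestamp","service_id","provisioning_id","contract_id","artikelnummer","featurenummer","duration"]
      separator => "|"
    }
    date {
      match => [ "timestamp" , "yyyy-MM-dd'T'HH:mm:ss,SSS" ]
      remove_field => [ "timestamp" ]
    }
  }
}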

#5

Thanks @magnusbaeck and @rivaanbechan, your tips were helpful!
I was able to solve my problem by using match, target, and remove_field as suggested.

I really appreciated your advice!


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.