Super_L
(Super L)
January 13, 2017, 6:43am
1
Hi all,
I need to take a substring of a field and use it as the ES document type.
My conf is as follows:
filter {
  if [driveid] =~ "^CENNAVIVOMS" {
    drop { }
  } else {
    ruby {
      code => "
        event['@metadata']['doc_type'] = event['rowkey'][0,8]
        event['@metadata']['index_suffix'] = event['rowkey'][0,6]
      "
    }
  }
}
output {
  elasticsearch {
    codec => "json"
    hosts => ["192.168.0.19"]
    index => "logstash-bmwrtti-%{[@metadata][index_suffix]}"
    document_id => "%{rowkey}"
    document_type => "%{[@metadata][doc_type]}"
    workers => 6
    template_name => "template_bmwrtti"
  }
}
After startup I get a lot of identical exceptions like the one below:
Ruby exception occurred: undefined method `[]' for nil:NilClass {:level=>:error}
Is there another solution for this requirement?
Waiting online ~ please help!
Ruby exception occurred: undefined method `[]' for nil:NilClass {:level=>:error}
This indicates that an event field didn't exist (or possibly that it existed but contained nil). Are you sure the rowkey field exists?
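One way to check is to tag events that are missing the field before the ruby filter runs and route them somewhere visible. A minimal sketch (the tag name and file path are just placeholders, not part of the original config):
filter {
  if ![rowkey] {
    mutate { add_tag => ["missing_rowkey"] }
  }
}
output {
  if "missing_rowkey" in [tags] {
    file { path => "/tmp/missing_rowkey.log" }
  }
}
If that file ends up with entries, those are the events triggering the nil:NilClass error.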
Super_L
(Super L)
January 13, 2017, 6:51am
3
Yes, it exists.
BTW, my Logstash version is 2.0.0.
Super_L
(Super L)
January 13, 2017, 6:55am
4
When Logstash starts up, ES creates an index named logstash-bmwrtti-%{[@metadata][index_suffix]} and then creates another index named logstash-bmwrtti-201701.
But in the background the exception is still being thrown.
Yes, it exists.
That would be very surprising. Please provide an example input event that results in that exception.
Super_L
(Super L)
January 13, 2017, 7:04am
6
My input is as follows:
input {
  kafka {
    add_field => {
      "datatype" => "request"
    }
    zk_connect => "192.168.80.1:2181,192.168.80.37:2181,192.168.80.4:2181,192.168.80.5:2181,192.168.80.6:2181,192.168.80.19:2181"
    decoder_class => "cn.test.kafka.v08.RTTIDecoder"
    consumer_timeout_ms => 30000
    topic_id => "RP10000012F901"
    auto_offset_reset => "largest"
  }
}
and the event JSON looks like this:
{
  "rowkey": "201701105TMR064JYDQEHFNZR20170110200700465",
  "driveid": "5TMR064JYDQEHFNZR",
  "uri": "/bmw/gateway/index.do?1=1484050020465&Vers=1511.05.64.00&Velocity=0&OP=gtm&Dlat=0&Clat=477329633&TP_SID=&TP_Apps=TEC-TFP&Guidance=ND&TReq=&Dlon=0&DriveID=5TMR064JYDQEHFNZR&Bearing=&Clon=1390593917&Decoding_Feat=&Reset=1",
  "reqtime": 1484050020464,
  "restime": 1484050020489,
  "successflag": 1,
  "answertime": 25,
  "serverip": "192.168.59.160",
  "adcode": 110105,
  "ccplonlat": "40.00930764712393,116.55823563225567",
  "deslonlat": "0.0,0.0",
  "cversion": "1511.05.64.00",
  "dversion": "14",
  "responsesize": 34076,
  "statuscode": 200,
  "@version": "1",
  "@timestamp": "2017-01-10T12:07:05.028Z",
  "project": "bmw_rtti",
  "datatype": "request"
}
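For reference, slicing that sample rowkey the way the ruby filter does (plain Ruby string slicing) gives:
rowkey = "201701105TMR064JYDQEHFNZR20170110200700465"
rowkey[0,8]  # => "20170110" (used as document_type)
rowkey[0,6]  # => "201701"   (used as index_suffix; matches the logstash-bmwrtti-201701 index mentioned above)
So for an event like this one the slice succeeds; the exceptions would have to come from events where rowkey is missing or nil.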
Super_L
(Super L)
January 13, 2017, 7:09am
7
When I use stdout { codec => rubydebug { } } as the output, everything is fine.
How do you know that it's the exact same message? What if you wrap the field access in a conditional, so that you only attempt to access the field if it exists? Does that make a difference?
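The conditional wrapping could look something like this (a minimal sketch based on the filter posted above; only the [rowkey] existence check is new):
filter {
  if [driveid] =~ "^CENNAVIVOMS" {
    drop { }
  } else if [rowkey] {
    ruby {
      code => "
        event['@metadata']['doc_type'] = event['rowkey'][0,8]
        event['@metadata']['index_suffix'] = event['rowkey'][0,6]
      "
    }
  }
}
If the Ruby exceptions stop after a change like this, that would confirm that some events reach the filter without a rowkey field (or with a nil value).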
Super_L
(Super L)
January 13, 2017, 7:30am
9
When I change the output as below, it works:
elasticsearch {
  codec => "json"
  hosts => ["192.168.59.159"]
  index => "logstash-bmwrtti-bbb"
  document_id => "%{rowkey}"
  document_type => "aaa"
  workers => 6
  template_name => "template_bmwrtti"
}
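If events without rowkey should still be indexed rather than dropped, one option (a sketch only, not tested against this pipeline) is to give the metadata fields defaults before the ruby filter overwrites them, so that %{[@metadata][index_suffix]} always resolves to something:
filter {
  # default values; "unknown" is just a placeholder
  mutate {
    add_field => {
      "[@metadata][doc_type]" => "unknown"
      "[@metadata][index_suffix]" => "unknown"
    }
  }
  # overwrite the defaults only when rowkey is present
  # (the driveid drop condition from the original filter is omitted for brevity)
  if [rowkey] {
    ruby {
      code => "
        event['@metadata']['doc_type'] = event['rowkey'][0,8]
        event['@metadata']['index_suffix'] = event['rowkey'][0,6]
      "
    }
  }
}
Note that document_id => "%{rowkey}" would still come out as the literal string %{rowkey} for such events, so dropping them might be simpler.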
Super_L
(Super L)
January 13, 2017, 7:34am
10
Does that mean the field rowkey always exists?
system
(system)
Closed
February 10, 2017, 7:34am
11
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.