Shodan query returns an ugly result I can't seem to fix

Hi, I'm querying Shodan with some IP addresses. Every so often I get this message:
[2020-04-06T20:37:51,625][WARN ][logstash.outputs.elasticsearch][darkwebrdp-hospitals] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"darkwebrdp-hospitals", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x5e471612>], :response=>{"index"=>{"_index"=>"dw-clinic", "_type"=>"_doc", "_id"=>"sSk2UXEBT27AkRjbsxsZ", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [] of type [long] in document with id 'sSk2UXEBT27AkRjbsxsZ'. Preview of field's value: '5839668960810396903895068807565469154'", "caused_by"=>{"type"=>"i_o_exception", "reason"=>"Numeric value (5839668960810396903895068807565469154) out of range of long (-9223372036854775808 - 9223372036854775807)\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@21da2d84; line: 1, column: 11107]"}}}}}

I think I get it: the value in that field "" is too large for a long (it has 37 digits, while a long tops out at 9223372036854775807), so Elasticsearch rejects the document. Oddly, the field name in the error is empty.

I've tried converting it to a string, I've tried removing it entirely (I don't need it), I've tried replacing any value in it with "na", and I've tried replacing the value with 0. Nothing works.

I'm out of ideas; any suggestions would be appreciated.
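For reference, the attempts described above would normally look something like this in the filter block. The field name [problem_field] is a placeholder, since the actual field name is not visible in the error:

```
filter {
  # attempt 1: force the value to a string (hypothetical field name)
  mutate { convert => { "[problem_field]" => "string" } }

  # attempt 2: drop the field entirely
  mutate { remove_field => [ "[problem_field]" ] }

  # attempt 3: overwrite the value
  mutate { replace => { "[problem_field]" => "na" } }
}
```

If the real field name contains a period or the field is nested, none of these references will match it, which would explain why they appear to do nothing.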

It is possible that one of the field names contains a period. For example, a name like "foo.bar" is treated by Elasticsearch as an object "foo" with a sub-field "bar", so a mutate filter referencing the name you see may not match the field that actually exists in the event.
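If a period in a field name does turn out to be the culprit, one option (an assumption, not something confirmed in this thread) is the de_dot filter plugin, which renames fields whose names contain a dot:

```
filter {
  # de_dot renames fields whose names contain ".",
  # by default replacing the dot with "_"
  de_dot { }
}
```

The de_dot filter ships with Logstash but is relatively expensive, so it is usually better to fix the field names at the source if you can.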
I suggest you add

output { stdout { codec => rubydebug } }

which will let you see the actual structure of each event, including the exact field names.
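With that output in place, each event is printed in a readable form. A hypothetical event with a dotted field name might look like this (the field names here are illustrative, not from the thread):

```
{
      "@timestamp" => 2020-04-06T20:37:51.625Z,
              "ip" => "203.0.113.10",
    "shodan.field" => 5839668960810396903895068807565469154
}
```

Whatever field name appears there is the one to reference (or remove) in your mutate filter.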
