Hello,
I'm trying to parse JSON data that contains string values. In some documents the field has just a single value, like: 021965425836
In other documents the field contains several values separated by ",", like this:
0201923165000,0202000415001,0202302800000
The problem is that Logstash cannot index the documents whose value is comma-separated, and it gives back this error:
rror"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [deposit]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"For input string: "0201923165000,0202000415001,0202302800000""}}}}}
How can I fix that?
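For context, the two kinds of documents look roughly like this (only the relevant field is shown; its name is taken from the error above):

{ "deposit": "021965425836" }
{ "deposit": "0201923165000,0202000415001,0202302800000" }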
How should this data be represented in ES? Do you really want to store it as a string of comma-separated sequences of digits?
Actually, this data is part of a MongoDB collection. These digits are stored that way in the JSON documents in the collection, so I have to keep them separated by ",".
I'm not sure that really answers my question, but the problem is probably that the deposit field has been mapped as an integer, and an integer field can't store "0201923165000,0202000415001,0202302800000". You can use an index template to force the mapping of the deposit field to be a string. You'll have to reindex the data.
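For example, a minimal legacy index template could look like the sketch below. The template name and index pattern are assumptions — adjust them to whatever indices Logstash writes to — and on Elasticsearch 7.8+ you could use the composable _index_template API instead:

PUT _template/force-deposit-string
{
  "index_patterns": ["logstash-*"],
  "mappings": {
    "properties": {
      "deposit": { "type": "keyword" }
    }
  }
}

Here keyword is the usual choice for an exact-value string; use text if you need full-text search on it. A template only applies to indices created after it exists, which is why the existing data has to be reindexed into a new index. On Elasticsearch 6.x you would also need to wrap "properties" in a mapping type name.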
Could you give me an example of an index template?
Have you read the Elasticsearch documentation on index templates?
Yes, but there isn't anything there about templates for MongoDB.
You're looking for the wrong thing. Index templates configure, among other things, the fields, their data types, and other indexing-related properties. From where your data originates has absolutely nothing to do with any of that.
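If you want to confirm how the deposit field is currently mapped, you can check it directly with the field mapping API (the index name here is a placeholder for whatever index Logstash writes to):

GET your-index-name/_mapping/field/deposit

If it comes back as integer (or long), that confirms why the comma-separated value is rejected.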
Could you send me some links? I really appreciate your response.
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.