logstash 7.17.0
Elasticsearch 7.17.0
logstash with all default settings, fresh install
I use sudo systemctl start logstash to start it, and sudo systemctl status shows Logstash as active (running).
However, curl -XGET 'localhost:9600/?pretty' fails because it cannot connect to port 9600.
Running netstat shows nothing listening on port 9600 at all.
Sorry, no need for that. I believe rubydebug already logged the error; that has been taken care of.
Do you know how to map MongoDB data to Elasticsearch without messing up the document structure? Logstash parses (flattens) any object into string fields, e.g. user: {firstName: x} becomes user_firstName: x. Do you know of any way to keep the nested structure?
(Maybe I could copy and remove fields one by one in the filter section, but I believe that would be too tedious.)
Maybe I'll start a new topic?
Works like a charm, but the MongoDB ISODate is giving me trouble.
"caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2022-02-22 01:38:05 UTC] with format [strict_date_optional_time||epoch_millis]"
I have a field like created_at_iso: ISODate("2022-02-22T01:38:05.150Z"), and Logstash/Elasticsearch don't like it.
Do I need to change the format on the MongoDB side?
Or can I do it on the Logstash side (a date match in the filter section)?
Or can I do it on the Elasticsearch side (someone mentioned templating)?
And is it related to the re-parsing you were talking about?
"failed to parse date field [2022-02-22 01:38:05 UTC] with format [strict_date_optional_time||epoch_millis]"
strict_date_optional_time supports a date and an optional time. Examples: yyyy-MM-dd'T'HH:mm:ss.SSSZ or yyyy-MM-dd. It will fail to parse when it reaches the UTC at the end. You could try using mutate+gsub to remove it.
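For example, something along these lines (just a sketch; I am assuming the field is called created_at_iso, so adjust it to whatever your event actually contains):

filter {
  mutate {
    # remove the "UTC" suffix the mongodb input appends to ISODate values
    gsub => [ "created_at_iso", "UTC", "" ]
  }
}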
"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [user] tried to parse field [user] as object, but found a concrete value"
Yup, I gsub'd all the UTC suffixes to empty strings and it got past that error. That works!
However, it brings back the old problem: I have a user field that is an object, but it seems to be indexed as just a string, for example: "{\"firstName\"=>\"x\", \"lastName\"=>\"y\"}"
OK, that is what I was referring to when I wrote "you may then need to re-parse any hashes that it calls .to_s on". If you look at the code, "simple" parsing will call .to_s on any top-level field in the Mongo data except a Numeric, an Array, or the string "NaN". That converts an object to a string of JSON.
Is it a single field with a constant name? If so, just add a json filter to re-parse it.
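For example (just a sketch, assuming the stringified field is literally named user):

filter {
  json {
    source => "user"
    # without a target the parsed keys are added at the root of the event;
    # target writes the parsed object back into the user field instead
    target => "user"
  }
}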
Also, I see there is an open issue for the fact that it adds UTC instead of Z to an ISODate when it parses one. Nobody has updated the code in 5 years, so it is unlikely to change.
It is a one-level-deep object, but I got the same error.
The reason is different in the details, though.
I applied filter { json { source => "user" } }
It found the user field and tried to parse it, but the value seems to have already been modified by Logstash or something else, just like the example above: "{\"firstName\"=>\"x\", \"lastName\"=>\"y\"}"
So the parser complains: ParserError: Unexpected character ('=' (code 61)): was expecting a colon to separate field name and value
Is there any step I can do before the json filter?
"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [user] tried to parse field [user] as object, but found a concrete value"}
Weird, the same error again, and the => did get converted to :.
As far as I can tell from the parsed document, it moved all four fields (email, firstName, lastName, userId) to the top level, and kept the user field as a string like "{"firstName":"a", "lastName":"b", "email":"c", "username":"d"}". It complains about this user field holding a string value instead of an object. I am confused.
So I assume it parsed the user object, spread the fields into the top level, and somehow kept a JSON string in the actual user field.
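For reference, the extra step I added before the json filter looks roughly like this (writing it from memory, so treat it as a sketch):

filter {
  mutate {
    # turn the ruby-hash style "=>" separators into ":" so the string is valid JSON
    gsub => [ "user", "=>", ":" ]
  }
  json {
    source => "user"
  }
}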
Cool! But created_at_iso is still giving trouble. Why isn't Logstash showing all the errors at once? I guess it's a fail-safe.
Anyway, now it's "failed to parse date field [2022-02-23 19:09:15 ] with format [strict_date_optional_time||epoch_millis]". We got rid of UTC, so that seems to be out of the way.
Now the whitespace between the date and the time seems to be the problem (not sure, but it looks like it).
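I guess I will try something like this next in the filter section (just a sketch, and I am not sure the date pattern is exactly right):

filter {
  mutate {
    # trim the trailing space that is left behind after removing "UTC"
    strip => [ "created_at_iso" ]
  }
  date {
    # parse the space-separated timestamp and store it back as an ISO8601 value
    match => [ "created_at_iso", "yyyy-MM-dd HH:mm:ss" ]
    target => "created_at_iso"
    timezone => "UTC"
  }
}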