Cannot connect to localhost:9600 for Logstash

Logstash 7.17.0
Elasticsearch 7.17.0
Logstash with all default settings, fresh install.
I start it with sudo systemctl start logstash, and sudo systemctl status shows Logstash as actively running.

However, curl -XGET 'localhost:9600/?pretty' reports that it cannot connect to port 9600.
Running netstat, it seems nothing is listening on port 9600 at all.

Please advise

Check the Logstash logs in /var/log/logstash to see whether Logstash is running at all.

Add to logstash.yml:

config.debug: true
log.level: debug

Sorry, no need for that. I believe rubydebug had already logged the error, so that part is taken care of.
Do you know how to map MongoDB data to Elasticsearch without messing up the document structure? Logstash parses (flattens) any object into string keys, e.g. user: {firstName: x} becomes user_firstName: x. Do you know any way to keep the structure?
(Maybe copy and remove fields one by one in the filter, but I believe that is too tedious.)
Maybe I should start a new topic?

On a mongodb input, the default for the parse_method option is "flatten". You may want "simple" instead.

Although you may then need to re-parse any hashes that it calls .to_s on.
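
Something like this, just as a sketch (the uri, placeholder_db_dir, and collection values here are placeholders for whatever you actually use; parse_method is the relevant bit):

input {
    mongodb {
        uri => "mongodb://localhost:27017/mydb"    # placeholder connection string
        placeholder_db_dir => "/opt/logstash-mongodb"
        collection => "users"
        parse_method => "simple"                   # the default is "flatten"
    }
}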

Works like a charm, but the MongoDB ISODate is giving me trouble:

"caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2022-02-22 01:38:05 UTC] with format [strict_date_optional_time||epoch_millis]"

I have a field like created_at_iso: ISODate("2022-02-22T01:38:05.150Z"), and Logstash/Elasticsearch don't like it.
Do I need to change the format on the MongoDB side?
Or can I do it on the Logstash side (a date match in the filter section)?
Or can I do it on the Elasticsearch side (someone mentioned templating)?
And is it related to the re-parsing you were talking about?

"failed to parse date field [2022-02-22 01:38:05 UTC] with format [strict_date_optional_time||epoch_millis]"

strict_date_optional_time supports a date and an optional time. Examples: yyyy-MM-dd'T'HH:mm:ss.SSSZ or yyyy-MM-dd. It will fail to parse when it reaches the UTC at the end. You could try using mutate+gsub to remove it.
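
A minimal sketch, using the created_at_iso field from your message:

mutate {
    gsub => [ "created_at_iso", " UTC", "" ]    # strip the trailing " UTC" so the date can parse
}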

"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [user] tried to parse field [user] as object, but found a concrete value"

Yup, I gsub'd all the UTC strings to empty strings and it got past that; that works!
However, that brings back the old problem: I have a user field that is an object, but it seems to be parsed as just a string, for example: "{\"firstName\"=>\"x\", \"lastName\"=>\"y\"}"

OK, that is what I was referring to when I wrote "you may then need to re-parse any hashes that it calls .to_s on". If you look at the code, "simple" parsing will .to_s any top level fields in the Mongo data except a Numeric, an Array, or the string "NaN". That converts an object to a string of JSON.

Is it a single field with a constant name? If so, just add a json filter to re-parse it.
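
As a sketch, assuming the field is called user and actually holds a valid JSON string:

filter {
    json {
        source => "user"    # re-parse the stringified object back into fields
    }
}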

Also, I see there is an open issue for the fact that it adds UTC instead of Z to an ISODate when it parses one. Nobody has updated the code in 5 years, so it is unlikely to change.

It is a one-level-deep object, but I got the same error, only with a different reason in the details.
I applied filter { json { source => "user" } }.
It found the user field and tried to parse it, but the value seems to have already been modified by Logstash or something else, just like the example above: "{\"firstName\"=>\"x\", \"lastName\"=>\"y\"}"
So the parser complains: ParserError: Unexpected character ('=' (code 61)): was expecting a colon to separate field name and value
Is there any step I can take before that?

My filter so far:

filter {
        json {
                source => "user"
        }
        mutate {
                gsub => ["created_at_iso","UTC",""]
        }
        mutate {
                remove_field => ["log_entry"]
        }
}

@TomYang1993, that statement is incorrect. It uses "=>" to separate the key and value, not ":". You can fix this using mutate+gsub:

input { generator { count => 1 lines => [ '' ] } }
filter {
    mutate { add_field => { "[@metadata][b]" => "1" "[@metadata][c]" => "2" } }
    ruby { code => 'event.set("foo", event.get("@metadata").to_s)' }
    mutate { gsub => [ "foo", "=>", ":" ] }
    json { source => "foo" }
}

"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [user] tried to parse field [user] as object, but found a concrete value"}

Weird, the same error again, and the => did get parsed to :.

As far as I can tell from the parsed document, it moved all four fields (email, firstName, lastName, userId) to the top level, and kept the user field as a string like "{"firstName":"a", "lastName":"b", "email":"c", "username":"d"}". It complains about this user field having a string value instead of an object, so I am confused.

So I assume it parsed the user object, spread its fields into the top level, and somehow kept a JSON string in the actual user field.

filter {
        mutate {
                gsub => ["user", "=>",":"]
        }
        json {
                source => "user"
        }
        mutate {
                gsub => ["created_at_iso","UTC",""]
        }
        mutate {
                remove_field => ["log_entry"]
        }
}

You might need to change that to

json { source => "foo" target => "user" }

Worst case, mutate+rename user to a [@metadata] sub-field, then parse it with a json filter that targets [user]
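
Roughly like this (the [@metadata][user_raw] name is just an example):

mutate { rename => { "user" => "[@metadata][user_raw]" } }
mutate { gsub => [ "[@metadata][user_raw]", "=>", ":" ] }
json { source => "[@metadata][user_raw]" target => "user" }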

Cool! But created_at_iso is still giving me trouble. Why doesn't Logstash show all errors at once? I guess it's a fail-safe.
Anyway, now I get "failed to parse date field [2022-02-23 19:09:15 ] with format [strict_date_optional_time||epoch_millis]". We got rid of the UTC, so that seems to be out of the way.
Now the leftover whitespace seems to be giving me trouble (not sure, but it looks like it).

My updated filter:

filter {
        mutate {
                gsub => ["user", "=>",":"]
        }
        json {
                source => "user"
                target => "user"
        }
        mutate {
                gsub => ["created_at_iso","UTC|[\s]+$",""]
        }
        date {
                match => ["created_at_iso", "yyyy-MM-dd HH:mm:ss"]
                target => "created_at_iso"
        }
        mutate {
                remove_field => ["log_entry","_id"]
        }
}

I assume Logstash regular expressions are different? I come from a JS background.

That will globally replace either the string UTC or trailing whitespace, not both. Just use

gsub => [ "created_at_iso", " UTC", "" ]