Can't get text on a START_OBJECT at 1:330

Hi

I am taking over from a colleague but I don't have any experience yet with logstash and filters (still learning :))

Hopefully someone can help me.

Customer is doing this:

They send a JSON message. Without the tag it is OK; with the tag it is not. It is all key/value pairs, so it looks like a valid JSON message.
However, it is not being indexed. These are the filter and errors on our side:

Filter that is used:

cat 21_filter_licenses.conf
filter {
  if [type] == "nxlog" and "app-Genesys" in [tags] and "inx_license_usage_in" in [SourceModuleName] {
    json {
      source => "original"
      target => "l_info"
    }
    ruby {
      code => "
        h = event.get('[l_info]').to_hash
        h.each do |k,v|
          if k =~ /Usage/ && k !~ /^Inx/ || k =~ /Total/ || k == 'Environment'
            event.set('[c][customer][license]' + (k), (v))
          end
        end
      "
    }
    mutate {
      remove_field => [ "l_info" ]
    }
  }
}
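
A side note while reading the ruby block above: event.set is called with '[c][customer][license]' + (k), so each key is appended straight onto the field reference string, producing names like [c][customer][license]AgentSeatUsage. If the intent is a nested field such as c.customer.license.AgentSeatUsage, the key would normally get its own set of brackets. A minimal, untested sketch of that variant (same selection logic, purely illustrative):

ruby {
  code => "
    h = event.get('[l_info]').to_hash
    h.each do |k,v|
      if (k =~ /Usage/ && k !~ /^Inx/) || k =~ /Total/ || k == 'Environment'
        # key wrapped in its own brackets, e.g. [c][customer][license][AgentSeatUsage]
        event.set('[c][customer][license][' + k + ']', v)
      end
    end
  "
}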

errors:
[2018-07-30T08:29:52,883][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-genesys-2018.07.30", :_type=>"doc", :_routing=>nil},
#LogStash::Event:0x4354347a], :response=>{"index"=>{"_index"=>"logstash-genesys-2018.07.30", "_type"=>"doc", "_id"=>"oFjg6WQBiUbzVMJL-p1j", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception",
"reason"=>"failed to parse [c.customer.license]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:330"}}}}}

I am sorry, I have no idea where to look. But hopefully somebody can guide me here. Of course I am willing to provide more information.

The error is coming from logstash.outputs.elasticsearch, so it is happening when it tries to index the event. If you look at the Elasticsearch log file, the message there may be more informative.

like this?

[2018-07-30T14:23:50,561][DEBUG][o.e.a.b.TransportShardBulkAction] [logstash-genesys-2018.07.30][0] failed to execute bulk item (index) BulkShardRequest [[logstash-genesys-2018.07.30][0]] containing [index {[logstash-genesys-2018.07.30][doc][9ZUl62QBiUbzVMJLCngf], source[{"port":60020,"domain":"hosting.corp","tags":["app-Genesys"],"@version":"1","m":{"application":"","supportgroup":"","serviceimpact":"","servicecomponent":"","service":"","devicetype":["WINDOWS Server"],"customer":["Achmea.GITS"],"cmz":"","infrastructure":"","location":"","bubble":""},"SourceModuleName":"inx_license_usage_in","@timestamp":"2018-07-30T12:23:50.267Z","severity":"","hostname":"aisrv61721","original":"{"EventReceivedTime":"2018-07-30 14:23:50","SourceModuleName":"inx_license_usage_in","SourceModuleType":"im_file","Time":"2018-07-30T14:23:49.999","AgentSeatUsage":"3","AgentSeatTotal":"1120","EmailUsage":"1","EmailTotal":"480","ChatUsage":"1","ChatTotal":"120","SMSUsage":"0","SMSTotal":"0","OpenMediaUsage":"0","OpenMediaTotal":"520","Hostname":"aisrv61721.hosting.corp","Environment":"test","InxLicenseUsage":"INX Current license usage","Message":"2018-07-30T14:23:49.999 Trc 55017 Current license usage: agent seat: 3/1120, email: 1/480, chat: 1/120, SMS: 0/0, open media: 0/520","tags":"app-Genesys"}\r","c":{"emailtot":{"license":{"ChatTotal":"120","SMSUsage":"0","OpenMediaTotal":"520","OpenMediaUsage":"0","EmailTotal":"480","ChatUsage":"1","AgentSeatUsage":"3","Environment":"test","AgentSeatTotal":"1120","SMSTotal":"0","EmailUsage":"1"}}},"fqdn":"aisrv61721.hosting.corp","d":{"severity":{"debug":0,"notice":0,"informational":0,"warning":0,"emergency":0,"error":0,"critical":0,"alert":0}},"type":"nxlog","message":"2018-07-30T14:23:49.999 Trc 55017 Current license usage: agent seat: 3/1120, email: 1/480, chat: 1/120, SMS: 0/0, open media: 0/520"}]}]
org.elasticsearch.index.mapper.MapperParsingException: failed to parse [c.customer.license]
at org.elasticsearch.index.mapper.FieldMapper.parse(FieldMapper.java:302) ~[elasticsearch-6.2.1.jar:6.2.1]
.......................
at org.elasticsearch.xpack.security.transport.SecurityServerTransportInterceptor$ProfileSecuredRequestHandler.lambda$messageReceived$0(SecurityServerTransportInterceptor.java:307) ~[?:?]
at org.elasticsearch.action.ActionListener$1.onResponse(ActionListener.java:60) ~[elasticsearch-6.2.1.jar:6.2.1]
at org.elasticsearch.xpack.security.transport.ServerTransportFilter$NodeProfile.lambda$inbound$2(ServerTransportFilter.java:166) ~[?:?]
at org.elasticsearch.xpack.security.authz.AuthorizationUtils$AsyncAuthorizer.maybeRun(AuthorizationUtils.java:183) ~[?:?]
at org.elasticsearch.xpack.security.authz.AuthorizationUtils$AsyncAuthorizer.setRunAsRoles(AuthorizationUtils.java:177) ~[?:?]
at org.elasticsearch.xpack.security.authz.AuthorizationUtils$AsyncAuthorizer.authorize(AuthorizationUtils.java:165) ~[?:?]
at org.elasticsearch.xpack.security.transport.ServerTransportFilter$NodeProfile.lambda$inbound$3(ServerTransportFilter.java:168) ~[?:?]
at org.elasticsearch.action.ActionListener$1.onResponse(ActionListener.java:60) ~[elasticsearch-6.2.1.jar:6.2.1]
at org.elasticsearch.xpack.security.authc.AuthenticationService$Authenticator.lambda$authenticateAsync$2(AuthenticationService.java:184) ~[x-pack-security-6.2.1.jar:6.2.1]
at org.elasticsearch.xpack.security.authc.AuthenticationService$Authenticator.lambda$lookForExistingAuthentication$4(AuthenticationService.java:217) ~[x-pack-security-6.2.1.jar:6.2.1]
at org.elasticsearch.xpack.security.authc.AuthenticationService$Authenticator.lookForExistingAuthentication(AuthenticationService.java:228) [x-pack-security-6.2.1.jar:6.2.1]
at org.elasticsearch.xpack.security.authc.AuthenticationService$Authenticator.authenticateAsync(AuthenticationService.java:182) [x-pack-security-6.2.1.jar:6.2.1]
at org.elasticsearch.xpack.security.authc.AuthenticationService$Authenticator.access$000(AuthenticationService.java:143) [x-pack-security-6.2.1.jar:6.2.1]
at org.elasticsearch.xpack.security.authc.AuthenticationService.authenticate(AuthenticationService.java:113) [x-pack-security-6.2.1.jar:6.2.1]
at org.elasticsearch.xpack.security.transport.ServerTransportFilter$NodeProfile.inbound(ServerTransportFilter.java:142) [x-pack-security-6.2.1.jar:6.2.1]
at org.elasticsearch.xpack.security.transport.SecurityServerTransportInterceptor$ProfileSecuredRequestHandler.messageReceived(SecurityServerTransportInterceptor.java:314) [x-pack-security-6.2.1.jar:6.2.1]
at org.elasticsearch.transport.RequestHandlerRegistry.processMessageReceived(RequestHandlerRegistry.java:66) [elasticsearch-6.2.1.jar:6.2.1]
at org.elasticsearch.transport.TransportService$7.doRun(TransportService.java:656) [elasticsearch-6.2.1.jar:6.2.1]
at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:635) [elasticsearch-6.2.1.jar:6.2.1]
at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-6.2.1.jar:6.2.1]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_151]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_151]
at java.lang.Thread.run(Thread.java:748) [?:1.8.0_151]
Caused by: java.lang.IllegalStateException: Can't get text on a START_OBJECT at 1:1128
at org.elasticsearch.common.xcontent.json.JsonXContentParser.text(JsonXContentParser.java:85) ~[elasticsearch-6.2.1.jar:6.2.1]
at org.elasticsearch.common.xcontent.support.AbstractXContentParser.textOrNull(AbstractXContentParser.java:237) ~[elasticsearch-6.2.1.jar:6.2.1]
at org.elasticsearch.index.mapper.KeywordFieldMapper.parseCreateField(KeywordFieldMapper.java:315) ~[elasticsearch-6.2.1.jar:6.2.1]
at org.elasticsearch.index.mapper.FieldMapper.parse(FieldMapper.java:297) ~[elasticsearch-6.2.1.jar:6.2.1]
... 65 more

That's not valid JSON. You have double quotes embedded within a double quote delimited string.
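
To make that concrete with plain Ruby (hypothetical values, not the customer's actual event): when one JSON document is embedded as a string value inside another, the inner quotes have to be escaped, otherwise the outer document no longer parses.

require 'json'

# inner quotes escaped: the outer document is valid JSON
escaped = '{"original":"{\"Environment\":\"test\"}"}'
# inner quotes not escaped: the outer document is broken
unescaped = '{"original":"{"Environment":"test"}"}'

JSON.parse(escaped)     # => {"original"=>"{\"Environment\":\"test\"}"}
begin
  JSON.parse(unescaped)
rescue JSON::ParserError => e
  puts "parse error: #{e.message}"
end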

OK, sorry for asking: does the end user have to change it, or does it have to be changed in the filter files?
This is the original event (without the tag):
{"EventReceivedTime":"2018-07-30 09:43:09","SourceModuleName":"inx_license_usage_in","SourceModuleType":"im_file","Time":"2018-07-30T09:43:09.012","AgentSeatUsage":"4","AgentSeatTotal":"1120","EmailUsage":"0","EmailTotal":"480","ChatUsage":"0","ChatTotal":"120","SMSUsage":"0","SMSTotal":"0","OpenMediaUsage":"1","OpenMediaTotal":"520","Hostname":"aisrv61721.hosting.corp","Environment":"test","InxLicenseUsage":"INX Current license usage","Message":"2018-07-30T09:43:09.012 Trc 55017 Current license usage: agent seat: 4/1120, email: 0/480, chat: 0/120, SMS: 0/0, open media: 1/520"}

If I compare that to the filter the end user is using in the nxlog agent, it seems to run fine.

Exec if $raw_event =~ /^(\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}.\d{3}) \Trc 55017 Current license usage: agent seat: (\S+)/(\S+), email: (\S+)/(\S+), chat: (\S+)/(\S+), SMS: (\S+)/(\S+), open media: (\S+)/(\S+)/
{
    $Time = $1;
    $AgentSeatUsage = $2;
    $AgentSeatTotal = $3;
    $EmailUsage = $4;
    $EmailTotal = $5;
    $ChatUsage = $6;
    $ChatTotal = $7;
    $SMSUsage = $8;
    $SMSTotal = $9;
    $OpenMediaUsage = $10;
    $OpenMediaTotal = $11;
    $Hostname = "**********";
    $Environment = "test";
    $InxLicenseUsage = "INX Current license usage";
    $Message = $raw_event;
} \

If that were the original event it would not even be processed by that filter, and if it were it would get a ruby exception for trying to convert a non-existent field to a hash.
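
For what it's worth, that Ruby exception would look roughly like this in plain Ruby (a sketch, not taken from the actual logs): if the json filter never creates l_info, event.get('[l_info]') returns nil, and calling .to_hash on nil raises.

# nil stands in for what event.get('[l_info]') returns when the field was never created
h = nil
begin
  h.to_hash
rescue NoMethodError => e
  puts e.message   # prints the NoMethodError text (undefined method 'to_hash' for nil)
end
# a guard such as `if h.is_a?(Hash)` in the ruby block would skip the loop instead of raising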
