[ERROR][logstash.agent] Failed to execute action "Expected one of #, {, } at line 161, column 185 (byte 16646) after input {\r\n\tjdbc {\r\n\t\tjdbc_driver_library => \

Hi

I get this error message while trying to index data into Elasticsearch.
Does anyone have an idea what could cause this error?

[2019-10-04T17:00:02,272][ERROR][logstash.agent] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, {, } at line 161, column 185 (byte 16646) after input {\r\n\tjdbc {\r\n\t\tjdbc_driver_library => "C:\Elastic\logstash-7.2.0\vendor\oracle\ojdbc7.jar" #Change to your own Logstash\r\n\t\t\r\n\t\tjdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"\r\n\t\t\r\n\t\tjdbc_connection_string => "jdbc:oracle:thin:@source" #Change per DB\r\n\t\t\r\n\t\tjdbc_user => "USERNAME" #Change per DB\r\n\t\t\r\n\t\tjdbc_password => "PASSWORD" #Change per DB\r\n\t\t\r\n#\t\tschedule => "0 * * * *"
[2019-10-04T17:00:02,802][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-10-04T17:00:07,576][INFO ][logstash.runner ] Logstash shut down.

I use the following config file:

(I replaced the actual values with the following:
@source
jdbc_user => "USERNAME"
jdbc_password => "PASSWORD")


input {
  jdbc {
    jdbc_driver_library => "C:\Elastic\logstash-7.2.0\vendor\oracle\ojdbc7.jar" #Change to your own Logstash

    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"

    jdbc_connection_string => "jdbc:oracle:thin:@source" #Change per DB

    jdbc_user => "USERNAME" #Change per DB

    jdbc_password => "PASSWORD" #Change per DB

    schedule => "0 * * * *" #Every hour; starts at minute 0. Commenting it out makes it run immediately

    add_field => { "[@metadata][source_type]" => "laco-netteier-elwin-nett" } #Change per DB. Filename: User-Customer-Department-System-Purpose

    #Change
    statement => "
      --Grid owner - Detailed
      select
      --Metering point/Installation
        efm.maalepktnr as mpnr
      , efm.maalepunkt as ean
      from felles.v_gui_efmaalepunkt efm, kunde.v_ekkontrakt_admin ekk, kunde.ekkontrakthist ekkh, kunde.ekresept ekr, felles.efjurperson efj, felles.v_adresse_maalepunkt a, komponent.v_emkomponent emk, komponent.v_emkomptype emkt, tilsyn.v_tilsynsobjekt_admin tilsyn --, felles.v_adresse_jurperson adr, felles.v_adresse_reskontro err
      where 1=1
      and efm.maalepktnr = a.maalepktnr
      and efm.maalepktnr = ekk.maalepktnr(+)
      and ekk.kundenr = efj.jurpersonid(+)
      and efm.netteigar in(20001)
      and ekk.valstatus in('A')
      and ekk.kundenr = ekkh.kundenr(+)
      and ekk.kontraktnr = ekkh.kontraktnr(+)
      and ekkh.reseptnr = ekr.reseptnr(+)
      and ekkh.tarfradato <= sysdate
      and ekkh.tartildato > sysdate
      and ekr.reseptgrpnr in(20) --20. Grid tariff
      and efm.maalepktnr = emk.maalepktnr(+)
      and emk.komptypenr = emkt.komptypenr(+)
      and emk.komptypeid(+) in(1)
      and emk.plasseringid(+) in(1)
      and efm.anleggnr = tilsyn.anleggnr(+)
      and a.fradato = (select max(a1.fradato) from felles.v_adresse_maalepunkt a1 where a.maalepktnr = a1.maalepktnr)
      and (efm.mpgroupingid is null or efm.mpgroupingid not in(225)) --225. AMS Nettnytte NS
      --and efm.maalepktnr in(44565)
      and rownum < 10
      order by 1,2
    "
  }
}

filter {
  if [@metadata][source_type] == "laco-netteier-elwin-nett" { #Change per DB/filename
  }
}

output {
  if [@metadata][source_type] == "laco-netteier-elwin-nett" { #Change per DB/filename
    elasticsearch {
      hosts => ["https://4b9343dad4744941a7e38ed0f30b6237.eu-west-1.aws.found.io:9243"]
      user => "XXXXXXX"
      password => "XXXXXXXXXXX"
      index => "%{[@metadata][source_type]}" #Change
      document_id => "%{kundenr} %{kontraktnr}" #Change: unique ID
    }
  }

  stdout {
    #codec => rubydebug #On = shows every single column from the query as a JSON object in the cmd window, which is easier for testing/verification
    codec => dots #On = shows each column from the query as a dot, which is easier when running in production
  }
}

Something seems fishy in your input block. Try to offload your SQL Statement into a separate file:

https://www.elastic.co/guide/en/logstash/current/plugins-inputs-jdbc.html#plugins-inputs-jdbc-statement_filepath

And remove as many comments/unused characters as possible.
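For reference, this is roughly what the input block could look like with the statement moved out; the path C:\Elastic\logstash-7.2.0\config\netteier.sql is just a placeholder, and the .sql file should contain only the bare query (no statement => wrapper, no surrounding quotes):

input {
  jdbc {
    jdbc_driver_library => "C:\Elastic\logstash-7.2.0\vendor\oracle\ojdbc7.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    jdbc_connection_string => "jdbc:oracle:thin:@source"
    jdbc_user => "USERNAME"
    jdbc_password => "PASSWORD"
    schedule => "0 * * * *"
    add_field => { "[@metadata][source_type]" => "laco-netteier-elwin-nett" }
    #Points to a plain-text file that contains nothing but the SQL query
    statement_filepath => "C:\Elastic\logstash-7.2.0\config\netteier.sql"
  }
}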

Thanks a lot! I will try to remove all the comments and unused characters!

The SQL query runs and returns the expected result when I run it in the SQL Developer tool.

This is entirely possible; Logstash could have a problem with a character in the query itself (sorry I can't tell exactly which one, since the query lost its formatting in your initial post). Logstash may then fail to parse your config file.

Removing the query from the config file and placing it into a separate file should solve this and make your config waaaay more readable.

Let us know if you succeed! :slightly_smiling_face:

Hi!
Thanks again
I dug into the error message from Logstash together with the SQL query and found one column/field that caused the problem.

The output contains several error messages like this:
error_message=>""\xBA" from ASCII-8BIT to UTF-8"
error_message=>""\xAF" from ASCII-8BIT to UTF-8"
error_message=>""\xCD" from ASCII-8BIT to UTF-8"

Googling these error messages suggested a problem with converting the content of one of the columns between character encodings, or with reading that column at all.
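(Side note: the jdbc input plugin also documents charset and columns_charset options for exactly this kind of conversion problem, for columns that hold text in a non-UTF-8 encoding. A minimal sketch, where some_text_column is a made-up column name:)

input {
  jdbc {
    #...driver, connection and statement settings as before...
    #Treat this (hypothetical) column as ISO-8859-1 text so it is converted to UTF-8 instead of failing
    columns_charset => { "some_text_column" => "ISO-8859-1" }
  }
}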

I remembered that my query contains a column with a long, unique sequence of numbers and letters, e.g. "928BEE3A365D4D80B43CD88B9DB8B759"... I removed this column (commented it out) from the SQL query, and it ran nicely, indexing the result into Elasticsearch.
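If that column is ever needed in the index, one possible workaround (untested here; efm.guid_col is a made-up column name) would be to convert the raw value to a plain hex string in the query itself with Oracle's RAWTOHEX, so the JDBC driver never hands Logstash bytes it cannot decode:

input {
  jdbc {
    #...driver and connection settings as before...
    statement => "
      select
        efm.maalepktnr as mpnr
      , efm.maalepunkt as ean
      , rawtohex(efm.guid_col) as guid_txt --hypothetical RAW column returned as an ASCII hex string
      from felles.v_gui_efmaalepunkt efm
      where rownum < 10
    "
  }
}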

