Logstash Oracle setup - Cannot create pipeline


(Bharat) #1

Hi there, I am getting the below error in the Logstash logs. Can someone please help?

[2017-07-10T19:06:10,405][ERROR][logstash.agent ] Cannot create pipeline {:reason=>"Expected one of #, => at line 15, column 11 (byte 546) after output {\n file {\n stdout "}
[2017-07-10T19:06:17,780][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-07-10T19:06:17,884][INFO ][logstash.pipeline ] Pipeline main started
[2017-07-10T19:06:17,967][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-07-10T19:06:28,826][WARN ][logstash.runner ] SIGTERM received. Shutting down the agent.
[2017-07-10T19:06:28,829][WARN ][logstash.agent ] stopping pipeline {:id=>"main"}
[2017-07-10T19:06:48,776][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-07-10T19:06:48,864][INFO ][logstash.pipeline ] Pipeline main started
[2017-07-10T19:06:48,931][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Below is my config file; I'm not sure where my query output is going.

input {
  jdbc {
    jdbc_validate_connection => true
    jdbc_connection_string => "MyConnectionString"
    jdbc_user => "MyUserName"
    jdbc_password => "MyPassword"
    jdbc_driver_library => "opt/jdbc/lib/ojdbc7.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    statement => "SELECT * FROM CUSTOMER.CUSTOMER WHERE FULL_NAME = 'First Last'"
    schedule => "*/2 * * * *"
  }
}
output {
  stdout { codec => json }
}


(Magnus Bäck) #2

Somewhere in the configuration that Logstash attempts to load there's something that looks like this:

output {
  file {
    stdout

Perhaps you have additional files in /etc/logstash/conf.d. Logstash reads all files there.
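For comparison, here is a sketch of what a valid output section with both plugins might look like. `stdout` is its own output plugin and cannot be nested inside `file`; a `file` output requires a `path` setting (the path below is just a placeholder):

```conf
output {
  # stdout is a separate plugin, not a setting of file
  stdout { codec => rubydebug }
  # file needs an explicit path (placeholder path, adjust to your system)
  file {
    path => "/tmp/logstash-output.log"
  }
}
```

You can also check a single configuration file without starting the pipeline, using `bin/logstash -f /path/to/one.conf --config.test_and_exit`, to find which file in conf.d contains the syntax error.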


(Bharat) #3

Thank you magnusbaeck....


(Bharat) #4

Thank you Magnus... I was able to get the Oracle data into Kibana.

When I view the logs in Kibana, I see that a few of the fields are logging junk data, and I'm not sure why. Upon checking, I found that the fields with the Oracle RAW datatype are the ones for which junk data is being captured in Kibana.

This makes reading difficult. Do you know what can be done here?

Below is some of the junk data being logged in Kibana, for reference:

w:_° ¦†PV½"
w:_°@¦†PV½"
w:_°'Ц†PV½"
3 yßNmD«àS:
Ý
w:e¯ÉÀ›œPV½3œ
w:_°"a°¦†PV½"

This is my configuration file...

input {
  jdbc {
    jdbc_validate_connection => true
    jdbc_connection_string => "MyConnectionString"
    jdbc_user => "MyUserName"
    jdbc_password => "MyPassword"
    jdbc_driver_library => "opt/jdbc/lib/ojdbc7.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    statement => "SELECT * FROM interaction.interaction_line"
    schedule => "0 0 17 * * *"
  }
}
output {
  elasticsearch {
    hosts => ["MyELSServer:9200"]
    manage_template => true
    index => "<interaction.interaction_line-{now/d}>"
  }
}
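The junk characters are what binary RAW column values look like when they are treated as text. One common workaround is to convert RAW columns to a readable hex string in the query itself with Oracle's `RAWTOHEX` function, instead of using `SELECT *`. A sketch, assuming hypothetical column names (`interaction_id`, `payload_raw`) that you would replace with your actual schema:

```conf
input {
  jdbc {
    # ... connection settings as above ...
    # Select columns explicitly; RAWTOHEX turns the binary RAW value
    # into a hex string that indexes and displays cleanly in Kibana.
    # Column names here are placeholders for your real schema.
    statement => "SELECT interaction_id, RAWTOHEX(payload_raw) AS payload_hex FROM interaction.interaction_line"
  }
}
```

If the RAW columns carry no useful information for search, simply omitting them from the SELECT list is an even simpler fix.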


(system) #5

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.