Logstash / Kibana

Not able to load data from MySQL to Elasticsearch using Logstash?

Can you please formulate your request more precisely?


I am getting this error:

C:\ELK\logstash-7.0.1\bin>logstash -f C:\ELK\logstash-7.0.1\config\lat_long_geopoint.conf
Sending Logstash logs to C:/ELK/logstash-7.0.1/logs which is now configured via log4j2.properties
[2019-05-29T16:48:15,407][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-05-29T16:48:15,422][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.0.1"}
[2019-05-29T16:48:19,544][ERROR][logstash.outputs.elasticsearch] Unknown setting '"document_type"' for elasticsearch
[2019-05-29T16:48:19,545][ERROR][logstash.outputs.elasticsearch] Unknown setting '"hosts"' for elasticsearch
[2019-05-29T16:48:19,546][ERROR][logstash.outputs.elasticsearch] Unknown setting '"index"' for elasticsearch
[2019-05-29T16:48:19,553][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Something is wrong with your configuration.", :backtrace=>["C:/ELK/logstash-7.0.1/logstash-core/lib/logstash/config/mixin.rb:86:in config_init'", "C:/ELK/logstash-7.0.1/logstash-core/lib/logstash/outputs/base.rb:60:ininitialize'", "org/logstash/config/ir/compiler/OutputStrategyExt.java:232:in initialize'", "org/logstash/config/ir/compiler/OutputDelegatorExt.java:48:ininitialize'", "org/logstash/config/ir/compiler/OutputDelegatorExt.java:30:in initialize'", "org/logstash/plugins/PluginFactoryExt.java:239:inplugin'", "org/logstash/plugins/PluginFactoryExt.java:137:in buildOutput'", "org/logstash/execution/JavaBasePipelineExt.java:50:ininitialize'", "C:/ELK/logstash-7.0.1/logstash-core/lib/logstash/java_pipeline.rb:23:in initialize'", "C:/ELK/logstash-7.0.1/logstash-core/lib/logstash/pipeline_action/create.rb:36:inexecute'", "C:/ELK/logstash-7.0.1/logstash-core/lib/logstash/agent.rb:325:in `block in converge_state'"]}
[2019-05-29T16:48:19,780][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-05-29T16:48:24,661][INFO ][logstash.runner ] Logstash shut down.

Some parameters in your Logstash configuration are invalid. Can you please post your Logstash config file?

This is my .conf file:

input {
    jdbc {
        jdbc_connection_string => "jdbc:mysql://localhost:3306/test"
        # The user we wish to execute our statement as
        jdbc_user => "root"
        jdbc_password => "XXXXXXXXX"
        # The path to our downloaded jdbc driver
        jdbc_driver_library => "C:/ELK/logstash-7.0.1/bin/mysql-connector-java-8.0.16.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        # our query
        statement => "SELECT * FROM locn"
    }
}
output {
    stdout { codec => json_lines }
    elasticsearch {
        "hosts" => "localhost:9200"
        "index" => "locn"
        "document_type" => "data"
    }
}

I am able to load data into Elasticsearch using Hive.
I want to plot a route using latitude and longitude data, which are in separate columns of my .csv file and stored in Hive.
But I am not able to convert that latitude and longitude into a geo_point.

You shouldn't put quotes around hosts, index and document_type.
Try this instead:
elasticsearch {
    hosts => "localhost:9200"
    index => "locn"
    document_type => "data"
}

They are option names (keys), not strings.

Now I get an error like this:

Sending Logstash logs to C:/ELK/logstash-7.0.1/logs which is now configured via log4j2.properties
[2019-05-29T17:23:30,267][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-05-29T17:23:30,284][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.0.1"}
[2019-05-29T17:23:31,185][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, => at line 17, column 17 (byte 532) after output {\r\n stdout { codec => json_lines }\r\n elasticsearch {\r\n elasticsearch ", :backtrace=>["C:/ELK/logstash-7.0.1/logstash-core/lib/logstash/compiler.rb:41:in compile_imperative'", "C:/ELK/logstash-7.0.1/logstash-core/lib/logstash/compiler.rb:49:incompile_graph'", "C:/ELK/logstash-7.0.1/logstash-core/lib/logstash/compiler.rb:11:in block in compile_sources'", "org/jruby/RubyArray.java:2577:inmap'", "C:/ELK/logstash-7.0.1/logstash-core/lib/logstash/compiler.rb:10:in compile_sources'", "org/logstash/execution/AbstractPipelineExt.java:151:ininitialize'", "org/logstash/execution/JavaBasePipelineExt.java:47:in initialize'", "C:/ELK/logstash-7.0.1/logstash-core/lib/logstash/java_pipeline.rb:23:ininitialize'", "C:/ELK/logstash-7.0.1/logstash-core/lib/logstash/pipeline_action/create.rb:36:in execute'", "C:/ELK/logstash-7.0.1/logstash-core/lib/logstash/agent.rb:325:inblock in converge_state'"]}
[2019-05-29T17:23:31,467][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-05-29T17:23:36,335][INFO ][logstash.runner ] Logstash shut down.

Now it appears you have

output {
    stdout { codec => json_lines }
    elasticsearch {
        elasticsearch
            [...]

You should have:

output {
    stdout { codec => json_lines }
    elasticsearch {
        hosts => "localhost:9200"
        index => "locn"
        document_type => "data"
    }
}

The previous errors are solved.
Can you tell me how to convert latitude & longitude into a geo_point?
I have created a table in Hive with two columns, latitude and longitude.
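For the geo_point question, a common approach (a sketch, assuming your columns are named latitude and longitude and your index is locn; adjust the names to your schema) is to map the target field as geo_point before indexing, then combine the two columns into one field in a Logstash filter:

    PUT locn
    {
        "mappings": {
            "properties": {
                "location": { "type": "geo_point" }
            }
        }
    }

    filter {
        mutate {
            # geo_point accepts a "lat,lon" string
            add_field => { "location" => "%{latitude},%{longitude}" }
        }
    }

With that mapping in place, Elasticsearch parses the "lat,lon" string into a geo_point that Kibana's map visualizations can use. Note the mapping must exist before the first document is indexed, otherwise the field is dynamically mapped as text.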

I am getting this error:

[2019-05-29T18:12:02,888][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"locn", :_type=>"data", :routing=>nil}, #LogStash::Event:0x36630b61], :response=>{"index"=>{"_index"=>"locn", "_type"=>"data", "_id"=>"LAycA2sBdDeincm_jMXo", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Rejecting mapping update to [locn] as the final mapping would have more than 1 type: [_doc, data]"}}}}
[2019-05-29T18:12:03,813][INFO ][logstash.runner ] Logstash shut down.

It seems you are trying to push documents with mapping type data into your index (locn), which already has mapping type _doc. Elasticsearch allows only one mapping type per index.
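In Elasticsearch 7.x there is only the single default type _doc, and document_type is deprecated, so the simplest fix (a sketch based on the config above) is to drop document_type from the output entirely:

    output {
        stdout { codec => json_lines }
        elasticsearch {
            hosts => "localhost:9200"
            index => "locn"
        }
    }

Logstash then indexes into the default _doc type, which matches the existing mapping.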

locn is the index that I have created in Kibana.

I have removed document_type and now it is mapped correctly in Kibana.
Thanks, buddy!

Glad it helped you

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.