I have a JSON file with the below information:
{"apiId":210966762,"apiVersionId":15552430,"orgId":"eeb3ccb6-a2f4-4c7b-9459-7202183bce03","hostId":"mule.qa.2","receivedTs":"2019-01-28T23:59:54.691-05:00","repliedTs":"2019-01-28T23:59:56.406-05:00"}
I need to push this information into Elasticsearch so that I can view it in Kibana.
I am trying to do that using Logstash with the below config file (logstash.conf):
input {
  file {
    port => 5044
    path => "e:\logs*"
    type => "json"
    start_position => "beginning"
    codec => "json"
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => "localhost:9200"
    index => "mulesoft-log-mon-%{+YYYY.MM.dd}"
  }
}
But I am getting the below error when I run Logstash:
PS E:\softwares\ElasticLogstash6.2.3> .\bin\logstash.bat -f logstash.conf
Sending Logstash's logs to E:/softwares/ElasticLogstash6.2.3/logs which is now configured via log4j2.properties
[2019-03-05T16:54:09,394][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"E:/softwares/ElasticLogstash6.2.3/modules/fb_apache/configuration"}
[2019-03-05T16:54:09,440][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"E:/softwares/ElasticLogstash6.2.3/modules/netflow/configuration"}
[2019-03-05T16:54:10,167][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-03-05T16:54:11,461][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.3"}
[2019-03-05T16:54:12,840][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-03-05T16:54:16,388][ERROR][logstash.inputs.file ] Unknown setting 'port' for file
[2019-03-05T16:54:16,466][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Something is wrong with your configuration.", :backtrace=>["E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/config/mixin.rb:89:in `config_init'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/inputs/base.rb:62:in `initialize'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/plugins/plugin_factory.rb:89:in `plugin'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/pipeline.rb:112:in `plugin'", "(eval):8:in `<eval>'", "org/jruby/RubyKernel.java:994:in `eval'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/pipeline.rb:84:in `initialize'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/pipeline.rb:169:in `initialize'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/agent.rb:315:in `block in converge_state'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/agent.rb:312:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/agent.rb:299:in `converge_state'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/agent.rb:90:in `execute'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/runner.rb:348:in `block in execute'", "E:/softwares/ElasticLogstash6.2.3/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
PS E:\softwares\ElasticLogstash6.2.3>
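From the "Unknown setting 'port' for file" line, my guess is that port is not a valid option for the file input (I think it belongs to network inputs such as tcp or beats, but I am not sure). Is something like the sketch below, with port removed, closer to what the file input expects? This is only a guess on my part:

input {
  file {
    # port removed here; I believe it is only meant for network inputs like tcp/beats
    path => "e:\logs*"    # also not sure whether backslashes are OK here on Windows or if forward slashes are required
    type => "json"
    start_position => "beginning"
    codec => "json"
  }
}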
Am I missing something?
Do I need a grok filter to read the JSON content, or is the json codec on the file input enough?
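For example, I was wondering whether a json filter like the one below would be needed instead of (or in addition to) the json codec on the input. This is just a guess, not something I have verified:

filter {
  json {
    # parse the JSON text in the message field into top-level fields
    source => "message"
  }
}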
I am doing this on a Windows machine to set things up in my local environment.
I would appreciate your help.