Error when reading log files using Logstash

I have a JSON file with the below information:
{"apiId":210966762,"apiVersionId":15552430,"orgId":"eeb3ccb6-a2f4-4c7b-9459-7202183bce03","hostId":"mule.qa.2","receivedTs":"2019-01-28T23:59:54.691-05:00","repliedTs":"2019-01-28T23:59:56.406-05:00"}

I need to get this information into Kibana.

I am trying to do it using Logstash with the below config file (filename: logstash.conf):
input {
  file {
    port => 5044
    path => "e:\logs*"
    type => "json"
    start_position => "beginning"
    codec => "json"
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => "localhost:9200"
    index => "mulesoft-log-mon-%{+YYYY.MM.dd}"
  }
}

But I get the below error when I run Logstash:
PS E:\softwares\ElasticLogstash6.2.3> .\bin\logstash.bat -f logstash.conf
Sending Logstash's logs to E:/softwares/ElasticLogstash6.2.3/logs which is now configured via log4j2.properties
[2019-03-05T16:54:09,394][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"E:/softwares/ElasticLogstash6.2.3/modules/fb_apache/configuration"}
[2019-03-05T16:54:09,440][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"E:/softwares/ElasticLogstash6.2.3/modules/netflow/configuration"}
[2019-03-05T16:54:10,167][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-03-05T16:54:11,461][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.3"}
[2019-03-05T16:54:12,840][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-03-05T16:54:16,388][ERROR][logstash.inputs.file ] Unknown setting 'port' for file
[2019-03-05T16:54:16,466][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Something is wrong with your configuration.", :backtrace=>["E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/config/mixin.rb:89:in `config_init'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/inputs/base.rb:62:in `initialize'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/plugins/plugin_factory.rb:89:in `plugin'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/pipeline.rb:112:in `plugin'", "(eval):8:in `<eval>'", "org/jruby/RubyKernel.java:994:in `eval'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/pipeline.rb:84:in `initialize'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/pipeline.rb:169:in `initialize'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/pipeline_action/create.rb:40:in `execute'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/agent.rb:315:in `block in converge_state'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/agent.rb:312:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/agent.rb:299:in `converge_state'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/agent.rb:90:in `execute'", "E:/softwares/ElasticLogstash6.2.3/logstash-core/lib/logstash/runner.rb:348:in `block in execute'",
"E:/softwares/ElasticLogstash6.2.3/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
PS E:\softwares\ElasticLogstash6.2.3>

Am I missing something?
Do we need a grok filter to read JSON content?

I am doing this on a Windows machine to set things up in my local environment.

I appreciate your help/support.

You are reading log files from a local path, so there is no need to pass a port in the file input. Remove that setting and try again.
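With the port line removed, the input block would look something like this (keeping your other settings exactly as they are; the port option only applies to network inputs such as tcp or beats):

```conf
input {
  file {
    path => "e:\logs*"
    type => "json"
    start_position => "beginning"
    codec => "json"
  }
}
```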

Hi Ganesh,

I have removed the port. The following is my conf file:
input {
  file {
    path => "e:\logs*.log"
    type => "json"
    start_position => "beginning"
    codec => "json"
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => "localhost:9200"
    index => "mulesoft-log-mon-%{+YYYY.MM.dd}"
  }
}

Now I am not getting the error, but I don't see the logs getting populated/processed.

The following is the console output. It is stuck at this point, sitting idle without any error:
PS E:\softwares\ElasticLogstash6.2.3> .\bin\logstash.bat -f logstash.conf
Sending Logstash's logs to E:/softwares/ElasticLogstash6.2.3/logs which is now configured via log4j2.properties
[2019-03-05T17:40:21,885][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"E:/softwares/ElasticLogstash6.2.3/modules/fb_apache/configuration"}
[2019-03-05T17:40:21,916][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"E:/softwares/ElasticLogstash6.2.3/modules/netflow/configuration"}
[2019-03-05T17:40:22,213][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-03-05T17:40:23,244][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.3"}
[2019-03-05T17:40:24,097][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-03-05T17:40:30,142][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-03-05T17:40:30,913][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-03-05T17:40:30,928][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2019-03-05T17:40:31,219][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-03-05T17:40:31,344][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2019-03-05T17:40:31,360][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2019-03-05T17:40:31,376][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2019-03-05T17:40:31,422][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2019-03-05T17:40:31,500][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/logstash
[2019-03-05T17:40:33,521][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-03-05T17:40:34,637][INFO ][logstash.pipeline ] Pipeline started succesfully {:pipeline_id=>"main", :thread=>"#<Thread:0x6a7e0b00 run>"}
[2019-03-05T17:40:34,795][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}

Have you checked in Elasticsearch whether the index got created or not?

The index is not created.

Use forward slashes instead of backslashes in the path.
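On Windows the file input's path option should be written with forward slashes. Assuming your logs live in a directory like e:\logs, the path would look something like this (the sincedb_path line is optional; since the file input remembers what it has already read, setting it to "NUL" while testing forces Logstash to re-read the files from the beginning on each restart):

```conf
input {
  file {
    # Forward slashes, even on Windows:
    path => "e:/logs/*.log"
    type => "json"
    start_position => "beginning"
    codec => "json"
    # Optional while testing on Windows, to ignore previously-read state:
    sincedb_path => "NUL"
  }
}
```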

Hi Ganesh & Badger,

Thanks a ton for your responses. The index got created and the data made it into Kibana.

Thanks a lot for your time 🙂

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.