Passing data from file through Logstash to Kibana

Hello, I am new to ELK and have been trying for a few weeks to get things to work. I am using Elasticsearch 6.

I have JSON files being added to /var/local/ that have been structured for Elasticsearch.

Sample JSON file:

[{"index": {"_type": "json_file", "_index": "customer-2017-12-19"}}][{"index": {"_index": "customer-2017-12-19", "_type": "json_file"}},{"date": "12-19-2017", "datetime": "12-19-2017 21:36:45", "time": "21:36:45", "Count": "0.5"},{"date": "12-19-2017", "datetime": "12-19-2017 22:09:26", "time": "22:09:26", "Count": "0.5"}]

My input conf file, /etc/logstash/conf.d/02-json-input.conf, contains:

input {
  file {
    path => "/var/local/*.json"
    type => "json"    # a type to identify those logs (will need this later)
    start_position => "beginning"
  }
}

I do not have a filter conf file, as all the JSON files are structured the same way and were created with Elasticsearch in mind (not sure if a filter is required).
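Would adding a json codec to the file input be the right way to get each event parsed as JSON? Something like this (just a guess on my part):

input {
  file {
    path => "/var/local/*.json"
    type => "json"
    start_position => "beginning"
    codec => "json"    # guessing this is needed so the lines are parsed as JSON rather than plain text
  }
}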

My output conf file, /etc/logstash/conf.d/30-elasticsearch-output.conf, contains:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "%{[@metadata]}-%{+YYYY.MM.dd}"
    document_type => "%{json}"
  }
}

I have tried starting Logstash with:
bin/logstash -f logstash-simple.conf --path.settings /etc/logstash
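
I assume I can at least check the config for syntax errors before starting, something like this (please correct me if that flag is wrong):

bin/logstash -f logstash-simple.conf --config.test_and_exit --path.settings /etc/logstash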

My logstash-simple.conf file contains:

input {
  file {
    path => [ "/var/local/*.json" ]
    type => "json"
    start_position => "beginning"
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => json }
}
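
I have also seen examples that use the rubydebug codec on stdout for more readable console output while debugging; I assume I could swap it in like this:

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }    # pretty-prints each event to the console for debugging
}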

If I run bin/logstash -e 'input { stdin { } } output { stdout {} }' from /usr/share/logstash and type 'hello world', I get:

2017-12-29T00:02:12.867Z brooklinhomehardwareca hello world

The contents of /var/log/logstash-plain.log are:

[2017-12-28T00:15:38,607][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2017-12-28T00:15:38,633][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2017-12-28T00:15:39,243][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-12-28T00:15:39,262][FATAL][logstash.runner ] Logstash could not be started because there is already another instance using the configured data directory. If you wish to run multiple instances, you must change the "path.data" setting.
[2017-12-28T00:27:34,800][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2017-12-28T00:27:34,813][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2017-12-28T00:27:35,753][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-12-28T00:27:35,775][FATAL][logstash.runner ] Logstash could not be started because there is already another instance using the configured data directory. If you wish to run multiple instances, you must change the "path.data" setting.
[2017-12-28T00:59:41,740][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2017-12-28T00:59:41,749][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2017-12-28T00:59:42,408][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-12-28T00:59:42,431][FATAL][logstash.runner ] Logstash could not be started because there is already another instance using the configured data directory. If you wish to run multiple instances, you must change the "path.data" setting.
[2017-12-28T01:08:16,002][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2017-12-28T01:08:16,012][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2017-12-28T01:08:16,679][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-12-28T01:08:16,698][FATAL][logstash.runner ] Logstash could not be started because there is already another instance using the configured data directory. If you wish to run multiple instances, you must change the "path.data" setting.
[2017-12-28T01:22:04,961][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2017-12-28T01:22:04,967][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2017-12-28T01:22:05,589][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-12-28T01:22:05,613][FATAL][logstash.runner ] Logstash could not be started because there is already another instance using the configured data directory. If you wish to run multiple instances, you must change the "path.data" setting.

Any help would be greatly appreciated.

FYI we’ve renamed ELK to the Elastic Stack, otherwise Beats and APM feel left out! :wink:

You should remove this (the document_type => "%{json}" setting), it's going to cause problems in the future.
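
That is, drop the document_type line entirely and rely on the default. Roughly something like this (a sketch only; the literal index name here is just an example, use whatever naming you actually want):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => false
    index => "customer-%{+YYYY.MM.dd}"    # example literal index name for testing; adjust as needed
  }
}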

Have you checked for multiple Logstash processes already running?
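
For example (assuming a systemd-based package install; adjust to your setup):

ps aux | grep logstash      # look for more than one running Logstash/java process
systemctl status logstash   # see whether the Logstash service is already running in the background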

Hello Mark, thanks for your help. I will follow suit and respect the Elastic Stack. I have removed document_type => "%{json}". I have looked in htop and it appears there is only one instance running.

More data does seem to have been passed to Kibana, but nothing meaningful. It seems something has created a my_mapping.json file in my /var/local/ folder, and that file was passed through. Its contents are:

{
  "error" : {
    "root_cause" : [
      {
        "type" : "index_not_found_exception",
        "reason" : "no such index",
        "resource.type" : "index_or_alias",
        "resource.id" : "my_index",
        "index_uuid" : "na",
        "index" : "my_index"
      }
    ],
    "type" : "index_not_found_exception",
    "reason" : "no such index",
    "resource.type" : "index_or_alias",
    "resource.id" : "my_index",
    "index_uuid" : "na",
    "index" : "my_index"
  },
  "status" : 404
}

However, my JSON files are still not getting through. Thanks again for your help.

Just to update the thread, I had run this command:

head -50 /var/local/counter-12-21-2017.json | /usr/share/logstash/bin/logstash -f logstash-simple.conf

and a file was sent, but it was not the one I specified... it had sent counter-12-19-2017.json instead.

Not exactly sure what is going on there...
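
If piping data in is the right approach, I assume the config would need a stdin input rather than (or in addition to) the file input, something like:

input {
  stdin { codec => json }    # read the piped JSON from standard input
}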
