./scripts/import_dashboards: No such file or directory

I installed Filebeat 6, but I can't find ./scripts/import_dashboards.

Please assist. The only script I found is:
migrate_beat_config_1_x_to_5_0.py

See the Breaking Changes doc for 6.0. The Beat itself does the importing now.
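As a rough sketch (the exact flags can differ between 6.x releases), the dashboards are now loaded by running the Beat's own setup command against your config, for example:

filebeat setup --dashboards -c /etc/filebeat/filebeat.yml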

Thank you
Please send a complete guide for ELK 6.
We have servers for:
1. Elasticsearch
2. Kibana

Where should I install Logstash and the Beats (Filebeat, Packetbeat, Backbeat)?

Each Beat has its own set of documentation. You can have a look at the Getting Started section in each Beat's docs. The docs are located here. You'll want to click the "other versions" link and select 6.0.0-beta2.


Thanks. Do I need to install the Beats on the source server, or on the Kibana/Elasticsearch server?

You install the Beat on the server you want to monitor. It sends the data to Elasticsearch, which is usually on a different server (but most people run them all on the same machine when they are getting started and learning).
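As a minimal sketch (the host below is a placeholder, not from this thread), the filebeat.yml on the monitored server would point its output at the remote Elasticsearch VM:

output.elasticsearch:
  hosts: ["<elasticsearch-host>:9200"]
  username: "elastic"
  password: "changeme"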

Thank you.

Hi,
my Filebeat config file is /etc/filebeat/filebeat.yml

This command:

./filebeat setup -E "output.elasticsearch.username=elastic" -E "output.elasticsearch.password=changeme"

produces
filebeat2017/09/19 08:00:25.187670 beat.go:629: CRIT Exiting: error loading config file: stat filebeat.yml: no such file or directory
Exiting: error loading config file: stat filebeat.yml: no such file or directory

./filebeat setup -c /etc/filebeat/filebeat.yml -E "output.elasticsearch.username=elastic" -E "output.elasticsearch.password=changeme"
Exiting: Error loading Elasticsearch template: error creating template from file /usr/share/filebeat/bin/fields.yml: open /usr/share/filebeat/bin/fields.yml: no such file or directory

How did you install Filebeat? Was it via RPM, deb, or tar.gz?

Yes, I used RPM.

If it was via RPM or deb use /usr/bin/filebeat (or just filebeat since it's on your PATH) to execute Filebeat. This ensures that the path.* config options are set correctly.

filebeat setup -c /etc/filebeat/filebeat.yml -E "output.elasticsearch.username=elastic" -E "output.elasticsearch.password=changeme"
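As an optional sanity check (assuming the test subcommand is available in your Filebeat version), you can also verify that the config file and paths resolve correctly:

filebeat test config -c /etc/filebeat/filebeat.yml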

Thank you very much.
I managed to import the dashboards for Filebeat and Metricbeat.

Packetbeat is giving the error below:

packetbeat setup -c /etc/packetbeat/packetbeat.yml -E "output.elasticsearch.username=elastic" -E "output.elasticsearch.password=changeme"
Loaded index template
Exiting: Error importing Kibana dashboards: fail to create the Kibana loader: Error creating Kibana client: fail to get the Kibana version:HTTP GET request to /api/status fails: fail to execute the HTTP GET request: Get http://localhost:5601/api/status: dial tcp 127.0.0.1:5601: getsockopt: connection refused. Response: .

It is OK now. I added the following to the config:

setup.kibana:
  host: "myIP:5601"

Is Kibana running locally? With ES and Kibana 6.x the dashboards are installed through an API in Kibana rather than by writing directly to the .kibana index in Elasticsearch. This means Kibana needs to be up and running. If it's on a different host you can specify that using setup.kibana.host.
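For example (a sketch; replace the placeholder with your real Kibana address), the Kibana host can also be passed on the command line alongside the credentials:

packetbeat setup -c /etc/packetbeat/packetbeat.yml -E "setup.kibana.host=<kibana-host>:5601" -E "output.elasticsearch.username=elastic" -E "output.elasticsearch.password=changeme"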

Kibana is running on its own VM.
ES is running on its own VM.

Do I need to install the Beats listed below?

https://www.elastic.co/guide/en/beats/libbeat/current/community-beats.html

Hi,

Please advise on the request below.

The generic error from the instance:

Basically, all errors appear in the x object; if it is empty and statuscode is 0, the request was successful.

"responseSeverity": "W" = warning
"responseSeverity": "S" = severe

We need to report on these severity values in ELK and send a daily or weekly report by email, or display them on a dashboard.

Sounds like you need to do parsing of the log messages. In the Elastic stack, we usually recommend Logstash's Grok filter for parsing.
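As a rough sketch only (the log layout and field names here are assumptions, since the actual log format isn't shown in this thread), a Logstash filter that pulls the severity into its own field could look something like this:

filter {
  grok {
    # extract the W/S value from a line containing "responseSeverity": "W"
    match => { "message" => "\"responseSeverity\": \"%{WORD:response_severity}\"" }
  }
}

You could then filter or aggregate on response_severity in Kibana dashboards.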

Thank you. Do I need to use the Logstash output instead?
How do I install the Grok filter?
Does the Grok filter do alerts as well?

I am getting this error:

bin/logstash-plugin install logstash-filter-grok
OpenJDK 64-Bit Server VM warning: If the number of processors is expected to increase from one, then you should configure the number of parallel GC threads appropriately using -XX:ParallelGCThreads=N
URI::InvalidURIError: bad URI(is not URI?): 22.150.32.85:3128
split at /usr/share/logstash/vendor/jruby/lib/ruby/1.9/uri/common.rb:176
parse at /usr/share/logstash/vendor/jruby/lib/ruby/1.9/uri/common.rb:210
parse at /usr/share/logstash/vendor/jruby/lib/ruby/1.9/uri/common.rb:747
URI at /usr/share/logstash/vendor/jruby/lib/ruby/1.9/uri/common.rb:994
extract_proxy_values_from_uri at /usr/share/logstash/lib/pluginmanager/proxy_support.rb:47
configure_proxy at /usr/share/logstash/lib/pluginmanager/proxy_support.rb:69
(root) at /usr/share/logstash/lib/pluginmanager/main.rb:26