Error importing URL/file: Failed to import index-pattern: : error loading /tmp/tmp401063577/beats-dashboards-5.4.0/filebeat/index-pattern/filebeat.json: couldn't load json. Error: 401 Unauthorized

I have installed a new version of ELK (5.5.0) along with X-Pack. Next, I tried to configure the Filebeat index, and while importing the dashboards I get the error message below:

Error importing URL/file: Failed to import index-pattern: Failed to load directory /tmp/tmp401063577/beats-dashboards-5.4.0/filebeat/index-pattern:
error loading /tmp/tmp401063577/beats-dashboards-5.4.0/filebeat/index-pattern/filebeat.json: couldn't load json. Error: 401 Unauthorized

I am also unable to load the default index pattern (filebeat) in the Kibana dashboard. I am not sure what is wrong with my configuration. Please help; I am tired of searching for a solution.

@sandy007 A 401 means the request is unauthorized.

Try importing the dashboards with credentials:

./scripts/import_dashboards -es http://X.X.X.X:9200 -user elastic -pass elastic
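
If you want to double-check the credentials on their own first, a quick curl against ES should come back with HTTP 200 and the cluster banner (a sketch, assuming the default elastic user and your node at X.X.X.X:9200):

# Valid credentials return the cluster banner JSON
curl -u elastic:elastic http://X.X.X.X:9200/
# The same request without -u returns the 401 Unauthorized you saw in the import
curl http://X.X.X.X:9200/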

Excellent! Much appreciated, thank you!

The dashboards imported successfully now, but I am still not able to set the default index pattern in Kibana. How do I get this resolved?

Attached is the screenshot of my Kibana

@sandy007 You have to load the index template into ES before anything shows up in Kibana.

Try this:

curl -u elastic:elastic -H 'Content-Type: application/json' -XPUT 'http://localhost:9200/_template/filebeat' -d@/etc/filebeat/filebeat.template.json
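
To confirm the template actually landed, you can read it back from the same endpoint:

curl -u elastic:elastic -XGET 'http://localhost:9200/_template/filebeat?pretty'

If the response comes back empty, the template was not loaded.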

Hi Parth,

I tried loading the template with that command, but still no luck. Do I need to make changes to my filebeat.yml file?

Thanks

@sandy007 Check your Filebeat logs and ES logs, and verify that your filebeat.yml is valid YAML if you have edited it (http://www.yamllint.com/).
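
Filebeat can also validate its own configuration; in 5.x there is a -configtest flag (flag name from memory, so double-check with filebeat -h):

filebeat -c /etc/filebeat/filebeat.yml -configtest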

It says the YAML is valid.

Parth,

Below is the output:


filebeat.modules: ~
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/*.log

output.elasticsearch:
  hosts:
    - "myip:9200"

That's all you have under your filebeat.yml? Can you paste your entire filebeat.yml?

Your YAML should include at least Elasticsearch or Logstash as an output.
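
For reference, a minimal filebeat.yml with an Elasticsearch output looks roughly like this (host and credentials are placeholders; the username/password lines matter once X-Pack security is enabled):

filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/*.log

output.elasticsearch:
  hosts: ["myip:9200"]
  username: "elastic"
  password: "changeme"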

After pasting the full file, I get the output below:


filebeat.modules: ~
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/*.log

logging.files:
  path: /var/log/filebeat
logging.to_files: true

output.elasticsearch:
  hosts:
    - "myip:9200"

Do I need to paste the entire file here? It's a huge one.

Paste what's not commented out. Any logs would also be helpful.

Here are the changes in my Filebeat config; I only configured the Elasticsearch output:

#-------------------------- Elasticsearch output -------------------------------
output.elasticsearch:
  # Boolean flag to enable or disable the output module.
  #enabled: true

  # Array of hosts to connect to.
  # Scheme and port can be left out and will be set to the default (http and 9200).
  # In case you specify an additional path, the scheme is required: http://localhost:9200/path
  # IPv6 addresses should always be defined as: https://[2001:db8::1]:9200
  hosts: ["myip:9200"]

  # Set gzip compression level.
  #compression_level: 0

  # Optional protocol and basic auth credentials.
  protocol: "https"
  #username: "elastic"
  #password: "changeme"

  # Dictionary of HTTP parameters to pass within the url with index operations.
  #parameters:
    #param1: value1
    #param2: value2

  # Number of workers per Elasticsearch host.
  #worker: 1

  # Optional index name. The default is "filebeat" plus date
  # and generates [filebeat-]YYYY.MM.DD keys.
  index: "filebeat-%{+yyyy.MM.dd}"

  # Optional ingest node pipeline. By default no pipeline will be used.
  #pipeline: ""

  # Optional HTTP path.
  #path: "/elasticsearch"

  # Custom HTTP headers to add to each request.
  #headers:
  #  X-My-Header: Contents of the header

  # Proxy server URL.
  #proxy_url: http://proxy:3128

  # The number of times a particular Elasticsearch index operation is attempted. If
  # the indexing operation doesn't succeed after this many retries, the events are
  # dropped. The default is 3.
  #max_retries: 3

  # The maximum number of events to bulk in a single Elasticsearch bulk API index request.
  # The default is 50.
  #bulk_max_size: 50

  # Configure HTTP request timeout before failing a request to Elasticsearch.
  #timeout: 90

  # The number of seconds to wait for new events between two bulk API index requests.
  # If bulk_max_size is reached before this interval expires, additional bulk index
  # requests are made.
  #flush_interval: 1s

  # A template is used to set the mapping in Elasticsearch.
  # By default template loading is enabled and the template is loaded.
  # These settings can be adjusted to load your own template or overwrite existing ones.

  # Set to false to disable template loading.
  #template.enabled: true

  # Template name. By default the template name is filebeat.
  template.name: "filebeat"

  # Path to template file.
  template.path: "${path.config}/filebeat.template.json"

  # Overwrite existing template.
  template.overwrite: false

  # If set to true, filebeat checks the Elasticsearch version at connect time, and if it
  # is 2.x, it loads the file specified by the template.versions.2x.path setting. The
  # default is true.
  #template.versions.2x.enabled: true

  # Path to the Elasticsearch 2.x version of the template file.
  #template.versions.2x.path: "${path.config}/filebeat.template-es2x.json"

  # If set to true, filebeat checks the Elasticsearch version at connect time, and if it
  # is 6.x, it loads the file specified by the template.versions.6x.path setting. The
  # default is true.
  #template.versions.6x.enabled: true

  # Path to the Elasticsearch 6.x version of the template file.
  #template.versions.6x.path: "${path.config}/filebeat.template-es6x.json"

  # Use SSL settings for HTTPS. Default is true.
  #ssl.enabled: true

  # Configure SSL verification mode. If "none" is configured, all server hosts
  # and certificates will be accepted. In this mode, SSL-based connections are
  # susceptible to man-in-the-middle attacks. Use only for testing. Default is "full".
  #ssl.verification_mode: full

  # List of supported/valid TLS versions. By default all TLS versions 1.0 up to
  # 1.2 are enabled.
  #ssl.supported_protocols: [TLSv1.0, TLSv1.1, TLSv1.2]

  # SSL configuration. By default it is off.
  # List of root certificates for HTTPS server verification.
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication.
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client certificate key.
  #ssl.key: "/etc/pki/client/cert.key"

  # Optional passphrase for decrypting the certificate key.
  #ssl.key_passphrase: ''

  # Configure cipher suites to be used for SSL connections.
  #ssl.cipher_suites: []

  # Configure curve types for ECDHE-based cipher suites.
  #ssl.curve_types: []

#----------------------------- Logstash output ---------------------------------
#output.logstash:
  # Boolean flag to enable or disable the output module.
  #enabled: true

  # The Logstash hosts.
  #hosts: ["localhost:5044"]

  # Number of workers per Logstash host.
  #worker: 1

  # Set gzip compression level.
  #compression_level: 3

  # Optionally load-balance the events between the Logstash hosts.
  #loadbalance: true

  # Number of batches to be sent asynchronously to Logstash while processing
  # new batches.
  #pipelining: 0

  # Optional index name. The default index name is set to the name of the beat
  # in all lowercase.
  #index: 'filebeat'

  # SOCKS5 proxy server URL.
  #proxy_url: socks5://user:password@socks5-server:2233

  # Resolve names locally when using a proxy server. Defaults to false.
  #proxy_use_local_resolver: false

  # Enable SSL support. SSL is automatically enabled if any SSL setting is set.
  #ssl.enabled: true

  # Configure SSL verification mode. If "none" is configured, all server hosts
  # and certificates will be accepted. In this mode, SSL-based connections are
  # susceptible to man-in-the-middle attacks. Use only for testing. Default is "full".
  #ssl.verification_mode: full

  # List of supported/valid TLS versions. By default all TLS versions 1.0 up to
  # 1.2 are enabled.
  #ssl.supported_protocols: [TLSv1.0, TLSv1.1, TLSv1.2]

  # Optional SSL configuration options. SSL is off by default.
  # List of root certificates for HTTPS server verification.
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication.
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client certificate key.
  #ssl.key: "/etc/pki/client/cert.key"

  # Optional passphrase for decrypting the certificate key.
  #ssl.key_passphrase: ''

  # Configure cipher suites to be used for SSL connections.
  #ssl.cipher_suites: []

  # Configure curve types for ECDHE-based cipher suites.
  #ssl.curve_types: []

Here is my Filebeat prospector configuration:

#########################################################
#=========================== Filebeat prospectors =============================

# List of prospectors to fetch data.
filebeat.prospectors:

# Each - is a prospector. Most options can be set at the prospector level, so
# you can use different prospectors for various configurations.
# Below are the prospector-specific configurations.

# Type of the files. Based on this, the way the file is read is decided.
# The different types cannot be mixed in one prospector.
# Possible options are:
# * log: Reads every line of the log file (default)
# * stdin: Reads the standard in

#------------------------------ Log prospector --------------------------------
- input_type: log

  # Paths that should be crawled and fetched. Glob-based paths.
  # To fetch all ".log" files from a specific level of subdirectories,
  # /var/log/*/*.log can be used.
  # For each file found under this path, a harvester is started.
  # Make sure no file is defined twice, as this can lead to unexpected behaviour.
  paths:
    - /var/log/*.log
    - /var/log/filebeat/*.log
    - /var/log/elasticsearch/logs/*.log
    - /var/log/httpd/access_log-*
    - /var/log/secure-*
    - /var/log/messages
    - /var/log/boot.log
    #- c:\programdata\elasticsearch\logs\*

  # Configure the file encoding for reading files with international characters,
  # following the W3C recommendation for HTML5 (http://www.w3.org/TR/encoding).
  # Some sample encodings:
  #   plain, utf-8, utf-16be-bom, utf-16be, utf-16le, big5, gb18030, gbk,
  #   hz-gb-2312, euc-kr, euc-jp, iso-2022-jp, shift-jis, ...
  #encoding: plain

  # Exclude lines. A list of regular expressions to match. It drops the lines that
  # match any regular expression from the list. include_lines is applied before
  # exclude_lines. By default, no lines are dropped.
  #exclude_lines: ["^DBG"]

  # Include lines. A list of regular expressions to match. It exports the lines that
  # match any regular expression from the list. include_lines is applied before
  # exclude_lines. By default, all lines are exported.
  #include_lines: ["^ERR", "^WARN"]

  # Exclude files. A list of regular expressions to match. Filebeat drops the files
  # that match any regular expression from the list. By default, no files are dropped.
  exclude_files: [".gz$"]

  # Optional additional fields. These fields can be freely picked
  # to add additional information to the crawled log files for filtering.
  #fields:
  #  level: debug
  #  review: 1

  # Set to true to store the additional fields as top-level fields instead
  # of under the "fields" sub-dictionary. In case of name conflicts with the
  # fields added by Filebeat itself, the custom fields overwrite the default
  # fields.
  #fields_under_root: false

  # Ignore files which were modified more than the defined timespan in the past.
  # ignore_older is disabled by default, so no files are ignored by setting it to 0.
  # Time strings like 2h (2 hours) and 5m (5 minutes) can be used.
  #ignore_older: 0

  # Type to be published in the 'type' field. For Elasticsearch output,
  # the type defines the document type these entries should be stored
  # in. Default: log
  #document_type: log

  # How often the prospector checks for new files in the paths that are specified
  # for harvesting. Specify 1s to scan the directory as frequently as possible
  # without causing Filebeat to scan too frequently. Default: 10s.
  #scan_frequency: 10s

  # Defines the buffer size every harvester uses when fetching the file.
  #harvester_buffer_size: 16384

  # Maximum number of bytes a single log event can have.
  # All bytes after max_bytes are discarded and not sent. The default is 10MB.
  # This is especially useful for multiline log messages, which can get large.
  #max_bytes: 10485760
#########################################################

Hello Parth,

Below are the error messages from the Filebeat logs:

2017-08-18T21:58:18+05:30 INFO Non-zero metrics in the last 30s: libbeat.es.publish.read_bytes=393 libbeat.es.publish.write_bytes=123
2017-08-18T21:58:48+05:30 INFO No non-zero metrics in the last 30s
2017-08-18T21:58:56+05:30 ERR Connecting error publishing events (retrying): 401 Unauthorized
2017-08-18T21:59:18+05:30 INFO Non-zero metrics in the last 30s: libbeat.es.publish.read_bytes=393 libbeat.es.publish.write_bytes=123
2017-08-18T21:59:48+05:30 INFO No non-zero metrics in the last 30s
2017-08-18T21:59:56+05:30 ERR Connecting error publishing events (retrying): 401 Unauthorized

Kindly help me resolve this issue.

@sandy007 Can you upgrade the Beats dashboards to match your ELK stack version? I see that you are running ELK v5.5.0 but Beats dashboards v5.4.0.
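
You can confirm the installed version on the shipper host with:

filebeat -version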

Sure, Parth.

Do I need to update it through yum or rpm, or is there another easy way to get it done?

@sandy007 https://www.elastic.co/guide/en/beats/libbeat/5.5/upgrading-minor-versions.html
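
If you installed Filebeat from the Elastic yum repository, a minor upgrade should just be (a sketch, assuming the repo is already configured):

sudo yum update filebeat
sudo systemctl restart filebeat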

Before upgrading can you also paste your logstash pipeline config?

Hi Parth,

I did not configure Logstash for this; I am sending the logs directly to Elasticsearch, as given in the filebeat YAML above.

There's still a 401. Are you sure you have the default ES credentials, elastic/changeme?
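
If X-Pack security is on, those credentials also have to be set in the filebeat.yml output section; in the file you pasted, username and password are still commented out while protocol is set to "https". Roughly (placeholders; use https only if ES is actually serving TLS):

output.elasticsearch:
  hosts: ["myip:9200"]
  username: "elastic"
  password: "changeme"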

Yes, Parth.

Here is the output from the new Beats dashboards import:

Import search /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/search/Apache2-access-logs.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/Apache2-browsers.json
Import search /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/search/Apache2-access-logs.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/Apache2-operating-systems.json
Import search /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/search/Apache2-access-logs.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/Apache2-error-logs-over-time.json
Import search /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/search/Apache2-errors-log.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/Apache2-response-codes-over-time.json
Import search /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/search/Apache2-access-logs.json
Import search /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/search/Apache2-errors-log.json
Import dashboard /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/dashboard/Filebeat-MySQL-Dashboard.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/MySQL-slowest-queries.json
Import search /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/search/Filebeat-MySQL-Slow-log.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/MySQL-Slow-queries-over-time.json
Import search /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/search/Filebeat-MySQL-Slow-log.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/MySQL-error-logs.json
Import search /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/search/Filebeat-MySQL-error-log.json
Import search /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/search/Filebeat-MySQL-error-log.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/MySQL-Error-logs-levels.json
Import search /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/search/Filebeat-MySQL-error-log.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/MySQL-Slow-logs-by-count.json
Import search /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/search/Filebeat-MySQL-Slow-log.json
Import dashboard /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/dashboard/Filebeat-Nginx-Dashboard.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/Errors-over-time.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/Nginx-Access-Browsers.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/Nginx-Access-OSes.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/New-Visualization.json
Import search /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/search/Filebeat-Nginx-module.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/Nginx-Access-Response-codes-by-top-URLs.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/Sent-sizes.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/Nginx-Access-Map.json
Import search /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/search/Filebeat-Nginx-module.json
Import dashboard /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/dashboard/Filebeat-syslog-dashboard.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/Syslog-events-by-hostname.json
Import search /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/search/Syslog-system-logs.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/Syslog-hostnames-and-processes.json
Import search /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/search/Syslog-system-logs.json
Import search /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/search/Syslog-system-logs.json
Import dashboard /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/dashboard/dfbb49f0-0a0f-11e7-8a62-2d05eaaac5cb.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/6295bdd0-0a0e-11e7-825f-6748cda7d858.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/5ebdbe50-0a0f-11e7-825f-6748cda7d858.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/2bb0fa70-0a11-11e7-9e84-43da493ad0c7.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/d1726930-0a7f-11e7-8b04-eb22a5669f27.json
Import visualization /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/visualization/c5411910-0a87-11e7-8b04-eb22a5669f27.json
Import search /tmp/tmp030684442/beats-dashboards-5.5.0/filebeat/search/4ac0a370-0a11-11e7-8b04-eb22a5669f27.json
Importing Kibana from /tmp/tmp030684442/beats-dashboards-5.5.0/heartbeat
Importing Kibana from /tmp/tmp030684442/beats-dashboards-5.5.0/metricbeat
Importing Kibana from /tmp/tmp030684442/beats-dashboards-5.5.0/packetbeat
Importing Kibana from /tmp/tmp030684442/beats-dashboards-5.5.0/winlogbeat