Filebeat system modules

Hi, I am trying to run the Filebeat system module. The index filebeat-yyyy.mm.dd is getting created in Elasticsearch, but after configuring the filebeat-* index pattern in Kibana I am not able to view the pre-built Kibana visualizations for the module.

I am running this command for filebeat system module:

sudo service filebeat start -modules=system -setup

I am not able to understand where I am wrong. Any help would be appreciated. Thanks:)

Have you checked whether parsed system events are actually available in your index? Try to find some events in the Kibana Discover tab by filtering for fileset.module: system.

I tried fileset.module: system in the Discover tab and I am getting "No results found". This is the structure that is getting stored in the Elasticsearch index after running the system module.

I have checked pipeline.json under the ingest folder and it says it will store GeoIP information too, but nothing is getting stored here.

These are plain logs. Nothing is parsed.

Which filebeat version are you using?

Which time range did you use to search for logs (the default is the last 15 minutes)? And which timezone do your machines run in? Try increasing the range to multiple years and see if you can find any logs with fileset.module: system.

Please share your complete filebeat config file.

Plus the output of filebeat modules list (assuming you're running a 6.x release).

I am using Filebeat version 5.5.2.

I searched logs for yesterday because that's when I transferred the logs. My machine is running in UTC timezone.

This is the config in the filebeat.yml file that I am using.

#=========================== Filebeat prospectors =============================
filebeat.prospectors:
# Each - is a prospector. Most options can be set at the prospector level, so
# you can use different prospectors for various configurations.
# Below are the prospector specific configurations.

- input_type: log

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/auth.log
    #- c:\programdata\elasticsearch\logs\*

#-------------------------- Elasticsearch output ------------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["xxx.xxx.x.xxx:xxxx"]

  # Optional protocol and basic auth credentials.
  #protocol: "https"
  username: "xxxxxx"
  password: "xxxxxxxx"

Then I am running this command to start the Filebeat modules:

sudo service filebeat start -modules=system -setup

I tried running filebeat modules list but I am getting a "filebeat: command not found" error.

Why do you run this: sudo service filebeat start -modules=system -setup

Depending on the system, this uses init.d or systemd. Parameters should be passed only when running Filebeat in the foreground. If you follow the tutorial, you will learn to put the modules into the Filebeat config; this is required when running Filebeat as a daemon/service.
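On 5.x, putting the module into the config looks roughly like this sketch in filebeat.yml (the fileset settings shown are the defaults and are assumed here):

```yaml
# Enable the system module directly in filebeat.yml (Filebeat 5.x style),
# so the service picks it up without command-line flags.
filebeat.modules:
- module: system
  syslog:
    enabled: true
  auth:
    enabled: true
```

With the module in the config, the service can be started normally (sudo service filebeat start) without -modules or -setup flags.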

Module loading/enabling support has been much improved in 6.x. Commands like filebeat modules list, filebeat modules enable, and others are only available in 6.x.

OK, I will try to follow the tutorial and will let you know if it works. Thank you!

@steffens I have tried following the same tutorial for filebeat modules but it is not sending any log messages to elasticsearch index.

As the tutorial said to comment out the prospectors in the filebeat.yml file, I did so, but it still did not help.

In fact, it gave me the error Exiting: No prospectors defined. What files do you want me to watch?
I am stuck on this part and have no idea what I am doing wrong.

I am stuck with this as well.

cdlentgf9db01:~ # cd /etc/filebeat/
cdlentgf9db01:/etc/filebeat # /usr/share/filebeat/bin/filebeat modules enable system
Module system doesn't exists!
cdlentgf9db01:/etc/filebeat # /usr/share/filebeat/bin/filebeat modules list
Enabled:

Disabled:
cdlentgf9db01:/etc/filebeat # /usr/share/filebeat/bin/filebeat -e setup
2018-03-30T09:59:36.648-0400 INFO instance/beat.go:468 Home path: [/usr/share/filebeat/bin] Config path: [/usr/share/filebeat/bin] Data path: [/usr/share/filebeat/bin/data] Logs path: [/usr/share/filebeat/bin/logs]
2018-03-30T09:59:36.648-0400 DEBUG [beat] instance/beat.go:495 Beat metadata path: /usr/share/filebeat/bin/data/meta.json
2018-03-30T09:59:36.648-0400 INFO instance/beat.go:475 Beat UUID: 3af1ea81-d191-4b13-9b4e-ca9220b816af
2018-03-30T09:59:36.648-0400 INFO instance/beat.go:213 Setup Beat: filebeat; Version: 6.2.3
2018-03-30T09:59:36.648-0400 DEBUG [beat] instance/beat.go:230 Initializing output plugins
2018-03-30T09:59:36.648-0400 DEBUG [processors] processors/processor.go:49 Processors:
2018-03-30T09:59:36.648-0400 INFO elasticsearch/client.go:145 Elasticsearch url: http://cdlelk01.es.ad.adp.com:9200
2018-03-30T09:59:36.648-0400 INFO pipeline/module.go:76 Beat name: cdlentgf9db01
2018-03-30T09:59:36.649-0400 ERROR fileset/modules.go:95 Not loading modules. Module directory not found: /usr/share/filebeat/bin/module
2018-03-30T09:59:36.649-0400 INFO elasticsearch/client.go:145 Elasticsearch url: http://cdlelk01.es.ad.adp.com:9200
2018-03-30T09:59:36.649-0400 DEBUG [elasticsearch] elasticsearch/client.go:666 ES Ping(url=http://cdlelk01.es.ad.adp.com:9200)
2018-03-30T09:59:36.655-0400 DEBUG [elasticsearch] elasticsearch/client.go:689 Ping status code: 200
2018-03-30T09:59:36.655-0400 INFO elasticsearch/client.go:690 Connected to Elasticsearch version 6.0.0
2018-03-30T09:59:36.655-0400 DEBUG [elasticsearch] elasticsearch/client.go:708 HEAD http://cdlelk01.es.ad.adp.com:9200/_template/filebeat-6.2.3
2018-03-30T09:59:36.657-0400 INFO template/load.go:73 Template already exists and will not be overwritten.
Loaded index template
2018-03-30T09:59:36.658-0400 INFO elasticsearch/client.go:145 Elasticsearch url: http://cdlelk01.es.ad.adp.com:9200
2018-03-30T09:59:36.658-0400 DEBUG [elasticsearch] elasticsearch/client.go:666 ES Ping(url=http://cdlelk01.es.ad.adp.com:9200)
2018-03-30T09:59:36.661-0400 DEBUG [elasticsearch] elasticsearch/client.go:689 Ping status code: 200
2018-03-30T09:59:36.661-0400 INFO elasticsearch/client.go:690 Connected to Elasticsearch version 6.0.0
2018-03-30T09:59:36.662-0400 DEBUG [dashboards] dashboards/es_loader.go:309 Initialize the Elasticsearch 6.0.0 loader
2018-03-30T09:59:36.662-0400 DEBUG [dashboards] dashboards/es_loader.go:309 Elasticsearch URL http://cdlelk01.es.ad.adp.com:9200
2018-03-30T09:59:36.662-0400 ERROR instance/beat.go:667 Exiting: Error importing Kibana dashboards: fail to create the Kibana loader: Error creating Kibana client: can not convert 'object' into 'string' accessing 'setup.kibana.host' (source:'filebeat.yml')
Exiting: Error importing Kibana dashboards: fail to create the Kibana loader: Error creating Kibana client: can not convert 'object' into 'string' accessing 'setup.kibana.host' (source:'filebeat.yml')
cdlentgf9db01:/etc/filebeat

There seems to be an error with your YML configuration for the kibana part, would you mind posting it?

OK, after changing the following line in filebeat.yml

host: ["cdlelk01.es.ad.adp.com"]

to

host: "cdlelk01.es.ad.adp.com"
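For reference, the error means setup.kibana.host was given as a YAML list (an "object") where a scalar string is expected. A sketch of the full section (the port is my assumption, Kibana's default 5601):

```yaml
setup.kibana:
  # Must be a plain string, not a YAML list; port 5601 is assumed.
  host: "cdlelk01.es.ad.adp.com:5601"
```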

I got further, but am now getting this:

2018-03-30T10:22:27.401-0400 ERROR instance/beat.go:667 Exiting: Error importing Kibana dashboards: fail to import the dashboards in Kibana: Error importing directory /usr/share/filebeat/bin/kibana: No directory /usr/share/filebeat/bin/kibana/6
Exiting: Error importing Kibana dashboards: fail to import the dashboards in Kibana: Error importing directory /usr/share/filebeat/bin/kibana: No directory /usr/share/filebeat/bin/kibana/6

To get further, I ran the service status command to see the full start command with its parameters:

cdlentgf9db01:/usr/share/filebeat/bin # service filebeat status
● filebeat.service - filebeat
Loaded: loaded (/lib/systemd/system/filebeat.service; disabled; vendor preset: disabled)
Active: active (running) since Thu 2018-03-29 15:41:45 EDT; 18h ago
Docs: https://www.elastic.co/guide/en/beats/filebeat/current/index.html
Main PID: 61161 (filebeat)
Tasks: 37 (limit: 512)
CGroup: /system.slice/filebeat.service
└─61161 /usr/share/filebeat/bin/filebeat -c /etc/filebeat/filebeat.yml -path.home /usr/share/filebeat -path.config /etc/filebeat -path.data /var/lib/filebeat -path.logs /var/log/filebeat

Mar 29 15:41:45 cdlentgf9db01 systemd[1]: Started filebeat.
cdlentgf9db01:/usr/share/filebeat/bin #

Then I ran that command with the setup option and it worked:

cdlentgf9db01:/etc/filebeat # /usr/share/filebeat/bin/filebeat -c /etc/filebeat/filebeat.yml -path.home /usr/share/filebeat -path.config /etc/filebeat -path.data /var/lib/filebeat -path.logs /var/log/filebeat setup
Loaded index template
Loaded dashboards
Loaded machine learning job configurations
cdlentgf9db01:/etc/filebeat #

Now I made it to here:

2018-03-30T11:27:51.102-0400 ERROR pipeline/output.go:74 Failed to connect: Connection marked as failed because the onConnect callback failed: Error loading pipeline for fileset system/auth: This module requires the ingest-geoip plugin to be installed in Elasticsearch. You can install it using the following command in the Elasticsearch home directory:
sudo bin/elasticsearch-plugin install ingest-geoip

Running that command gives me:

CDLELK01:/usr/share/elasticsearch/bin # ./elasticsearch-plugin install ingest-geoip -v
Checking if url exists: https://artifacts.elastic.co/downloads/elasticsearch-plugins/ingest-geoip/ingest-geoip-linux-x86_64-6.0.0.zip
Exception in thread "main" java.net.ConnectException: Connection timed out (Connection timed out)
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
        at java.net.Socket.connect(Socket.java:589)
        at sun.security.ssl.SSLSocketImpl.connect(SSLSocketImpl.java:673)
        at sun.security.ssl.BaseSSLSocketImpl.connect(BaseSSLSocketImpl.java:173)
        at sun.net.NetworkClient.doConnect(NetworkClient.java:180)
        at sun.net.www.http.HttpClient.openServer(HttpClient.java:463)
        at sun.net.www.http.HttpClient.openServer(HttpClient.java:558)
        at sun.net.www.protocol.https.HttpsClient.<init>(HttpsClient.java:264)
        at sun.net.www.protocol.https.HttpsClient.New(HttpsClient.java:367)
        at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.getNewHttpClient(AbstractDelegateHttpsURLConnection.java:191)
        at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1138)
        at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:1032)
        at sun.net.www.protocol.https.AbstractDelegateHttpsURLConnection.connect(AbstractDelegateHttpsURLConnection.java:177)
        at sun.net.www.protocol.https.HttpsURLConnectionImpl.connect(HttpsURLConnectionImpl.java:153)
        at org.elasticsearch.plugins.InstallPluginCommand.urlExists(InstallPluginCommand.java:298)
        at org.elasticsearch.plugins.InstallPluginCommand.getElasticUrl(InstallPluginCommand.java:265)
        at org.elasticsearch.plugins.InstallPluginCommand.download(InstallPluginCommand.java:221)
        at org.elasticsearch.plugins.InstallPluginCommand.execute(InstallPluginCommand.java:213)
        at org.elasticsearch.plugins.InstallPluginCommand.execute(InstallPluginCommand.java:204)
        at org.elasticsearch.cli.EnvironmentAwareCommand.execute(EnvironmentAwareCommand.java:69)
        at org.elasticsearch.cli.Command.mainWithoutErrorHandling(Command.java:134)
        at org.elasticsearch.cli.MultiCommand.execute(MultiCommand.java:69)
        at org.elasticsearch.cli.Command.mainWithoutErrorHandling(Command.java:134)
        at org.elasticsearch.cli.Command.main(Command.java:90)
        at org.elasticsearch.plugins.PluginCli.main(PluginCli.java:47)
CDLELK01:/usr/share/elasticsearch/bin # curl  https://artifacts.elastic.co/downloads/elasticsearch-plugins/ingest-geoip/ingest-geoip-linux-x86_64-6.0.0.zip
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>NoSuchKey</Code><Message>The specified key does not exist.</Message><Key>downloads/elasticsearch-plugins/ingest-geoip/ingest-geoip-linux-x86_64-6.0.0.zip</Key><RequestId>4A983C13ABC17F25</RequestId><HostId>tKN0R8cdDgGEnK/4NtXmsPqWGZd0zXAD1/56X6SOD8b7Q84TYEcfe10e0Oyq7w9dsUMl5ndUrYI=</HostId></Error>CDLELK01:/usr/share/elasticsearch/bin #

Hello @Pete_Del_Rosso, is there some kind of proxy blocking external connections?

I've run the same command locally and it was fine.

 ✘  ph@sashimi  ~/Downloads/elasticsearch-6.0.0  bin/elasticsearch-plugin install ingest-geoip -v
Checking if url exists: https://artifacts.elastic.co/downloads/elasticsearch-plugins/ingest-geoip/ingest-geoip-darwin-x86_64-6.0.0.zip
-> Downloading ingest-geoip from elastic
Retrieving zip from https://artifacts.elastic.co/downloads/elasticsearch-plugins/ingest-geoip/ingest-geoip-6.0.0.zip
[=================================================] 100%  

I get the same error with curl; IIRC some headers are sent by the plugin client and are probably missing from the curl request.

Thanks Pier. Then our proxy might be interfering with this request. Let me check with that team.

You might want to use these options to make it work: https://www.elastic.co/guide/en/elasticsearch/plugins/current/_other_command_line_parameters.html#_proxy_settings
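The linked docs amount to passing Java proxy system properties through ES_JAVA_OPTS when invoking the plugin CLI; a sketch with a placeholder proxy host and port:

```
# proxy.example.com:3128 is a placeholder for your corporate proxy.
sudo ES_JAVA_OPTS="-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=3128 \
  -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=3128" \
  ./elasticsearch-plugin install ingest-geoip
```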

Thank you, but no dice. Our proxy is brutal. The old-fashioned way worked, though:

CDLELK01:/usr/share/elasticsearch/bin # ./elasticsearch-plugin install file:///home/virtual/ingest-geoip-6.0.0.zip
-> Downloading file:///home/virtual/ingest-geoip-6.0.0.zip
[=================================================] 100%
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
@ WARNING: plugin requires additional permissions @
@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

Continue with installation? [y/N]y
-> Installed ingest-geoip
CDLELK01:/usr/share/elasticsearch/bin #


Glad it worked!

Can you share your config? This sounds like your config file configures neither prospectors nor modules.
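If it helps, a minimal 6.x filebeat.yml that runs on modules alone (no prospectors needed) looks roughly like this sketch; it assumes the stock modules.d layout and that a module has been enabled via filebeat modules enable system:

```yaml
# Load module configs from modules.d; no filebeat.prospectors section needed.
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

output.elasticsearch:
  hosts: ["localhost:9200"]  # placeholder host
```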

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.