Data from fuse.log into Kibana

Hi,

I am trying to get data from a file named fuse.log into Kibana.
When I go to http://localhost:5601 I see "Couldn't find any Elasticsearch data".

Here is some sample data from fuse.log:

Mar 09, 2018 12:47:59 PM org.apache.karaf.main.SimpleFileLock lock
INFO: locking
2018-03-09 12:48:01,393 | INFO | FelixStartLevel | | org.apache.felix.fileinstall | Creating configuration from org.apache.karaf.features.repos.cfg
2018-03-09 12:48:05,765 | INFO | FelixStartLevel | | org.apache.karaf.shell.security.impl.SecuredCommandConfigTransformer | Generating command ACL config org.apache.karaf.command.acl.activemq into service ACL configs [org.apache.karaf.service.acl.command.activemq.query, org.apache.karaf.service.acl.command.activemq.purge, org.apache.karaf.service.acl.command.activemq.bstat, org.apache.karaf.service.acl.command.activemq.dstat, org.apache.karaf.service.acl.command.activemq.browse, org.apache.karaf.service.acl.command.activemq.list]
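As a side note, the pipe-delimited lines split cleanly into fields. A minimal Python sketch of that (the field names are my own guesses, not official Karaf names):

```python
# Minimal sketch: split a pipe-delimited Karaf/Fuse log line into fields.
# Field names are guesses for illustration, not official Karaf names.

def parse_fuse_line(line):
    # maxsplit=5 keeps any "|" characters inside the message intact
    parts = [p.strip() for p in line.split("|", 5)]
    if len(parts) != 6:
        return None  # e.g. the "Mar 09, 2018 ..." java.util.logging lines use another format
    timestamp, level, thread, bundle, category, message = parts
    return {
        "timestamp": timestamp,
        "level": level,
        "thread": thread,
        "bundle": bundle,      # often empty, as in the sample above
        "category": category,
        "message": message,
    }

sample = ("2018-03-09 12:48:01,393 | INFO | FelixStartLevel | "
          "| org.apache.felix.fileinstall | Creating configuration "
          "from org.apache.karaf.features.repos.cfg")
print(parse_fuse_line(sample)["level"])  # INFO
```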

What I have done so far:

  1. Installed elasticsearch-6.2.2.deb

  2. Installed ingest-geoip

  3. Installed kibana-6.2.2-amd64.deb
    Set elasticsearch.url: "http://localhost:9200"

  4. Installed filebeat-6.2.2-amd64.deb
    Set setup.kibana host: "localhost:5601"
    Set output.logstash hosts: ["localhost:5044"]
    Changed - /var/log/*.log to - /home/sindre/Temp/FileBeatInput/*.log

  5. Installed logstash
    Created file /etc/logstash/conf.d/beats.conf:

	input {
		beats {
			port => "5044"
		}
	}

	filter {
		if [type] == "syslog" {
			grok {
				match => { "message" => "<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?:? %{GREEDYDATA:syslog_message}" }
			}
			date {
				match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
			}
		}
	}

	output {
		elasticsearch {
			hosts => ["localhost:9200"]
			index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
			document_type => "%{[@metadata][type]}"
		}
	}
  6. Created /etc/filebeat/filebeat.template.json

	{
		"index_patterns": ["te*", "bar*"],
		"mappings": {
			"type1": {
				"properties": {
					"host_name": {
						"type": "keyword"
					}
				}
			}
		}
	}

Loaded it:
curl -XPUT 'http://localhost:9200/_template/filebeat?pretty' -d@/etc/filebeat/filebeat.template.json -H'Content-Type: application/json'

  1. Filebeat should be pointing to Elasticsearch, not Kibana, so I think host should be set to localhost:9200.
  2. When you run filebeat do you see any errors?
  3. Are you following a tutorial of some sort?
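Also, the index_patterns in your template ("te*", "bar*") look like they came from the Elasticsearch docs example, so that template will never apply to the filebeat-YYYY.MM.dd indices your Logstash output creates. Something closer to this should match (a sketch only; the real template Filebeat ships with defines many more fields):

```json
{
  "index_patterns": ["filebeat-*"],
  "mappings": {
    "doc": {
      "properties": {
        "host_name": {
          "type": "keyword"
        }
      }
    }
  }
}
```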

I will try this without logstash.
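For the record, this is roughly the minimal /etc/filebeat/filebeat.yml I have in mind for that, showing only the sections I change from the 6.2 defaults (treat it as a sketch):

```yaml
filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /home/sindre/Temp/FileBeatInput/*.log

setup.kibana:
  host: "localhost:5601"

output.elasticsearch:
  hosts: ["localhost:9200"]
```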
