Elasticsearch does not receive Filebeat data

Hello,

I am new to ELK and Filebeat.
I have configured Filebeat to read its input from a given JSON log file and send its output to Logstash.
However, when I test with curl -XGET 'http://localhost:9200/filebeat-*/_search?pretty', I get no hits. And when I browse Kibana, it tells me it has no data :frowning:

This is my filebeat config (/etc/filebeat/filebeat.yml):

- type: log

  enabled: true
  paths:
    - /home/axelle/cowrie/var/log/cowrie/cowrie.json*
  encoding: plain
...
output.logstash:
  # Boolean flag to enable or disable the output module.
  enabled: true

  # The Logstash hosts
  hosts: ["localhost:5044"]

  # Number of workers per Logstash host.
  worker: 1

Logstash is configured to get its input from Filebeat (/etc/logstash/conf.d/logstash-cowrie.conf):

input {
  beats {
    port => 5044
    type => "cowrie"
  }
}
...
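(For reference, the elided part of the pipeline is what sends events on to Elasticsearch. A minimal output section would look roughly like the sketch below; the hosts and index name here are illustrative assumptions, not the actual elided config:)

output {
  elasticsearch {
    # Assumed values for illustration only
    hosts => ["localhost:9200"]
    index => "cowrie-%{+YYYY.MM.dd}"
  }
}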

I am using Elasticsearch, Logstash, Kibana, and Filebeat version 7.6.0, plus nginx, on Debian 10.

Can you please help me fix this?
Thanks,

What do the Logstash and Elasticsearch logs show? Do you know whether Logstash is receiving the data?

The Logstash logs are full of these "could not index event" warnings. Indeed, I haven't defined an index, because ... I don't understand what I should index in my case :frowning:

[2020-03-06T09:43:51,056][WARN ][logstash.outputs.elasticsearch][main] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x65a27d1c>], :response=>{"index"=>{"_index"=>"logstash-2020.03.03-000001", "_type"=>"_doc", "_id"=>"1tkDr3ABoIyQtREg3XWE", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [input] of type [text] in document with id '1tkDr3ABoIyQtREg3XWE'. Preview of field's value: '{type=log}'", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:872"}}}}}

This page explains how to create an index. Where are those logstash-YYYY.MM.DD indices the page mentions? Are they my files /var/log/logstash/logstash-plain-2020-03-03-1.log.gz (etc.)? But those files (cf. above) only contain warnings; there is no (useful) data.

In my case, I can retrieve the useful logs by running curl 'http://127.0.0.1:9200/_search?q=cowrie&size=5'.

{"took":22,"timed_out":false,"_shards":{"total":5,"successful":5,"skipped":0,"failed":0},"hits":{"total":{"value":2842,"relation":"eq"},"max_score":11.6610365,"hits":[{"_index":"logstash-2020.03.03-000001","_type":"_doc","_id":"bdIGqnABoIyQtREg3rDm","_score":11.6610365,"_source":{"size":2040,"shasum":"069b3fd2fc1ca9abf47bc7c256770b0d641d017ae0e0606f1e3336ab6e5e577d","duplicate":false,"sensor":"instance-39","timestamp":"2020-03-05T09:29:01.640887Z","message":"Closing TTY Log: var/lib/cowrie/tty/069b3fd2fc1ca9abf47bc7c256770b0d641d017ae0e0606f1e3336ab6e5e577d after 102 seconds","src_ip":"124.218.149.8","geoip":{"continent_code":"AS","ip":"124.218.149.8","city_name":"Kaohsiung City","location":{"lat":22.6148,"lon":120.3139},"longitude":120.3139,"latitude":22.6148,"timezone":"Asia/Taipei","country_name":"Taiwan","region_code":"KHH","region_name":"Kaohsiung","country_code2":"TW","country_code3":"TW"},"ttylog":"var/lib/cowrie

As for the Elasticsearch logs, in /var/log/elasticsearch/elasticsearch.log I have MapperParsingExceptions.

[2020-03-06T08:55:23,171][DEBUG][o.e.a.b.TransportShardBulkAction] [instance-39] [logstash-2020.03.03-000001][0] failed to execute bulk item (index) index {[logstash][_doc][e9kOr3ABoIyQtREgbXgd], source[{"@version":"1","@timestamp":"2020-03-06T08:55:22.055Z","log":{"file":{"path":"/home/axelle/cowrie/var/log/cowrie/cowrie.json"},"offset":21864514},"message":"{\"eventid\":\"cowrie.direct-tcpip.data\",\"dst_ip\":\"signup.live.com\",\"dst_port\":443,\"data\":\"b'\\\\x16\\\\x03\\\\x00\\\\x00/\\\\x01 ... \\x00'\",\"sensor\":\"instance-39\",\"timestamp\":\"2020-03-06T08:55:21.305854Z\",\"src_ip\":\"5.188.86.169\",\"session\":\"87e3dcf4c86d\"}","input":{"type":"log"},"agent":{"id":"f505e637-ece6-4dc5-ad66-5f986536eae4","ephemeral_id":"64e3fdc5-be01-458e-bcb6-b6cc0d95d6d2","hostname":"instance-39","version":"7.6.0","type":"filebeat"},"host":{"name":"instance-39"},"ecs":{"version":"1.4.0"},"tags":[],"type":"cowrie"}]}
org.elasticsearch.index.mapper.MapperParsingException: failed to parse field [input] of type [text] in document with id 'e9kOr3ABoIyQtREgbXgd'. Preview of field's value: '{type=log}'
	at org.elasticsearch.index.mapper.FieldMapper.parse(FieldMapper.java:306) ~[elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrField(DocumentParser.java:488) ~[elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.index.mapper.DocumentParser.parseObject(DocumentParser.java:505) ~[elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.index.mapper.DocumentParser.innerParseObject(DocumentParser.java:418) ~[elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.index.mapper.DocumentParser.parseObjectOrNested(DocumentParser.java:395) ~[elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.index.mapper.DocumentParser.internalParseDocument(DocumentParser.java:112) ~[elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.index.mapper.DocumentParser.parseDocument(DocumentParser.java:71) ~[elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.java:267) ~[elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.index.shard.IndexShard.prepareIndex(IndexShard.java:793) ~[elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.index.shard.IndexShard.applyIndexOperation(IndexShard.java:770) ~[elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.index.shard.IndexShard.applyIndexOperationOnPrimary(IndexShard.java:742) ~[elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.action.bulk.TransportShardBulkAction.executeBulkItemRequest(TransportShardBulkAction.java:254) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.action.bulk.TransportShardBulkAction$2.doRun(TransportShardBulkAction.java:157) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.action.bulk.TransportShardBulkAction.performOnPrimary(TransportShardBulkAction.java:189) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:114) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.action.bulk.TransportShardBulkAction.shardOperationOnPrimary(TransportShardBulkAction.java:81) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.action.support.replication.TransportReplicationAction$PrimaryShardReference.perform(TransportReplicationAction.java:895) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.action.support.replication.ReplicationOperation.execute(ReplicationOperation.java:109) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.action.support.replication.TransportReplicationAction$AsyncPrimaryAction.runWithPrimaryShardReference(TransportReplicationAction.java:374) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.action.support.replication.TransportReplicationAction$AsyncPrimaryAction.lambda$doRun$0(TransportReplicationAction.java:297) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.action.ActionListener$1.onResponse(ActionListener.java:63) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.index.shard.IndexShard.lambda$wrapPrimaryOperationPermitListener$24(IndexShard.java:2791) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.action.ActionListener$3.onResponse(ActionListener.java:113) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.index.shard.IndexShardOperationPermits.acquire(IndexShardOperationPermits.java:285) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.index.shard.IndexShardOperationPermits.acquire(IndexShardOperationPermits.java:237) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.index.shard.IndexShard.acquirePrimaryOperationPermit(IndexShard.java:2765) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.action.support.replication.TransportReplicationAction.acquirePrimaryOperationPermit(TransportReplicationAction.java:836) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.action.support.replication.TransportReplicationAction$AsyncPrimaryAction.doRun(TransportReplicationAction.java:293) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.action.support.replication.TransportReplicationAction.handlePrimaryRequest(TransportReplicationAction.java:256) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.xpack.security.transport.SecurityServerTransportInterceptor$ProfileSecuredRequestHandler$1.doRun(SecurityServerTransportInterceptor.java:257) [x-pack-security-7.6.0.jar:7.6.0]
	at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.xpack.security.transport.SecurityServerTransportInterceptor$ProfileSecuredRequestHandler.messageReceived(SecurityServerTransportInterceptor.java:315) [x-pack-security-7.6.0.jar:7.6.0]
	at org.elasticsearch.transport.RequestHandlerRegistry.processMessageReceived(RequestHandlerRegistry.java:63) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.transport.TransportService$7.doRun(TransportService.java:750) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingAbstractRunnable.doRun(ThreadContext.java:692) [elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.common.util.concurrent.AbstractRunnable.run(AbstractRunnable.java:37) [elasticsearch-7.6.0.jar:7.6.0]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) [?:?]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) [?:?]
	at java.lang.Thread.run(Thread.java:830) [?:?]
Caused by: java.lang.IllegalStateException: Can't get text on a START_OBJECT at 1:1039
	at org.elasticsearch.common.xcontent.json.JsonXContentParser.text(JsonXContentParser.java:85) ~[elasticsearch-x-content-7.6.0.jar:7.6.0]
	at org.elasticsearch.common.xcontent.support.AbstractXContentParser.textOrNull(AbstractXContentParser.java:253) ~[elasticsearch-x-content-7.6.0.jar:7.6.0]
	at org.elasticsearch.index.mapper.TextFieldMapper.parseCreateField(TextFieldMapper.java:823) ~[elasticsearch-7.6.0.jar:7.6.0]
	at org.elasticsearch.index.mapper.FieldMapper.parse(FieldMapper.java:284) ~[elasticsearch-7.6.0.jar:7.6.0]
	... 40 more
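If I read this error correctly, the existing logstash index has mapped input as a plain text field, while Filebeat 7 ships input as an object ({"type": "log"}), hence "Can't get text on a START_OBJECT". One possible workaround (just a sketch, untested, using the stock mutate filter) would be to drop the clashing field in the Logstash pipeline before indexing, or alternatively to write to a fresh index without the old mapping:

filter {
  # Hypothetical workaround: Filebeat 7 sends "input" as an object,
  # which clashes with this index's existing text mapping for "input",
  # so remove it before the event reaches Elasticsearch.
  mutate {
    remove_field => [ "input" ]
  }
}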

Hope this helps.

I think I solved my problem.

This post saved me: my Logstash configuration had the same issue, where I was using if [type] == "cowrie" { instead of the correct if [fields][document_type] == "cowrie".
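For the record, here is a sketch of the two matching pieces, assuming the field is set in filebeat.yml under fields (which is what makes [fields][document_type] exist on the Logstash side); the json filter inside the conditional is only an example of something you might gate on it:

In filebeat.yml, under the log input:

  fields:
    document_type: cowrie

In the Logstash pipeline:

  filter {
    if [fields][document_type] == "cowrie" {
      # Example only: parse the JSON payload carried in "message"
      json { source => "message" }
    }
  }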

Once I modified this and restarted Logstash, I started seeing useful data in the Logstash indices.
Then I went back to Kibana. I don't fully understand indices, to be honest, but from the log messages I knew I needed to create one. So I created an index pattern for logstash-* (which is the name of my Logstash indices). It found all the interesting fields of my logs (e.g. timestamp, arch, data, geoip_cityname...). Then, in Discover, I can see my data :slight_smile:

Note that I still have MapperParsingExceptions in /var/log/elasticsearch/elasticsearch.log, and the Logstash logs still contain many "Could not index event to Elasticsearch" warnings. I'll look into that; it's probably a different issue.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.