I am pulling Mimecast email logs with a Python script and saving them to a folder. The logs are JSON, and each file contains only one log entry. I have configured Filebeat to read from the directory where the logs are stored and ship them to Elasticsearch. The problem is that although Filebeat picks up the files, it does not output anything. Only when I copy the log, paste it below itself in the same file, and save the file does Filebeat read it and produce output properly.
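For reference, the relevant part of my filebeat.yml is along these lines (the path glob and the json settings here are illustrative, not my exact config):

```
filebeat.inputs:
  - type: log
    enabled: true
    # Directory where the Python script drops one JSON log per file (glob is illustrative).
    paths:
      - /root/mimecast_log_collector/*.log
    # Optional: have Filebeat decode the single-line JSON event into fields.
    json.keys_under_root: true
    json.add_error_key: true
```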
How are you able to tell that Filebeat is reading the file?
Also, what version of Filebeat are you using?
I can see in the debug log that the crawler sees the file and that a harvester starts for it; it even reports that the end of the file was reached. But there is no output unless I edit the log file, copy its contents onto the next line as well, and save it. I am using the latest 7.10 version of Filebeat.
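For context, debug logging was turned on with the standard settings below; this is a generic sketch, and my actual logging section may differ:

```
# Assumed logging setup used to capture the crawler/harvester debug messages.
logging.level: debug
logging.to_files: true
```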
Thanks. Could you enable the Filebeat HTTP API and then post the results of `curl -XGET 'localhost:5066/stats?pretty'` here (enclosed in ``` for readability), please?
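If the HTTP API isn't enabled yet, adding the following to filebeat.yml and restarting Filebeat exposes it on the default port (5066):

```
# Expose the local monitoring/stats HTTP endpoint.
http.enabled: true
http.host: localhost
http.port: 5066
```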
{
  "beat": {
    "cpu": {
      "system": {
        "ticks": 150,
        "time": {
          "ms": 155
        }
      },
      "total": {
        "ticks": 630,
        "time": {
          "ms": 641
        },
        "value": 630
      },
      "user": {
        "ticks": 480,
        "time": {
          "ms": 486
        }
      }
    },
    "handles": {
      "limit": {
        "hard": 4096,
        "soft": 1024
      },
      "open": 13
    },
    "info": {
      "ephemeral_id": "efcaf386-a77d-4d89-86a7-d8aa69fc41a1",
      "uptime": {
        "ms": 9642
      }
    },
    "memstats": {
      "gc_next": 17896016,
      "memory_alloc": 12625664,
      "memory_total": 54937200,
      "rss": 54202368
    },
    "runtime": {
      "goroutines": 31
    }
  },
  "filebeat": {
    "events": {
      "active": 0,
      "added": 2,
      "done": 2
    },
    "harvester": {
      "closed": 0,
      "open_files": 1,
      "running": 1,
      "skipped": 0,
      "started": 1
    },
    "input": {
      "log": {
        "files": {
          "renamed": 0,
          "truncated": 0
        }
      },
      "netflow": {
        "flows": 0,
        "packets": {
          "dropped": 0,
          "received": 0
        }
      }
    }
  },
  "libbeat": {
    "config": {
      "module": {
        "running": 0,
        "starts": 0,
        "stops": 0
      },
      "reloads": 1,
      "scans": 1
    },
    "output": {
      "events": {
        "acked": 0,
        "active": 0,
        "batches": 0,
        "dropped": 0,
        "duplicates": 0,
        "failed": 0,
        "toomany": 0,
        "total": 0
      },
      "read": {
        "bytes": 0,
        "errors": 0
      },
      "type": "logstash",
      "write": {
        "bytes": 0,
        "errors": 0
      }
    },
    "pipeline": {
      "clients": 1,
      "events": {
        "active": 0,
        "dropped": 0,
        "failed": 0,
        "filtered": 2,
        "published": 0,
        "retry": 0,
        "total": 2
      },
      "queue": {
        "acked": 0
      }
    }
  },
  "registrar": {
    "states": {
      "cleanup": 0,
      "current": 21,
      "update": 2
    },
    "writes": {
      "fail": 0,
      "success": 2,
      "total": 2
    }
  },
  "system": {
    "cpu": {
      "cores": 4
    },
    "load": {
      "1": 0,
      "15": 0.1,
      "5": 0.04,
      "norm": {
        "1": 0,
        "15": 0.025,
        "5": 0.01
      }
    }
  }
}
My apologies, the stats I was looking for have recently been moved to another API endpoint. Would you mind posting the results of `curl 'http://localhost:5066/dataset?pretty'`, please?
Thanks,
Shaunak
{
  "fe16c2d4-a501-43d4-8fa3-2e469d2be40a": {
    "last_event_published_time": "",
    "last_event_timestamp": "",
    "name": "/root/mimecast_log_collector/ttp_events.log",
    "read_offset": 0,
    "size": 3240,
    "start_time": "2020-11-24T10:35:00.555Z"
  }
}