Zeek ingest using Logstash on Elastic Cloud

Hi,

I am trying Elastic Cloud for the first time with the 7.12 stack. I would like to send Zeek logs to my Elastic Cloud deployment, and the default recommended method is Filebeat. Instead, I would like to stream the logs through Kafka and apply Zeek-specific data transformations in Logstash to make them ECS compliant. I would appreciate it if you could let me know what I need to do to achieve this.

Thanks

The Filebeat Zeek module is already ECS compliant. Some of the processing is done locally and the rest is done via the module's ingest pipelines.

Thanks for the response. Yes, I am aware of that. My requirement is to replicate that pipeline using a combination of Kafka and Logstash, without using Filebeat. I have been able to configure Logstash to pull Zeek logs from Kafka, but I don't know how to make them ECS compliant. Because of this, I don't see data populated in the built-in Zeek dashboards in Kibana. I would appreciate any more info on how to make this happen.
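
For reference, the input side of my Logstash config currently looks roughly like this (broker address, topic name, and consumer group are placeholders for my setup):

    input {
      kafka {
        # Placeholders: point these at your own brokers and topic
        bootstrap_servers => "kafka01:9092"
        topics => ["zeek"]
        group_id => "logstash-zeek"
        # Zeek logs shipped as JSON decode straight into event fields
        codec => "json"
      }
    }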

The source for the Filebeat Zeek module is available on GitHub; it documents all the transformations.

Yes, I was just about to post the GitHub link. I wouldn't bother doing too much in Logstash beyond pulling from Kafka and perhaps applying the minor changes Filebeat makes before pushing each event to the respective ingest pipeline. I would use those native pipelines so you don't have to reinvent the wheel.
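
As a sketch of the output side (the exact pipeline names depend on the Filebeat version that loaded them; list them with GET _ingest/pipeline/filebeat-* in the Kibana Dev Tools console, and [@metadata][fileset] is a placeholder you would set in your filter block):

    output {
      elasticsearch {
        # Elastic Cloud credentials (placeholders)
        cloud_id => "<deployment cloud id>"
        cloud_auth => "elastic:<password>"
        # Write to a filebeat-* index so the bundled Zeek dashboards find the data
        index => "filebeat-7.12.0"
        # Route each event through the module's ingest pipeline;
        # names follow filebeat-<version>-zeek-<fileset>-pipeline
        pipeline => "filebeat-7.12.0-zeek-%{[@metadata][fileset]}-pipeline"
      }
    }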

Thanks for the info. It looks like a lot of transformations are applied to each log type; replicating all of that in Logstash would be tough.

@legoguy1000 Thanks again for your comments.

I am a newbie to the ELK ecosystem. Just to understand what Filebeat does, I followed the instructions provided by Elastic. What I see is that Filebeat has created ingest pipelines in the cloud with a whole bunch of transformations. What are the minor changes it makes locally that I need to replicate in Logstash? I would appreciate any pointers.

Also, if I understand correctly, you are suggesting that I configure Logstash to reuse the ingest pipelines created by Filebeat. Is there any document that helps me understand how to do this? Your replies have been very informative and useful.

Take a look at ingest-convert. I believe that tool is designed to convert the pipeline.yaml from a Filebeat module into a Logstash configuration. It has limitations, and if I remember correctly it has bugs.

This blog describes its introduction. Back then, Filebeat modules used JSON, not YAML. Someone from Elastic opened an issue saying the converter should be removed, but that has not happened yet.
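
For what it's worth, it ships with Logstash as bin/ingest-convert.sh and takes an ingest pipeline definition in JSON (the file paths below are placeholders):

    bin/ingest-convert.sh \
      --input file:///tmp/zeek-conn-pipeline.json \
      --output file:///tmp/zeek-conn.conf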


The actions that Filebeat takes locally are defined under each fileset's config folder, such as beats/capture_loss.yml at master · elastic/beats · GitHub
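
Translated to Logstash, those local steps mostly amount to tagging each event before it reaches the ingest pipeline. A minimal sketch, assuming one Kafka topic per Zeek log type and taking conn as the example fileset (the field values here are my assumption of what the Zeek pipelines and dashboards expect; check the yml for your version):

    filter {
      mutate {
        # Fields the Filebeat zeek module normally sets before shipping;
        # "conn" is an example -- derive the fileset from your topic or log path
        add_field => {
          "[event][module]" => "zeek"
          "[event][dataset]" => "zeek.conn"
          "[fileset][name]" => "conn"
          "[service][type]" => "zeek"
          # Used by the elasticsearch output sketch above to pick the pipeline
          "[@metadata][fileset]" => "conn"
        }
      }
    }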

