Logstash to ECS

Since the Beats modules can collect data and emit it in ECS format, is it possible to send the same data to a Logstash node and have Logstash perform the ECS conversion work?

Example: legacy systems use a Windows Event Forwarding service instead of a Beats module. Can these logs be sent to a Logstash node and the same Windows events transformed to ECS for ingest into Elastic, for use in products like SIEM and subsequently Machine Learning and the rest of the stack?

Is the alternative a Logstash mutate filter to map all of the Windows event log fields to ECS?

I think this is the case for now. Would also love to see ECS mapping in Logstash in the near future.
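
For illustration, here's a minimal sketch of what that mutate filter could look like. The raw field names (EventID, Computer, Channel) are hypothetical and depend on how your Windows Event Forwarding pipeline delivers the events:

    filter {
      mutate {
        # Hypothetical raw field names; adjust to whatever your
        # Windows Event Forwarding output actually contains.
        rename => {
          "EventID"  => "[event][code]"
          "Computer" => "[host][name]"
          "Channel"  => "[winlog][channel]"
        }
        # Record which ECS version the event aims to conform to.
        add_field => { "[ecs][version]" => "1.0.0" }
      }
    }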

We are working on solutions.

There are two problems.

  1. Converting fields that are created internally by plugins like geoip
  2. Ease of converting user-supplied mappings to ECS.

Both of these are complicated by the fact that some field values are not in the data type defined by ECS and need converting as well.
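
As a rough sketch of both problems in one filter (assuming the geoip filter's default "geoip" target and the ECS source.* field names), the renames and type conversions might look like this:

    filter {
      geoip {
        source => "[source][ip]"
      }
      mutate {
        # The geoip filter writes under "geoip" by default; move the
        # fields into the ECS source.geo.* namespace.
        rename => {
          "[geoip][country_iso_code]" => "[source][geo][country_iso_code]"
          "[geoip][location]"         => "[source][geo][location]"
        }
        # ECS defines source.port as an integer, so convert it if it
        # arrived as a string.
        convert => { "[source][port]" => "integer" }
      }
    }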

I agree those are problems to overcome but they are not insurmountable.

Unknown fields that do not map from the raw source to ECS (like custom fields) should go into an unknown or unparsed namespace, or something similar, to indicate they don't match and to allow the user to create matching fields.
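
One rough way to approximate that today is a ruby filter that moves any top-level field not on an allow-list of mapped ECS fields under an unparsed namespace. The allow-list here is only an example:

    filter {
      ruby {
        code => '
          known = ["@timestamp", "@version", "event", "host", "source", "tags", "unparsed"]
          event.to_hash.each do |name, value|
            next if known.include?(name)
            event.set("[unparsed][#{name}]", value)
            event.remove(name)
          end
        '
      }
    }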

In the meantime, does Elastic or anyone in the community have conversions they want to share? Elastic must have a spreadsheet with the raw XML name conversions for things like Bro/Zeek, WEL, Osquery, etc., because there are modules that make use of the ECS fields in Kibana. Can those be shared so the community has a place to start?

Since Elastic has already completed this exercise for the Beats platform, can you share it with the community?

Hey Tim,

This is not as straightforward as you probably would like, but you can look at the code performing the conversion for all Beats modules in the Beats repo.

In a given module's directory, you'll have a directory for each log type, and underneath it another directory that contains an Elasticsearch ingest pipeline that performs the renames & such.

Since you mention Zeek, here's a direct link to the pipeline that handles the Zeek "connection" events, as an example: https://github.com/elastic/beats/blob/master/x-pack/filebeat/module/zeek/connection/ingest/pipeline.json. Now Zeek is one of the rare (or only?) modules that also performs some renames directly in Beats, not just in ES ingest pipelines, so you'll want to look at this file as well: https://github.com/elastic/beats/blob/master/x-pack/filebeat/module/zeek/connection/config/connection.yml
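
Since the goal in this thread is doing that work in Logstash, it's worth noting that the rename processors in those pipelines translate fairly mechanically into mutate filters. A sketch with made-up Zeek field names (the authoritative source-to-ECS mapping lives in the pipeline.json above):

    filter {
      mutate {
        # Illustrative field names only; check the module's
        # pipeline.json for the real mapping.
        rename => {
          "[zeek][orig_ip]" => "[source][ip]"
          "[zeek][resp_ip]" => "[destination][ip]"
        }
      }
    }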

Under the zeek directory, you'll be able to find all other Zeek event types. Also of note is the [module]/[log type]/tests directory, where you'll see original log files and their converted JSON equivalent (minus some metadata fields that would change between test runs).
