CloudFlare Log Ingest (Grok?)

Anybody ever set up Logstash to ingest CloudFlare API pulls? I'm not sure how to go about setting up grok patterns to match the output. Additionally, the returned data is in NDJSON format, which I think is a double-edged sword...or maybe not.
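For reference, NDJSON just means one complete JSON object per line, which actually pairs nicely with Logstash's line-oriented file input: each line becomes one event. A couple of illustrative lines (the values here are made up; `ClientRequestHost` and `ClientRequestURI` are CloudFlare field names, and `EdgeResponseStatus` is assumed to be one too):

    {"ClientRequestHost":"example.com","ClientRequestURI":"/index.html","EdgeResponseStatus":200}
    {"ClientRequestHost":"example.com","ClientRequestURI":"/missing","EdgeResponseStatus":404}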

Anyways, is there a good place that I can go that breaks down each grok pattern so I can understand which to use? Below are a couple sample return lines to give you an idea of what's returned.

I have no pipeline config set up yet because there are a couple of ingest methods I need to investigate, and output will be to Elasticsearch, so that's pretty straightforward.


This is JSON, so you can just use a json codec or a json filter. Don't use a grok filter for this.
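For the codec approach, you can decode right at the input instead of in the filter block. A minimal sketch, assuming the same ingest path used elsewhere in this thread:

    input {
      file {
        path => "d:/ElasticStack/Ingest/CloudFlare/*.txt"
        codec => "json"
      }
    }

Since the file input already emits one event per line, the json codec parses each NDJSON line into a structured event with no separate filter step.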


doh....that made this sooo much easier, lol.

In case anyone else runs across this wanting to pull CloudFlare Enterprise logs, here's the pipeline I have. A PowerShell script running as a scheduled task makes the API call and saves the results to the ingest directory.
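The PowerShell side might look roughly like the sketch below. This is an assumption-heavy illustration, not the script from this thread: the zone ID, e-mail, API key, and output path are all placeholders, and it assumes CloudFlare's Logpull endpoint (`/zones/{zone_id}/logs/received`):

    # Sketch only -- zone ID, credentials, and paths are placeholders.
    $zone    = "YOUR_ZONE_ID"
    $start   = (Get-Date).ToUniversalTime().AddMinutes(-15).ToString("yyyy-MM-ddTHH:mm:ssZ")
    $end     = (Get-Date).ToUniversalTime().AddMinutes(-10).ToString("yyyy-MM-ddTHH:mm:ssZ")
    $uri     = "https://api.cloudflare.com/client/v4/zones/$zone/logs/received?start=$start&end=$end"
    $headers = @{ "X-Auth-Email" = "you@example.com"; "X-Auth-Key" = "YOUR_API_KEY" }

    # Use Invoke-WebRequest (not Invoke-RestMethod) so the NDJSON body
    # is saved as raw text rather than auto-parsed.
    $resp = Invoke-WebRequest -Uri $uri -Headers $headers
    $resp.Content | Out-File -FilePath "d:\ElasticStack\Ingest\CloudFlare\$((Get-Date).Ticks).txt" -Encoding utf8

Writing each pull to a uniquely named file in the watched directory lets the Logstash file input pick it up automatically.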

input {
  file {
    path => "d:/ElasticStack/Ingest/CloudFlare/*.txt"
    start_position => "beginning"
  }
}

filter {
  json {
    source => "message"
  }
  mutate {
    id => "Field addition"
    add_field => {
      "Full URL" => "https://%{ClientRequestHost}%{ClientRequestURI}"
    }
  }
}

output {
  elasticsearch {
    id => "Send to Elasticsearch"
    hosts => [""]
    index => "cloudflare-%{+YYYY.MM.dd}"
  }
}

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.