Cloudflare Logs via Logstash

Hello and greetings all!
I'm doing some testing to retrieve Cloudflare logs using Logstash. My approach is to use a Bash script to pull the logs from Cloudflare and then push them to Logstash.

The process for retrieving the logs would be like this:
Cloudflare (Bash script) --> Logstash --> Kibana

As for the Bash script, I split it into two parts:

Cloudflare (Bash script):


CF_ZONES="<cloudflare-zone-id>"
CF_EMAIL="<cloudflare-zone-email>"
CF_API_KEY="<cloudflare-api-key>"

# Date
CF_DATE=$(date -u +"%Y-%m-%d")

# Set from and to times; the end time stays one minute behind because the
# Logpull API rejects end times that are too close to real time
CF_LOGS_FETCH_MIN=5   # minutes of logs to fetch per run

CF_FROM_TIME=$(date +"%H:%M:%S" --date "-${CF_LOGS_FETCH_MIN} min" --utc)
CF_TO_TIME=$(date +"%H:%M:%S" --date "-1 min" --utc)

# Replace colons with periods for filename
CF_FROM_TIME_FILENAME=$(echo "${CF_FROM_TIME}" | sed 's/:/./g')
CF_TO_TIME_FILENAME=$(echo "${CF_TO_TIME}" | sed 's/:/./g')
#STARTDATE=$(($(date +%s)-3900))
FIELDS=$(curl -s -H "X-Auth-Email: ${CF_EMAIL}" -H "X-Auth-Key: ${CF_API_KEY}" "${CF_ZONES}/logs/received/fields" | jq -r 'to_entries[] | .key' | paste -sd "," -)

# Grab logs from cloudflare
OUTPUT_DIR="/etc/logstash/testcloudflare"   # directory watched by the Logstash file input
for i in $(echo "${CF_ZONES}" | sed "s/,/ /g"); do
  mkdir -p "${OUTPUT_DIR}"
  OUTPUT_LOG="${OUTPUT_DIR}/cf-${CF_DATE}_${CF_FROM_TIME_FILENAME}-${CF_TO_TIME_FILENAME}.json"
  curl -s -H "Content-Type: application/json" \
    -H "X-Auth-Email: ${CF_EMAIL}" \
    -H "X-Auth-Key: ${CF_API_KEY}" \
    "${i}/logs/received?start=${CF_DATE}T${CF_FROM_TIME}Z&end=${CF_DATE}T${CF_TO_TIME}Z&timestamps=rfc3339&fields=${FIELDS}" > "${OUTPUT_LOG}"
done


TIME=$(date +%s)   # current time in epoch seconds
GAP=240            # how far behind real time the window ends
INTERVAL=60        # length of the window in seconds

STARTDATE=$((TIME - GAP - INTERVAL))   # window start (epoch seconds)
ENDDATE=$((TIME - GAP))                # window end; Logpull accepts epoch or RFC 3339 bounds
  curl -svo "$CLOUDFLARE_FILENAME" -X GET "$ZONE_ID/logs/received?start=${STARTDATE}&end=${ENDDATE}&timestamps=rfc3339&fields=$(curl -s -H "X-Auth-Email: $AUTH_EMAIL" -H "X-Auth-Key: $API_KEY" "$ZONE_ID/logs/received/fields" | jq '. | to_entries[] | .key' -r | paste -sd "," -)" \
  -H "X-Auth-Email: $AUTH_EMAIL" \
  -H "X-Auth-Key: $API_KEY" \
  -H "Content-Type: application/json";
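One thing worth noting about the window arithmetic in the script above: computing the date and the time-of-day in separate `date` calls breaks in the minutes around midnight UTC, because the date can roll over between the two calls. A small sketch that builds each RFC 3339 bound in a single call avoids that; `CF_LOGS_FETCH_MIN=5` is an assumed value, since the original script never sets it:

```shell
#!/usr/bin/env bash
# Build RFC 3339 window bounds for the Logpull API in one `date` call per
# bound. The end bound stays one minute behind "now" because the API
# rejects end times that are too close to real time.
CF_LOGS_FETCH_MIN=5   # assumed: minutes of logs to fetch per run

START=$(date -u --date "-${CF_LOGS_FETCH_MIN} min" +"%Y-%m-%dT%H:%M:%SZ")
END=$(date -u --date "-1 min" +"%Y-%m-%dT%H:%M:%SZ")

echo "start=${START} end=${END}"
```

This assumes GNU date (standard on the Linux hosts Logstash usually runs on); BSD/macOS date spells the offset option differently.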

Logstash configuration: testcloudflare.conf

input {
  file {
    path => "/etc/logstash/testcloudflare/*.json"
    sincedb_path => "/dev/null"
    mode => "read"
    #type => "gzip"
    file_completed_action => "delete"
  }
}

filter {
  if [message] =~ "\A\{.+\}\z" {
    json {
      source => "message"
    }

    mutate {
      add_field => {
        "Full URL" => "https://%{ClientRequestHost}%{ClientRequestURI}"
      }
    }

    geoip {
      source => "OriginIP"
      target => "server_geoip"
    }

    geoip {
      source => "ClientIP"
      target => "geoip"
    }
  }

  if "_jsonparsefailure" in [tags] {
    drop { }
  }
}

output {
  elasticsearch {
    hosts => [""]
    #ssl => true
    #cacert => "/etc/logstash/certs/ca/ca.crt"
    ilm_rollover_alias => "testcloudflare"
    ilm_pattern => "{now/d{yyyy.MM.dd}}-000001"
    ilm_policy => "testcloudflare-hot-warm-cold"
    #user => "elastic"
    #password => "elastic"
  }
  stdout {
    codec => rubydebug
  }
}

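The json filter expects every line of the pulled file to be one standalone JSON object (the Logpull API returns newline-delimited JSON). A quick jq check on a file rules out malformed input before blaming the pipeline; the sample lines below are made up for illustration:

```shell
#!/usr/bin/env bash
# Verify that every line of a pulled log file parses as JSON, which is
# what `json { source => "message" }` requires. jq exits non-zero if any
# line is malformed.
FILE=$(mktemp)
printf '%s\n' \
  '{"ClientIP":"203.0.113.7","ClientRequestHost":"example.com","ClientRequestURI":"/"}' \
  '{"ClientIP":"198.51.100.9","ClientRequestHost":"example.com","ClientRequestURI":"/login"}' \
  > "${FILE}"

if jq -e . "${FILE}" > /dev/null 2>&1; then
  RESULT="all lines parse as JSON"
else
  RESULT="parse failure: expect _jsonparsefailure tags in Logstash"
fi
echo "${RESULT}"
rm -f "${FILE}"
```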
Logstash pipelines.yml:

- pipeline.id: test
  path.config: "/etc/logstash/conf.d/testcloudflare.conf"

During testing, both the script and the configuration work fine, but no output is shown in Kibana. Is there something missing when ingesting Cloudflare logs into Logstash? Much appreciated. Thanks!

If there are no events in Elasticsearch, that suggests that Logstash is dropping them all, which in turn suggests that the json filter is not parsing them. I would remove that drop {} filter and take a look at the raw messages.
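For example, a fragment along these lines (a sketch, not tested against your data) keeps the failed events and routes them to stdout so you can see the raw [message] field:

```
output {
  # temporary debugging: print parse failures instead of dropping them
  if "_jsonparsefailure" in [tags] {
    stdout { codec => rubydebug }
  }
}
```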

Hey there @Badger! Thanks for the reply.
I have already commented out the drop {} filter but I still cannot find any raw messages. Is there any other way for me to see the output of the raw data?

Do you have anything in Logstash logs? How are you running logstash?

Are you running it as a service, since you are using pipelines.yml? If so, does the logstash user have read/write permissions to the path /etc/logstash/testcloudflare/?

I have a similar pipeline, but I use a path where logstash has full permissions, which is not possible in /etc.

This is my input:

input {
    file {
        path => "/opt/logs-cloudflare/json/*.json"
        sincedb_path => "/opt/logs-cloudflare/dbs/logs-cloudflare.db"
        sincedb_clean_after => "2"
        mode => "read"
        file_completed_action => "delete"
    }
}

Check the Logstash logs to see if there are any errors, and check that you have the right permissions; there doesn't seem to be anything wrong with the configuration.
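As a quick shell check for the permission part (the real path and the logstash user are specific to your host, so a temporary directory stands in here to keep the snippet self-contained):

```shell
#!/usr/bin/env bash
# Check that a directory is readable and traversable. On a real host run
# the test as the service user, e.g.:
#   sudo -u logstash test -r /etc/logstash/testcloudflare
DIR=$(mktemp -d)   # stand-in for /etc/logstash/testcloudflare

if [ -r "${DIR}" ] && [ -x "${DIR}" ]; then
  STATUS="readable"
else
  STATUS="not readable"
fi
echo "${DIR}: ${STATUS}"
rmdir "${DIR}"
```

Note that `file_completed_action => "delete"` also needs write permission on the directory, since deleting a file modifies its parent directory.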

Hi there @leandrojmp. Yes, I'm running Logstash as a service, and the logstash user has read/write permissions to the path /etc/logstash/testcloudflare/.

As for the file path, the JSON files in /etc/logstash/testcloudflare/ come back blank when I cat them. Should the output of Cloudflare's Logpull be a blank document in Logstash?

Do you have anything in Logstash logs that can give any hint of an issue?

If you have nothing in the source directory /etc/logstash/testcloudflare/, it can mean either that Logstash is reading the files and deleting them, or that your script is not working and is not creating the files at all.

You need to share Logstash logs to see if there is any hint of an issue.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.