Multiple values in field json parser

In order to learn more about ELK, I try to combine it with fun stuff to build. :slight_smile:

One of the things I am trying to make is a dashboard with Formula 1 data.
I'm able to get the data into Elasticsearch by curling a JSON API.

Now the problem is that the data contains the same fields for every driver, so the values get combined into a single field, which makes it impossible to create visualizations.

I am using this Logstash pipeline to get the data:

input {
    exec {
        command => "curl --location --request GET 'http://ergast.com/api/f1/2021/1/results.json'"
        type => "curl"
        interval => 3600
        tags => [ "f1" ]
    }
}
filter {
        json {
            source => "message"
        }
        mutate {
            remove_field => [ "command" ]
            remove_field => [ "type" ]
            remove_field => [ "host" ]
            remove_field => [ "@version" ]
            remove_field => [ "message" ]
        }
}
output {
        elasticsearch {
                hosts => "localhost:9200"
                ecs_compatibility => disabled
                index => "f1.tracks-%{+YYYY.MM.dd}"
        }
}

The data output can be viewed here: http://ergast.com/api/f1/2021/1/results.json

Some fields, such as "permanentNumber", exist for every driver. These are now combined into one field, separated by commas, instead of into multiple fields that could be used in visualizations, if I am correct.

Is there a way to get the data into multiple fields, or any other way so it can be used in visualizations?

I am not sure this answers your question, but if you want each race and each car to be a separate event you can use

    json { source => "message" }
    mutate { remove_field => [ "command", "type", "host", "@version", "message" ] }
    split { field => "[MRData][RaceTable][Races]" }
    split { field => "[MRData][RaceTable][Races][Results]" }
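
Once the splits have run, the driver fields are still nested, so you could also (untested) rename the ones you care about to top-level fields, which makes them easier to pick in visualizations. The field paths below are just my reading of the API response, so adjust them if they do not match your documents:

    mutate {
        # assumed paths after both splits; each event now holds a single result
        rename => {
            "[MRData][RaceTable][Races][Results][Driver][permanentNumber]" => "permanentNumber"
            "[MRData][RaceTable][Races][Results][Driver][familyName]" => "familyName"
            "[MRData][RaceTable][Races][Results][position]" => "position"
        }
    }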

Also, an http_poller input may be more efficient than an exec

    input {
        http_poller {
            urls => { "first" => "http://ergast.com/api/f1/2021/1/results.json" }
            schedule => { cron => "0 0 * * * *" }
        }
    }

Sorry for the late reply!

Thanks for the solution, that works!

One last question: since the data will only be "imported" once, is there a way to remove the date and time of the event? Or should I just set the time range in Kibana to something very long?

I do not think @timestamp is optional in Elasticsearch, just set the Kibana time range to All Time.
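
If you would rather have @timestamp reflect the race date instead of the import time, a date filter after the splits might work. This is only a sketch, assuming the race date survives as [MRData][RaceTable][Races][date] in yyyy-MM-dd format, which is how I read the API response:

    date {
        # set @timestamp from the (assumed) race date field
        match => [ "[MRData][RaceTable][Races][date]", "yyyy-MM-dd" ]
    }

That way the events line up with the actual race calendar in Kibana instead of the time the data was imported.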
