Parsing csv file through Logstash

But if serverName is the key, then where does the value go? Here is one option for what you could do -- one event per server. Change the json filters to store the parsed data in [@metadata], and then use ruby to combine the two hashes:

    json { source => "[@metadata][userCalls]" target => "[@metadata][parsedUserCalls]" }
    json { source => "[@metadata][totalCalls]" target => "[@metadata][parsedTotalCalls]" }
    ruby {
        code => '
            a = []
            userCalls = event.get("[@metadata][parsedUserCalls]")
            totalCalls = event.get("[@metadata][parsedTotalCalls]")
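            # build one entry per server, keyed by the server names in totalCalls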
            totalCalls.each { |k, v|
                server = {}
                server["serverName"] = k
                server["totalCalls"] = v
                server["userCalls"] = userCalls[k]
                a << server
            }
            event.set("servers", a)
        '
    }

That will get you

   "servers" => [
    [ 0] {
         "userCalls" => {},
        "serverName" => "CiCServer-20210701:20210930",
        "totalCalls" => 0
    },
    [ 1] {
         "userCalls" => {
            "empty" => 3.0,
            "fqenv" => 3.0
        },
        "serverName" => "gccdServer",
        "totalCalls" => 6
    },

etc. You might want to use a split filter to separate each server into its own event, and then a ruby filter to move the contents of [servers] up to the root.
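
Here is a minimal sketch of that, assuming the array is in [servers] as built above; the key-promotion logic in the ruby filter is my guess at what you want, so adjust it to your layout:

    split { field => "servers" }
    ruby {
        code => '
            # After the split, [servers] holds a single hash per event.
            # Promote each of its keys to a top-level field, then drop it.
            server = event.get("servers")
            if server.is_a?(Hash)
                server.each { |k, v| event.set(k, v) }
                event.remove("servers")
            end
        '
    }

After that, each event would have serverName, totalCalls, and userCalls at the root.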

But do not let my guesses drive your data design. You need to describe exactly how you want the keys and values to appear.