How to filter object, float, and string datatype values coming from a single variable

I am trying to parse the data which is in the following format:

{
          "node" => "10.0.1.136",
     "timestamp" => "2016-02-23 11:47:09",
     "namespace" => "statistics",
         "stats" => {
             "error" => nil,
             "devid" => 0,
             "value" => [
            [0] {
                  "in_min" => 162,
                "time_max" => 7260,
                  "in_max" => 162,
                "time_avg" => 5277.64697265625,
                 "out_min" => 240,
                 "out_max" => 396,
                 "in_rate" => 2315.375244140625,
                   "op_id" => 4294967295,
                "op_count" => 68,
                 "op_rate" => 14.29244041442871,
                "out_rate" => 4800.3681640625,
                "time_min" => 4389
            }
        ],
               "key" => "cluster.protostats.papi.total",
              "time" => 1456208163,
        "error_code" => nil,
              "list" => "{\"in_min\"=>162, \"time_max\"=>7260, \"in_max\"=>162, \"time_avg\"=>#<BigDecimal:542cb627,'0.527764697265625E4',15(16)>, \"out_min\"=>240, \"out_max\"=>396, \"in_rate\"=>#<BigDecimal:dac417f,'0.2315375244140625E4',16(20)>, \"op_id\"=>4294967295, \"op_count\"=>68, \"op_rate\"=>#<BigDecimal:28875d3a,'0.1429244041442871E2',16(20)>, \"out_rate\"=>#<BigDecimal:38f25909,'0.48003681640625E4',14(16)>, \"time_min\"=>4389}"
    },
      "@version" => "1",
    "@timestamp" => "2016-02-23T06:17:10.121Z",
          "host" => "parth.crest.loc",
       "command" => "python /home/elastic/collect.py -inputFile input.txt",
          "tags" => [
        [0] "_jsonparsefailure"
    ]
}
{
          "node" => "10.0.1.136",
     "timestamp" => "2016-02-23 11:47:09",
     "namespace" => "statistics",
         "stats" => {
             "error" => nil,
             "devid" => 0,
             "value" => [],
               "key" => "cluster.protostats.hdfs.total",
              "time" => 1456208163,
        "error_code" => nil,
              "list" => nil
    },
      "@version" => "1",
    "@timestamp" => "2016-02-23T06:17:10.121Z",
          "host" => "parth.crest.loc",
       "command" => "python /home/elastic/collect.py -inputFile input.txt"
}
{
          "node" => "10.0.1.136",
     "timestamp" => "2016-02-23 11:47:09",
     "namespace" => "statistics",
         "stats" => {
             "error" => nil,
             "devid" => 0,
             "value" => 1054.2,
               "key" => "cluster.net.ext.bytes.in.rate",
              "time" => 1456208160,
        "error_code" => nil
    },
      "@version" => "1",
    "@timestamp" => "2016-02-23T06:17:10.121Z",
          "host" => "parth.crest.loc",
       "command" => "python /home/elastic/collect.py -inputFile input.txt"
}

The config filter I wrote is:

filter {
    if [stats][key] =~ /cluster\.protostats.*/ {
        mutate {
            add_field => { "[stats][list]" => "%{[stats][value]}" }
        }
        json {
            source => "[stats][list]"
            target => "[stats][list]"
        }
    }
}

Is it possible for the [stats][value] field to have multiple data types?
How can I parse [stats][list] as a JSON object and get its values?

Is it possible for the [stats][value] field to have multiple data types?

To contain subfields with different types? Yes.

How can I parse [stats][list] as a JSON object and get its values?

[stats][list] isn't JSON, it's a stringified Ruby hash (note the `=>` separators and the `#<BigDecimal:...>` inspect output). Maybe you can use a ruby filter to parse it back into a Ruby hash.
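A rough sketch of that idea (my own helper, not a Logstash built-in). The BigDecimal values appear as inspect output like `#<BigDecimal:542cb627,'0.527764697265625E4',15(16)>`, which isn't parseable as-is, so the quoted scientific-notation literal inside it is substituted back in first, then the `"key"=>value` pairs are pulled out with a regex:

```ruby
# Hypothetical helper to recover key/value pairs from a stringified Ruby hash.
def parse_ruby_hash(str)
  # Replace BigDecimal inspect output with its plain numeric literal,
  # e.g. #<BigDecimal:542cb627,'0.527764697265625E4',15(16)> -> 0.527764697265625E4
  cleaned = str.gsub(/#<BigDecimal:[^,]*,'([^']*)',[^>]*>/, '\1')
  pairs = {}
  # Pull out each "key"=>value pair; values run until the next comma or brace.
  cleaned.scan(/"([^"]+)"\s*=>\s*([^,}]+)/) do |key, raw|
    val = raw.strip
    # Treat anything that looks numeric (including 0.52E4 notation) as a float.
    pairs[key] = (val =~ /\A-?[\d.]+(E-?\d+)?\z/i) ? val.to_f : val
  end
  pairs
end
```

Inside a ruby filter you could then do something like `event.set("[stats][list]", parse_ruby_hash(event.get("[stats][list]")))`. This is a sketch, not a robust parser — it assumes flat hashes with string keys and numeric values like the sample above.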

I am getting 3 different types of values for "stats.value", i.e. [stats][value],
as shown in my previous post:

  1. An empty list
  2. A list
  3. A float value

Elasticsearch won't like that. A field with a given name needs to have the same type across all documents in an index. Empty array vs. non-empty array is fine, but when it's a float value you should rename the field.
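A hedged sketch of that rename (assuming, as in the samples above, that the scalar values come with keys like `cluster.net.ext.bytes.in.rate`; `[stats][value_float]` is a made-up target name):

```
filter {
  if [stats][key] =~ /^cluster\.net\./ {
    mutate {
      # Move the scalar out of the array-typed field name
      rename => { "[stats][value]" => "[stats][value_float]" }
    }
  }
}
```

That way the array-valued documents keep [stats][value] and the float-valued ones map a separate field, so the index mapping never sees conflicting types for one name.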

Thanks! That worked.