Failed to convert a field's data type from string to float

Hi experts,

I have run into a data type conversion issue. My requirements are:

  1. parse an XML file ------ done
  2. collect the node information I need ------ done
  3. add a field to hold the retrieved value ------ done
  4. convert the data type of this field from string to float, which I need in order to draw a trend in Kibana ------ blocking

The input file is:

<?xml version="1.0" encoding="UTF-8"?>
<OMeS version="2.3">
  <PMSetup startTime="2018-03-09T07:50:00.000+01:00" interval="5">
    <PMMOResult>
      <MO dimension="network_element">
        <DN>NTAS-tas01/HOSTNAME-cbam-4c63de7aeae2460289cd4197dc7-admintd-node-0/DBTYPE-cmdb</DN>
      </MO>
      <PMTarget measurementType="DBMEAS">
        <M704B1C1>32</M704B1C1>
        <M704B1C2>216</M704B1C2>
        <M704B1C3>0</M704B1C3>
        <M704B1C4>0</M704B1C4>
      </PMTarget>
    </PMMOResult>
    <PMMOResult>
      <MO dimension="network_element">
        <DN>NTAS-tas01/HOSTNAME-cbam-4c63de7aeae2460289cd4197dc7-admintd-node-1/DBTYPE-cmdb</DN>
      </MO>
      <PMTarget measurementType="DBMEAS">
        <M704B1C1>36</M704B1C1>
        <M704B1C2>96</M704B1C2>
        <M704B1C3>0</M704B1C3>
        <M704B1C4>1</M704B1C4>
      </PMTarget>
    </PMMOResult>
    <PMMOResult>
      <MO dimension="network_element">
        ... ... (about 18 nodes) ... ...

My filter pattern is:

input {

        file {
                path => ["/home/admin/log/NTASlog/*xml"]
                start_position => "beginning"
                type => "pmxmllog"
                sincedb_path => "/home/admin/log/NTASlog/.sincedb_file"

                codec => multiline {
                        pattern => "<?xml version"
                        #auto_flush_interval => 5
                        #max_lines => 60000
                        what => "previous"
                        negate => true
                }

        }

}
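One note on the multiline codec above: with negate => true and what => "previous", every line that does not match <?xml version is appended to the previous event, so each XML document becomes one event. The last document in a file, however, is only flushed once the next <?xml version line arrives. If the final event seems to be missing, enabling the auto_flush_interval option that is already present (commented out) should help; a sketch:

        codec => multiline {
                pattern => "<?xml version"
                negate => true
                what => "previous"
                auto_flush_interval => 5   # flush the pending event after 5s of inactivity
        }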

filter {

        if [type] == "pmxmllog" {

                xml {
                        source => "message"
                        target => "parsed"

                        xpath => [
                                "/OMeS/PMSetup/@startTime", "audit_time",
                                "/OMeS/PMSetup/PMMOResult/MO/@dimension", "ELKNE",
                                "/OMeS/PMSetup/PMMOResult/MO/DN/text()", "ELKDN",
                                "/OMeS/PMSetup/PMMOResult/PMTarget/@measurementType", "ELKPMtype",
                                "/OMeS/PMSetup/PMMOResult/PMTarget/*[contains(local-name(), 'C1')]/text()", "ELKPV1",
                                "/OMeS/PMSetup/PMMOResult/PMTarget/*[contains(local-name(), 'C2')]/text()", "ELKPV2",
                                "/OMeS/PMSetup/PMMOResult/PMTarget/*[contains(local-name(), 'C3')]/text()", "ELKPV3",
                                "/OMeS/PMSetup/PMMOResult/PMTarget/*[contains(local-name(), 'C4')]/text()", "ELKPV4",
                                "/OMeS/PMSetup/PMMOResult/PMTarget/*[contains(local-name(), 'C5')]/text()", "ELKPV5"
                        ]


                }
        }

        mutate {
                add_field => {
                                "container1" => "%{[ELKDN][0]}"
                }
                add_field => {
                                "peakvalue1" => "%{[ELKPV1][0]}"
                                "peakvalue2" => "%{[ELKPV1][1]}"
                                "peakvalue3" => "%{[ELKPV1][2]}"
                                "peakvalue4" => "%{[ELKPV1][3]}"
                }

# this convert doesn't take effect.
                convert => [
                                "peakvalue1" , "float",
                                "peakvalue2" , "float",
                                "peakvalue3" , "float",
                                "peakvalue4" , "float"
                ]

        }

#       there is no dissect plugin
#       dissect {
#               convert_datatype => {
#                                       peakvalue1 => "float"
#                               }
#       }

        date {
                match => [ "audit_time", "YYYY-MM-DD HH:MM:SS.SSS" ]
                timezone =>  "UTC"

        }

}
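As an aside, the date filter's pattern "YYYY-MM-DD HH:MM:SS.SSS" will not match the startTime value (2018-03-09T07:50:00.000+01:00), which is ISO8601; in Joda syntax, DD is day-of-year and MM/SS mean different things in date and time positions. Something like the sketch below is more likely to work. Note that fields produced by the xpath option are arrays, so audit_time may first need to be copied out of the array (the audit_ts field name here is just an illustration):

        mutate {
                add_field => { "audit_ts" => "%{[audit_time][0]}" }
        }
        date {
                match => [ "audit_ts", "ISO8601" ]
                # no timezone option needed: the ISO8601 value carries its own offset
        }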


output {
        elasticsearch {
                        hosts => ["localhost:9200"]
        }
}

The output from Kibana:

From the output, we can see that the field named "peakvalue1" is still a string, so I can't draw a trend with it in Kibana. Can you help me check it? Are there other ways to achieve my goal? Thank you.

Can you try the below for conversion:

 convert => {
            "peakvalue1" => "float"
            "peakvalue2" => "float"
            "peakvalue3" => "float"
            "peakvalue4" => "float"
        }

I have done three things:

  1. I saw a similar issue, so I split the add_field and the convert into two separate mutate filters.
  2. I used the convert format you mentioned:

      mutate {
             add_field => {
                             "container1" => "%{[ELKDN][0]}"
             }
             add_field => {
                             "peakvalue1" => "%{[ELKPV1][0]}"
                             "peakvalue2" => "%{[ELKPV1][1]}"
                             "peakvalue3" => "%{[ELKPV1][2]}"
                             "peakvalue4" => "%{[ELKPV1][3]}"
             }
      }

      mutate {
             convert => {
                             "peakvalue1" => "float"
                             "peakvalue2" => "float"
                             "peakvalue3" => "float"
                             "peakvalue4" => "float"
             }
      }

  3. Once the data was fully ingested, I refreshed the index pattern in Kibana.

Then the convert worked! This had blocked me for a long time.
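For the record, the likely reason the split into two mutate filters matters: add_field is a common filter option that Logstash applies only after the filter's own operations (including convert) have run successfully. In the original single mutate, convert therefore ran before the peakvalue fields existed. Splitting guarantees the fields exist before the second mutate converts them; a minimal sketch:

      # mutate 1: create the field from the xpath result array
      mutate {
             add_field => { "peakvalue1" => "%{[ELKPV1][0]}" }
      }
      # mutate 2: the field now exists, so convert can change its type
      mutate {
             convert => { "peakvalue1" => "float" }
      }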


Thanks a lot for your reply!


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.