Logstash SPLIT plugin discards/strips empty strings at the end

Hello,

I'm trying to parse log data where each line consists of string fields separated by a tab ("\t") delimiter. I'm forwarding the logs with Filebeat to Logstash, and in Logstash I capture each line as a GREEDYDATA field, since each line may contain a different number of fields with actual data in them (some of them may be empty). I then use the mutate filter's split option with the tab delimiter to split the line into an array. However, if a log line has empty fields at the end, split automatically discards them. Is there any way to keep those fields with a null/empty-string value?

My input data looks like this:

  • A\tB\tC\t\t\t

Normally, I should be able to extract 6 fields with the "\t" delimiter (A, B, C, and three empty strings). However, I'm only getting 3 fields, because the empty ones at the end get discarded.
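
In other words, for the sample line above I'd expect to end up with something like this (field names taken from my mutate config below; the last three should simply be empty strings):

    fieldA => "A"
    fieldB => "B"
    fieldC => "C"
    fieldD => ""
    fieldE => ""
    fieldF => ""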

This is what my grok looks like:

    grok
    {
        match => ["message", "%{GREEDYDATA:trx}"]
    }

This is what my mutate/split looks like:

    mutate
    {
        # the delimiter below is a literal tab character
        split =>
        {
            "trx" => "	"
        }
        add_field => { "fieldA" => "%{[trx][0]}" }
        add_field => { "fieldB" => "%{[trx][1]}" }
        add_field => { "fieldC" => "%{[trx][2]}" }
        add_field => { "fieldD" => "%{[trx][3]}" }
        add_field => { "fieldE" => "%{[trx][4]}" }
        add_field => { "fieldF" => "%{[trx][5]}" }
    }

Is there any way I can get all six fields even when the ones at the end are empty?

Hi,
I would recommend a dissect filter, since dissect matches on the literal delimiters and simply leaves the trailing fields empty:

dissect {
    mapping => {
        # the separators between the %{} fields below are literal tab characters
        "message" => "%{fieldA}	%{fieldB}	%{fieldC}	%{fieldD}	%{fieldE}	%{fieldF}"
    }
}
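
If you would rather keep your mutate/split approach, another option is a ruby filter: Ruby's String#split keeps trailing empty strings when you pass it a negative limit, and as far as I know the mutate filter's split option doesn't expose that limit. A minimal sketch, assuming the raw tab-delimited line is still in the message field and you want the array in trx:

ruby {
    code => '
        # split("\t", -1) keeps the trailing empty strings that mutate/split drops
        event.set("trx", event.get("message").split("\t", -1))
    '
}

Your existing add_field lines reading %{[trx][0]} to %{[trx][5]} should then see all six entries.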

Hope this helps.

