CSV Filter

Hello,

I have highly variable log lines. There are several line types, each with a different column count. My log lines are below:

BDTz;i68;SiEQ_WR;s1;TSz0.01;PFr0.01;PTo999999.999;
BDTz;i70;SiEQ_FN;s1;TSz0.01;PFr0.01;PTo49.999;
BDTz;i70;SiEQ_FN;s1;TSz0.02;PFr50;PTo99.999;
BDTz;i70;SiEQ_FN;s1;TSz0.05;PFr100;PTo249.999;
BDTz;i70;SiEQ_FN;s1;TSz0.1;PFr250;PTo999999.999;
BDTz;i72;SiEQ_PM;s1;TSz0.01;PFr1;PTo19.999;
BDt;i1300;SiGLRYH.E;s1;Mk292;INiMSPOTEQTGLRYH;SYmGLRYH.E;NAmGULER YAT. HOLDING;ISi1204;CUiTRY;CUtTRY;PRt1;VOd1;LDa20140428;ITSz78;NDp3;NDTp3;CLId254;CNyTR;SSc0;STy1;AUmY;TRaY;PTaY;PTb2;MSe56;LSz1;SSvY;MLm1;MLt10000000;TRId96;MMk3362;TRmSI;GRsN;RTyPriceTime;CRaN;
BDt;i1302;SiBLCYT.E;s1;Mk292;INiMSPOTEQTBLCYT;SYmBLCYT.E;NAmBILICI YATIRIM;ISi446;CUiTRY;CUtTRY;PRt1;VOd1;LDa20140428;ITSz78;NDp3;NDTp3;CLId254;CNyTR;SSc0;STy1;AUmY;TRaY;PTaY;PTb2;MSe56;LSz1;SSvY;MLm1;MLt10000000;TRId96;TRmSI;GRsN;RTyPriceTime;CRaN;
BDt;i1306;SiBTCIM.E;s1;Mk292;INiMSPOTEQTBTCIM;SYmBTCIM.E;NAmBATI CIMENTO;ISi944;CUiTRY;CUtTRY;PRt1;VOd1;LDa20140428;ITSz78;NDp3;NDTp3;CLId254;CNyTR;SSc0;STy1;AUmY;TRaY;PTaY;PTb2;MSe64;LSz1;SSvY;MLm1;MLt10000000;TRId96;TRmSI;GRsN;RTyPriceTime;CRaN;
BDt;i1312;SiKOZAL.E;s1;Mk292;INiMSPOTEQTKOZAL;SYmKOZAL.E;NAmKOZA ALTIN;ISi504;CUiTRY;CUtTRY;PRt1;VOd1;LDa20140428;ITSz78;NDp3;NDTp3;CLId254;CNyTR;SSc0;STy1;AUmY;TRaY;PTaY;PTb2;MSe64;LSz1;SSvY;MLm1;MLt10000000;TRId96;TRmSI;GRsN;RTyPriceTime;CRaN;
BDt;i1318;SiDOCO.E;s1;Mk292;INiMSPOTEQTDOCO;SYmDOCO.E;NAmDO-CO;ISi642;CUiTRY;CUtTRY;PRt1;VOd1;LDa20140428;ITSz78;NDp3;NDTp3;CLId254;CNyTR;SSc0;STy1;AUmY;TRaY;PTaY;PTb2;MSe64;LSz1;SSvY;MLm1;MLt10000000;TRId96;TRmSI;GRsN;RTyPriceTime;CRaN;
z;i20542;s1;t095239.337;a3:3.6812;j3:100;k3:1;
z;i33004;s1;t095239.337;a4:3.6136;j4:125;k4:1;a5:3.6137;j5:100;k5:1;
z;i33196;s1;t095239.430;Bw4.1889;Bt12044;Aw4.368;At9052;b2:4.2833;g2:1000;h2:1;b3:4.2501;g3:1;h3:1;b4:4.241;g4:5;h4:1;b5:4.2405;g5:5;h5:1;
z;i20174;s1;t095239.437;b5:137.05;g5:4;h5:1;
z;i29390;s1;t095239.471;d0.0033;Bw1.1624;Bt14131;Aw1.2038;At11470;b1:1.1834;g1:1876;h1:1;
z;i33448;s1;t095239.471;Bw1.1643;Bt9555;Aw1.2048;At10049;a2:1.188;j2:1000;k2:1;a3:1.1881;j3:1000;k3:1;a4:1.1909;j4:1;k4:1;a5:1.1919;j5:1;k5:1;
z;i33196;s1;t095239.537;a1:4.2897;j1:2002;k1:2;a2:4.2925;j2:1;k2:1;a3:4.299;j3:2;k3:1;a4:4.312;j4:5;k4:1;a5:4.3374;j5:1;k5:1;
z;i29076;s1;t095239.537;Bw4.1802;Bt1416;Aw4.2061;At2513;b2:4.1916;g2:1001;h2:1;b3:4.191;g3:2;h3:1;b4:4.1887;g4:2;h4:1;a4:4.1972;j4:1000;k4:1;b5:4.1871;g5:3;h5:2;
z;i35648;s1;t095239.537;d0.0035;Bw1.1615;Bt9234;Aw1.2083;At6480;b1:1.1849;g1:1002;h1:1;
z;i29390;s1;t095239.537;d0.0032;b1:1.1833;g1:2876;h1:2;b2:1.1802;g2:1;h2:1;b3:1.1799;g3:1;h3:1;b4:1.179;g4:1;h4:1;b5:1.1786;g5:1;h5:1;
z;i33196;s1;t095239.637;d0.0092;Bw4.1888;Bt12044;Aw4.368;At9052;b1:4.2843;g1:1001;h1:1;a1:4.2896;j1:1000;k1:1;a2:4.2897;j2:1002;k2:1;a3:4.2925;j3:1;k3:1;a4:4.299;j4:2;k4:1;a5:4.312;j5:5;k5:1;
z;i24498;s1;t095239.637;a3:4.3786;j3:4;k3:1;
z;i22690;s1;t095239.731;d0.825;Bw130.433;Bt8886;Aw135.479;At5785;b1:132.925;g1:137;h1:10;
z;i29076;s1;t095239.737;d0.0019;Bw4.1801;Bt1416;Aw4.2061;At2513;b1:4.1926;g1:150;h1:1;
z;i33196;s1;t095239.817;Bw4.1961;Bt13044;Aw4.368;At9052;b2:4.2837;g2:1000;h2:1;b3:4.2833;g3:1000;h3:1;b4:4.2501;g4:1;h4:1;b5:4.241;g5:5;h5:1;
z;i29076;s1;t095239.817;Bw4.1849;Bt2416;Aw4.2061;At2513;b2:4.1916;g2:2001;h2:2;

The first thing that comes to mind is parsing these lines with the csv filter. When I use the csv filter, the output is as below (a rough sketch of my filter configuration follows the output):

{
"column1" => "z",
"column23" => "j4:2",
"column22" => "a4:4.299",
"column21" => "k3:1",
"column20" => "j3:1",
"column5" => "d0.0092",
"column4" => "t095239.637",
"column3" => "s1",
"column2" => "i33196",
"column28" => nil,
"type" => "tiplog",
"column27" => "k5:1",
"column26" => "j5:5",
"column25" => "a5:4.312",
"column24" => "k4:1",
"path" => "/tmp/tip.log",
"@version" => "1",
"host" => "openshiftallinone",
"column12" => "h1:1",
"column11" => "g1:1001",
"column10" => "b1:4.2843",
"myField" => "z;",
"column19" => "a3:4.2925",
"column18" => "k2:1",
"column17" => "j2:1002",
"message" => "z;i33196;s1;t095239.637;d0.0092;Bw4.1888;Bt12044;Aw4.368;At9052;b1:4.2843;g1:1001;h1:1;a1:4.2896;j1:1000;k1:1;a2:4.2897;j2:1002;k2:1;a3:4.2925;j3:1;k3:1;a4:4.299;j4:2;k4:1;a5:4.312;j5:5;k5:1;\r",
"column16" => "a2:4.2897",
"column15" => "k1:1",
"column14" => "j1:1000",
"column13" => "a1:4.2896",
"@timestamp" => 2017-11-10T11:10:09.126Z,
"column9" => "At9052",
"column8" => "Aw4.368",
"column7" => "Bt12044",
"column6" => "Bw4.1888"
}
{
"column1" => "z",
"myField" => "z;",
"column5" => "a3:4.3786",
"column4" => "t095239.637",
"column3" => "s1",
"column2" => "i24498",
"message" => "z;i24498;s1;t095239.637;a3:4.3786;j3:4;k3:1;\r",
"type" => "tiplog",
"path" => "/tmp/tip.log",
"@timestamp" => 2017-11-10T11:10:09.134Z,
"@version" => "1",
"host" => "openshiftallinone",
"column8" => nil,
"column7" => "k3:1",
"column6" => "j3:4"
}
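
For reference, my filter is roughly like the sketch below; the exact options may differ a little, but because I don't pass a "columns" list, Logstash auto-generates the names column1, column2, ... (the other fields such as type, path and myField are added elsewhere in the pipeline):

filter {
  csv {
    separator => ";"
    # no "columns" option, so the filter names the fields column1, column2, ... automatically
  }
}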

My second requirement: I want to parse each CSV column again, but every line produces a different column count, so the column names are auto-generated as column1, column2, ..., columnN.

So I want to parse each column again, using a grok filter or another filter. This second filter should read from the beginning of the string, match the leading letters, and take what is on the left (the letters) as the key and what is on the right as the value, as explained below.

For the sample line:

z;i33448;s1;t095239.471;Bw1.1643;Bt9555;Aw1.2048;At10049;a2:1.188;j2:1000;k2:1;a3:1.1881;j3:1000;k3:1;a4:1.1909;j4:1;k4:1;a5:1.1919;j5:1;k5:1;

First, split the line on the ";" separator:

z
i33448
s1
t095239.471
Bw1.1643
Bt9555
Aw1.2048
At10049
a2:1.188
j2:1000
k2:1
a3:1.1881
j3:1000
k3:1
a4:1.1909
j4:1
k4:1
a5:1.1919
j5:1
k5:1

Then the second filter parses each token into a key and a value:

Message_Type=z
i=33448 (also rename i to ID)
s=1 (also rename s to SourceSystem)
t=095239.471
Bw=1.1643
Bt=9555
Aw=1.2048
At=10049
a=2:1.188
j=2:1000
k=2:1

As I mention above, I also want to rename the characters on the left side of each equation (the key).
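
To make the idea concrete, here is a rough, untested sketch of the logic I imagine, using a ruby filter; the renames to ID and SourceSystem are just the names I would like to end up with:

filter {
  ruby {
    code => '
      renames = { "i" => "ID", "s" => "SourceSystem" }
      tokens  = event.get("message").to_s.split(";")
      event.set("Message_Type", tokens.shift)       # first token is the message type, e.g. "z"
      tokens.each do |tok|
        next if tok.strip.empty?
        if tok =~ /\A([A-Za-z]+)(.*)\z/             # leading letters = key, the rest = value
          key = renames.fetch($1, $1)               # rename i -> ID, s -> SourceSystem
          event.set(key, $2)
        end
      end
    '
  }
}

This simple version would overwrite keys that repeat within one line (a, j and k appear several times with different depth numbers), so that part would still need a better approach; it is only meant to show the key/value split and the renaming I described.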

Is it possible to do this with the csv filter, or is there another way to parse these lines?
