Logstash filter problem with PostgreSQL logs

Hi All,
I'm wondering whether anyone here uses Logstash to filter PostgreSQL logs. I've tried many approaches without success. The PostgreSQL log looks roughly like the sample below; does anyone have a good solution?

2017-10-02 16:00:06 CST[2017-10-02 15:59:13 CST]59d1f1d1.61ad[25005]10.0.10.* username || LOG: duration: 9904.724 ms execute : select ***************************************
2017-10-02 16:00:06 CST[2017-10-02 15:59:13 CST]59d1f1d1.61ad[25005]10.0.10.* username || DETAIL: parameters: $1 = '1'
2017-10-02 16:00:07 CST[2017-10-02 15:59:45 CST]59d1f1f1.63f4[25588]10.0.10.* username || LOG: duration: 14514.898 ms execute : select ***************************************
2017-10-02 16:00:07 CST[2017-10-02 15:59:45 CST]59d1f1f1.63f4[25588]10.0.10.* username || DETAIL: parameters: $1 = '', $2 = '2', $3 = '3', $4 = '6', $5 = '4', $6 = '20171002', $7 = '20180102', $8 = '', $9 = ''
2017-10-02 16:00:50 CST[2017-10-02 16:00:34 CST]59d1f222.6808[26632]10.0.10.
username || LOG: duration: 15381.794 ms execute : select *****************************
2017-10-02 16:00:50 CST[2017-10-02 16:00:34 CST]59d1f222.6808[26632]10.0.10.
username || DETAIL: parameters: $1 = '', $2 = '', $3 = '', $4 = '20171009'
2017-10-02 16:00:54 CST[2017-10-02 16:00:13 CST]59d1f20d.6665[26213]10.0.10.
username || LOG: duration: 14386.347 ms execute : select *********************************
2017-10-02 16:00:54 CST[2017-10-02 16:00:13 CST]59d1f20d.6665[26213]10.0.10.
username || DETAIL: parameters: $1 = ''
2017-10-02 16:01:32 CST[2017-10-02 16:00:41 CST]59d1f229.68b8[26808]10.0.10.
username || LOG: duration: 47126.369 ms execute : select **********************************
2017-10-02 16:01:32 CST[2017-10-02 16:00:41 CST]59d1f229.68b8[26808]10.0.10.
username || DETAIL: parameters: $1 = '
', $2 = '', $3 = '
', $4 = '20171109'
2017-10-02 16:01:38 CST[2017-10-02 16:01:11 CST]59d1f247.6b8a[27530]10.0.10.
username || LOG: duration: 7699.837 ms execute : select ***************************************
2017-10-02 16:01:38 CST[2017-10-02 16:01:11 CST]59d1f247.6b8a[27530]10.0.10.
username || DETAIL: parameters: $1 = '23018', $2 = '00', $3 = '10', $4 = '
', $5 = '', $6 = '
', $7 = '', $8 = '', $9 = '', $10 = '10', $11 = '15', $12 = '20', $13 = '90', $14 = '', $15 = '', $16 = '', $17 = '', $18 = '20170715', $19 = '20180102'
2017-10-02 16:01:41 CST[2017-10-02 15:59:36 CST]59d1f1e8.6336[25398]10.0.10.
username || LOG: duration: 123312.666 ms execute : select *******************************************
2017-10-02 16:01:41 CST[2017-10-02 15:59:36 CST]59d1f1e8.6336[25398]10.0.10.
username || DETAIL: parameters: $1 = '
', $2 = '
', $3 = '', $4 = '20171006'
2017-10-02 16:02:23 CST[2017-10-02 16:01:11 CST]59d1f247.6b84[27524]10.0.10.
username || LOG: duration: 67953.366 ms execute : select *******************************************
2017-10-02 16:02:23 CST[2017-10-02 16:01:11 CST]59d1f247.6b84[27524]10.0.10.
username || DETAIL: parameters: $1 = '20170901', $2 = '
', $3 = '****', $4 = '*', $5 = ''

Hi, looking at your logs, is your main goal to structure them? By "filter", do you mean extracting fields with Grok, or is there some other requirement?

Hi, my main goal is indeed to structure the data, and then analyze it with Elasticsearch and Kibana.

If structuring is the goal, a Grok filter can do it.
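A rough sketch of what such a pipeline could look like, based only on the sample lines above. Two things to note: the DETAIL parameter values in the sample contain literal newlines, so the events need to be reassembled first (here with the multiline codec, joining any line that does not start with a timestamp to the previous event); and the file path, field names, and the exact grok pattern below are all assumptions to be adapted to the real log format (e.g. the actual `log_line_prefix` setting):

```conf
input {
  file {
    path => "/var/log/postgresql/postgresql.log"   # hypothetical path, adjust
    codec => multiline {
      # Every new event starts with a timestamp; anything else (such as
      # parameter values that contain newlines) belongs to the previous event.
      pattern => "^%{TIMESTAMP_ISO8601} "
      negate => true
      what => "previous"
    }
  }
}

filter {
  grok {
    # Matches lines shaped like:
    # 2017-10-02 16:00:06 CST[2017-10-02 15:59:13 CST]59d1f1d1.61ad[25005]<client> username || LOG: ...
    # DATA is used for the client address because it is masked in the sample.
    match => { "message" => "%{TIMESTAMP_ISO8601:log_time} %{WORD:log_tz}\[%{TIMESTAMP_ISO8601:session_start} %{WORD}\]%{BASE16NUM:session_id}\.%{BASE16NUM:session_line}\[%{POSINT:pid}\]%{DATA:client_addr} %{USERNAME:db_user} \|\| %{WORD:level}: %{GREEDYDATA:msg}" }
  }

  # Pull the duration out of LOG lines so Kibana can aggregate on it.
  if [level] == "LOG" {
    grok {
      match => { "msg" => "duration: %{NUMBER:duration_ms:float} ms" }
      tag_on_failure => []
    }
  }

  date {
    match => [ "log_time", "yyyy-MM-dd HH:mm:ss" ]
    # Assumption: "CST" here means China Standard Time; adjust if it is
    # US Central time in your environment.
    timezone => "Asia/Shanghai"
  }
}
```

You can iterate on the grok pattern against your real (unmasked) lines in Kibana's Grok Debugger before wiring it into the pipeline.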

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.