How to filter uwsgi logs based on the date inside the log line

[pid: 5413|app: 0|req: 374528/2711349] 107.167.105.4 () {82 vars in 1747 bytes} [Tue Nov 8 11:47:28 2016] GET /api/v1/core/capturedata/?tag_id=a1aa493710fc4584aafd4c066119e6a0&cid=a762b82eba66493694058cf4602f6e5a => generated 42 bytes in 4 msecs (HTTP/1.1 200) 3 headers in 96 bytes (1 switches on core 0)
[pid: 5435|app: 0|req: 374397/2711350] 192.140.221.101 () {72 vars in 1561 bytes} [Tue Nov 16 11:47:28 2016] GET /api/v1/core/capturedata/?tag_id=a1aa493710fc4584aafd4c066119e6a0&cid=a762b82eba66493694058cf4602f6e5a&temp=0.29777087707226646 => generated 42 bytes in 3 msecs (HTTP/1.1 200) 3 headers in 96 bytes (1 switches on core 0)

Here I want only those logs dated after [Mon Nov 15] to be shipped to Elasticsearch. Is there any way to configure Logstash for this? I know how it works with @timestamp, but is it possible to work with this kind of date format?

Assuming you're using grok to extract the timestamp into a separate field, you can use a date filter to parse it. However, I'm not sure you can use a conditional on that field to accomplish what you want (not without a ruby filter, anyway). I'd probably use a simple string match on the timestamp field:

output {
  # Only ship events whose extracted timestamp falls on Nov 15.
  if [timestamp] =~ /^\w+ Nov 15 / {
    elasticsearch { ... }
  }
}
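
For reference, the grok/date extraction mentioned above could look something like this. This is an untested sketch assuming the uwsgi log format shown in the question, with the extracted field named timestamp:

filter {
  grok {
    # Pull the bracketed request date out of the uwsgi line; the " +"
    # absorbs the extra padding space before single-digit days.
    match => {
      "message" => "\[(?<timestamp>%{DAY} %{MONTH} +%{MONTHDAY} %{TIME} %{YEAR})\]"
    }
  }
  date {
    # Two patterns to cover padded and unpadded days ("Nov  8" vs "Nov 16").
    match => [ "timestamp", "EEE MMM  d HH:mm:ss yyyy", "EEE MMM d HH:mm:ss yyyy" ]
  }
}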

Thanks for the help, but I don't think I can apply a range over a string-format date. And yes, I will try the ruby filter; that might solve my issue.

Your question didn't talk about any date ranges.
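
If you do need a real range (everything after Nov 15), a ruby filter along these lines might work. This is an untested sketch that assumes the timestamp field extracted by the grok above and hard-codes the cutoff date:

filter {
  ruby {
    # Parse the extracted timestamp and drop anything before the cutoff.
    code => "
      require 'time'
      t = Time.strptime(event.get('timestamp'), '%a %b %e %H:%M:%S %Y') rescue nil
      event.cancel if t.nil? || t < Time.new(2016, 11, 15)
    "
  }
}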

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.