Apache live stats using elasticsearch and logstash


(user16) #1

Hi, I am currently collecting apache access logs using the logstash
centralized setup and all the logs are being collected properly. I am
trying to display the visitors from the last 2 minutes (something similar
to the Splunk Google Map of Visitors:
http://docs.splunk.com/Documentation/WebIntel/latest/User/BMgooglemapofvisitors)

I want to get a list of source IP addresses from the apache logs, look
them up in the geoip database, and show them on Google Maps. My problem
is that there are multiple log entries like the ones below, and I am
having trouble extracting just the source IP addresses.

Q. Is there a way in elasticsearch to apply a "UNIQUE" or "DISTINCT"
filter to get just the unique source IPs for that 2-minute time period?

2012-04-30T04:23:07.000Z
60.0.0.181 - - [30/Apr/2012:14:23:07 +1000] "POST
/posttoserver.php?x=0.5323159941472113 HTTP/1.1" 200 10
"http://mywebserver/folder/test.swf?v=10" "Mozilla/5.0 (compatible; MSIE
9.0; Windows NT 6.0; Trident/5.0)"

2012-04-30T04:23:07.000Z
60.0.0.181 - - [30/Apr/2012:14:23:07 +1000] "POST
/posttoserver.php?x=0.7869502729736269 HTTP/1.1" 200 10
"http://mywebserver/folder/test.swf?v=10" "Mozilla/5.0 (compatible; MSIE
9.0; Windows NT 6.0; Trident/5.0)"

2012-04-30T04:23:26.000Z
124.0.0.83 - - [30/Apr/2012:14:23:26 +1000] "POST
/posttoserver.php?x=0.48628548765555024 HTTP/1.1" 200 11
"http://mywebserver/folder/test.swf?v=10" "Mozilla/5.0 (Macintosh; Intel
Mac OS X 10_5_8) AppleWebKit/534.50.2 (KHTML, like Gecko) Version/5.0.6
Safari/533.22.3"

2012-04-30T04:23:26.000Z
124.0.0.83 - - [30/Apr/2012:14:23:26 +1000] "POST
/posttoserver.php?x=0.808886235114187 HTTP/1.1" 200 11
"http://mywebserver/folder/test.swf?v=10" "Mozilla/5.0 (Macintosh; Intel
Mac OS X 10_5_8) AppleWebKit/534.50.2 (KHTML, like Gecko) Version/5.0.6
Safari/533.22.3"

2012-04-30T04:23:28.000Z
110.0.0.196 - - [30/Apr/2012:14:23:28 +1000] "POST
/posttoserver.php?x=0.33912599040195346 HTTP/1.1" 200 11
"http://mywebserver/folder/test.swf?v=10" "Mozilla/5.0 (compatible; MSIE
9.0; Windows NT 6.1; WOW64; Trident/5.0)"

2012-04-30T04:23:28.000Z
110.0.0.196 - - [30/Apr/2012:14:23:28 +1000] "POST
/posttoserver.php?x=0.5306816347874701 HTTP/1.1" 200 11
"http://mywebserver/folder/test.swf?v=10" "Mozilla/5.0 (compatible; MSIE
9.0; Windows NT 6.1; WOW64; Trident/5.0)"

2012-04-30T04:23:31.000Z
110.0.0.196 - - [30/Apr/2012:14:23:31 +1000] "POST
/posttoserver.php?x=0.8703127126209438 HTTP/1.1" 200 11
"http://mywebserver/folder/test.swf?v=10" "Mozilla/5.0 (compatible; MSIE
9.0; Windows NT 6.1; WOW64; Trident/5.0)"

2012-04-30T04:23:31.000Z
124.0.0.83 - - [30/Apr/2012:14:23:31 +1000] "POST
/posttoserver.php?x=0.7754105641506612 HTTP/1.1" 200 11
"http://mywebserver/folder/test.swf?v=10" "Mozilla/5.0 (Macintosh; Intel
Mac OS X 10_5_8) AppleWebKit/534.50.2 (KHTML, like Gecko) Version/5.0.6
Safari/533.22.3"

2012-04-30T04:23:31.000Z
110.0.0.196 - - [30/Apr/2012:14:23:31 +1000] "POST
/posttoserver.php?x=0.8894465500488877 HTTP/1.1" 200 11
"http://mywebserver/folder/test.swf?v=10" "Mozilla/5.0 (compatible; MSIE
9.0; Windows NT 6.1; WOW64; Trident/5.0)"

2012-04-30T04:23:31.000Z
124.0.0.83 - - [30/Apr/2012:14:23:31 +1000] "POST
/posttoserver.php?x=0.6088048042729497 HTTP/1.1" 200 11
"http://mywebserver/folder/test.swf?v=10" "Mozilla/5.0 (Macintosh; Intel
Mac OS X 10_5_8) AppleWebKit/534.50.2 (KHTML, like Gecko) Version/5.0.6
Safari/533.22.3"

2012-04-30T04:23:33.000Z
110.0.0.196 - - [30/Apr/2012:14:23:33 +1000] "POST
/posttoserver.php?x=0.2770046340301633 HTTP/1.1" 200 11
"http://mywebserver/folder/test.swf?v=10" "Mozilla/5.0 (compatible; MSIE
9.0; Windows NT 6.1; WOW64; Trident/5.0)"

2012-04-30T04:23:33.000Z
110.0.0.196 - - [30/Apr/2012:14:23:33 +1000] "POST
/posttoserver.php?x=0.08373264269903302 HTTP/1.1" 200 11
"http://mywebserver/folder/test.swf?v=10" "Mozilla/5.0 (compatible; MSIE
9.0; Windows NT 6.1; WOW64; Trident/5.0)"

2012-04-30T04:23:35.000Z
124.0.0.83 - - [30/Apr/2012:14:23:35 +1000] "POST
/posttoserver.php?x=0.673358547501266 HTTP/1.1" 200 11
"http://mywebserver/folder/test.swf?v=10" "Mozilla/5.0 (Macintosh; Intel
Mac OS X 10_5_8) AppleWebKit/534.50.2 (KHTML, like Gecko) Version/5.0.6
Safari/533.22.3"

2012-04-30T04:23:35.000Z
124.0.0.83 - - [30/Apr/2012:14:23:35 +1000] "POST
/posttoserver.php?x=0.7178014349192381 HTTP/1.1" 200 11
"http://mywebserver/folder/test.swf?v=10" "Mozilla/5.0 (Macintosh; Intel
Mac OS X 10_5_8) AppleWebKit/534.50.2 (KHTML, like Gecko) Version/5.0.6
Safari/533.22.3"

Thanks


(Igor Motov) #2

If the IP addresses are indexed into a separate field, you can use a Terms
Facet (http://www.elasticsearch.org/guide/reference/api/search/facets/terms-facet.html)
on the source IP field, combined with a query that limits results to the
last 2 minutes, to get a list of all unique IP addresses.
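As a sketch of what that request body could look like (the field names
"@timestamp" and "clientip" are assumptions here; substitute whatever
your mapping actually uses):

```json
{
  "query": {
    "range": {
      "@timestamp": { "gte": "now-2m" }
    }
  },
  "size": 0,
  "facets": {
    "unique_ips": {
      "terms": {
        "field": "clientip",
        "size": 100
      }
    }
  }
}
```

Setting "size": 0 skips returning the matching documents themselves, so
the response contains only the facet with the distinct IPs and their counts.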

On Monday, April 30, 2012 12:40:25 AM UTC-4, user16 wrote:


(user16) #3

Thanks Igor, I am still very new to elasticsearch. Can you please point me
to some examples of how this can be done?

On Monday, April 30, 2012 9:29:23 PM UTC+10, Igor Motov wrote:


(Igor Motov) #4

Sure. I just need to know the date and source IP field names. Could you
share the mapping for your logstash index?

On Tuesday, May 1, 2012 1:27:50 AM UTC-4, user16 wrote:



(user16) #5

Hi, thanks for your help. I have been reading some documentation, and when
I try the POST query below:

{
  "query": { "match_all": {} },
  "facets": {
    "histo1": {
      "histogram": {
        "field": "_timestamp",
        "time_interval": "1.5h"
      }
    }
  }
}

I get the result below (note: the facets section is empty). Using the
elasticsearch-head plugin, under the browser there is no field called
"_timestamp". Should there be one? I am using logstash to feed into
graylog2.

Thanks

{
  "took": 26,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "failed": 0
  },
  "hits": {
    "total": 39411,
    "max_score": 1.0,
    "hits": [
      {
        "_index": "graylog2",
        "_type": "message",
        "_id": "tS1q1ebbTBOW0pcFjidnzg",
        "_score": 1.0,
        "_source": {
          "host": "hostname",
          "_clientip": "58.1.7.171",
          "full_message": "GET /test_folder/text.txt HTTP/1.1\" 200 10 \"-\" \"Dalvik/1.6.0 (Linux; U; Android 4.0.3; HTC Velocity 4G Build/IML74K)\" 58.1.7.171 - - [02/May/2012:15:09:47 +1000] ",
          "_timestamp": "02/May/2012:15:09:47 +1000",
          "line": 151,
          "_ident": "-",
          "_agent": "Dalvik/1.6.0 (Linux; U; Android 4.0.3; HTC Velocity 4G Build/IML74K)",
          "message": "GET /test_folder/text.txt HTTP/1.1\" 200 10 \"-\" \"Dalvik/1.6.0 (Linux; U; Android 4.0.3; HTC Velocity 4G Build/IML74K)\" 58.1.7.171 - - [02/May/2012:15:09:47 +1000] ",
          "_request": "/test_folder/text.txt",
          "level": 7,
          "facility": "logstash-gelf",
          "_response": "200",
          "file": "//var/log/httpd/access_log",
          "created_at": 1.33593538833E9,
          "_verb": "GET",
          "streams": [],
          "_bytes": "10",
          "_httpversion": "1.1",
          "_ZONE": "+1000",
          "_auth": "-"
        }
      },
      ...
    ]
  },
  "facets": {
    "histo1": {
      "_type": "histogram",
      "entries": []
    }
  }
}

(Igor Motov) #6

Unfortunately, I am not familiar with graylog2, so hopefully somebody who
has experience with it can help here. But from the elasticsearch
perspective, it seems that the _timestamp field in your sample document is
clashing with elasticsearch's built-in _timestamp field. Moreover, it has
a date format that elasticsearch doesn't understand. So the first thing
you need to do is make sure your timestamp is indexed as a date in
elasticsearch. After that, you should be able to limit your results to the
last 2 minutes and retrieve all client IP addresses:

{
  "query": {
    "range": {
      "_timestamp": {
        "gte": "now-2m"
      }
    }
  },
  "facets": {
    "term1": {
      "terms": {
        "field": "_clientip"
      }
    }
  }
}

On Wednesday, May 2, 2012 3:11:07 AM UTC-4, user16 wrote:


(user16) #7

Hi, I have taken graylog2 out of the equation; I now have a standalone
elasticsearch running with logstash sending the logs, and I now have a
"timestamp" field as below:

@fields: {
  clientip: [ 101.1.2.106 ]
  ident: [ ]
  auth: [ ]
  timestamp: [ 03/May/2012:18:01:52 +1000 ]
  ZONE: [ +1000 ]
  verb: [ GET ]
  request: [ /folder/text.txt?x=1336032110254 ]
  httpversion: [ 1.1 ]
  response: [ 200 ]
  bytes: [ 10 ]
  referrer: [ http://somewebsite.net.au/test/folder.swf ]
  agent: [ "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)" ]
}
@timestamp: 2012-05-03T08:01:52.000Z
@source_host: PFIOLO1O31
@source_path: //var/log/httpd/access_log
@message: 101.1.2.106 - - [03/May/2012:18:01:52 +1000] "GET
/folder/text.txt?x=1336032110254 HTTP/1.1" 200 10
"http://somewebsite.net.au/test/folder.swf" "Mozilla/5.0 (compatible; MSIE
9.0; Windows NT 6.1; WOW64; Trident/5.0)"

But when I tried the POST query below:

{
  "query": {
    "range": {
      "timestamp": {
        "gte": "now-2m"
      }
    }
  },
  "facets": {
    "term1": {
      "terms": {
        "field": "clientip"
      }
    }
  }
}
I get:

{
  "took": 16,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "failed": 0
  },
  "hits": {
    "total": 0,
    "max_score": null,
    "hits": []
  },
  "facets": {
    "term1": {
      "_type": "terms",
      "missing": 0,
      "total": 0,
      "other": 0,
      "terms": []
    }
  }
}
I am guessing it is because of the timezone. Using the "head" plugin I can
see there are two fields, "timestamp" and "@timestamp". @timestamp shows
the hourly records, so it is being processed by elasticsearch properly
(but in GMT).
Thanks for your help.
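One common way to deal with the timezone in this setup (a sketch only,
using the old logstash 1.x date-filter syntax; "timestamp" is the grok
field shown above) is to let logstash's date filter parse the apache
timestamp, including its "+1000" offset, into @timestamp:

```
filter {
  # Parse the apache timestamp, e.g. "03/May/2012:18:01:52 +1000".
  # The "Z" in the pattern consumes the +1000 offset, so @timestamp
  # ends up as the correct UTC instant.
  date {
    timestamp => "dd/MMM/yyyy:HH:mm:ss Z"
  }
}
```

With @timestamp stored as a proper UTC date, a "now-2m" range query works
regardless of the server's local timezone.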


(Igor Motov) #8

Try replacing "timestamp" with "@timestamp" in the range query. If it still
doesn't work, post the output of curl localhost:9200/yourindex/_mapping
here.

On Thursday, May 3, 2012 7:37:41 AM UTC-4, user16 wrote:



(user16) #9

I tried @timestamp and it didn't work either.

My mapping is below:

{
  "logstash-2012.05.03": {
    "apache-access": {
      "properties": {
        "@fields": {
          "dynamic": "true",
          "properties": {
            "timestamp": { "type": "string" },
            "response": { "type": "string" },
            "httpversion": { "type": "string" },
            "referrer": { "type": "string" },
            "bytes": { "type": "string" },
            "verb": { "type": "string" },
            "ident": { "type": "string" },
            "clientip": { "type": "string" },
            "request": { "type": "string" },
            "agent": { "type": "string" },
            "ZONE": { "type": "string" },
            "auth": { "type": "string" }
          }
        },
        "@timestamp": { "format": "dateOptionalTime", "type": "date" },
        "@message": { "type": "string" },
        "@source": { "type": "string" },
        "@type": { "type": "string" },
        "@tags": { "type": "string" },
        "@source_host": { "type": "string" },
        "@source_path": { "type": "string" }
      }
    }
  }
}
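As an aside (a sketch, not something proposed in this thread): since
@fields.clientip is mapped as a plain string, it could also be mapped
explicitly with elasticsearch's ip field type in an index template, so
the address is stored as a single untokenized value and supports range
queries over IPs:

```json
{
  "apache-access": {
    "properties": {
      "@fields": {
        "properties": {
          "clientip": { "type": "ip" }
        }
      }
    }
  }
}
```

A mapping change like this only applies to newly created indices (e.g.
the next day's logstash index), not to data already indexed as strings.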

On Thursday, May 3, 2012 9:49:09 PM UTC+10, Igor Motov wrote:



(Igor Motov) #10

@timestamp should work. Maybe two minutes is not enough. Let's try
expanding it to one day:

curl 'localhost:9200/logstash-2012.05.03/apache-access/_search?pretty=true'
-d '{
"query": {
"range": {
"@timestamp": {
"gte": "now-1d"
}
}
},
"facets": {
"term1": {
"terms": {
"field": "@fields.clientip"
}
}
}
}'
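For reference, the request body above can also be built programmatically
before sending it; a minimal Python sketch (the field names simply mirror
the mapping discussed in this thread):

```python
import json

# Range query on @timestamp plus a terms facet on the client IP field,
# mirroring the curl request above.
query = {
    "query": {"range": {"@timestamp": {"gte": "now-1d"}}},
    "facets": {"term1": {"terms": {"field": "@fields.clientip"}}},
}

# Serialize to JSON for use as the request body.
body = json.dumps(query)
print(body)
```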

On Thursday, May 3, 2012 8:00:34 AM UTC-4, user16 wrote:

Tried @timestamp and it didn't work either.

My mapping is below:

{
  "logstash-2012.05.03": {
    "apache-access": {
      "properties": {
        "@fields": {
          "dynamic": "true",
          "properties": {
            "timestamp": {"type": "string"},
            "response": {"type": "string"},
            "httpversion": {"type": "string"},
            "referrer": {"type": "string"},
            "bytes": {"type": "string"},
            "verb": {"type": "string"},
            "ident": {"type": "string"},
            "clientip": {"type": "string"},
            "request": {"type": "string"},
            "agent": {"type": "string"},
            "ZONE": {"type": "string"},
            "auth": {"type": "string"}
          }
        },
        "@timestamp": {"format": "dateOptionalTime", "type": "date"},
        "@message": {"type": "string"},
        "@source": {"type": "string"},
        "@type": {"type": "string"},
        "@tags": {"type": "string"},
        "@source_host": {"type": "string"},
        "@source_path": {"type": "string"}
      }
    }
  }
}



(user16) #11

Hmm.. I get the error below if I try that query:

{
"error" : "SearchPhaseExecutionException[Failed to execute phase [query],
total failure; shardFailures
{[skrcTvObR1-EEH8JTETjIQ][logstash-2012.05.04][1]:
SearchParseException[[logstash-2012.05.04][1]: from[-1],size[-1]: Parse
Failure [Failed to parse source [{ "query": { "range": {
"@timestamp": { "gte": "now-2d" } } }, "facets": { "term1": {
"terms": { "field": "@fields.clientip" } } }}]] ..."
}


(Igor Motov) #12

How do you run it?



(user16) #13

Hi, I ran it from a console on the host where elasticsearch is running,
and also using the "Dev HTTP Client" app in Chrome.



(Igor Motov) #14

Are you running it on Windows? Try saving the query into a text file
request.txt:

{
"query": {
"range": {
"@timestamp": {
"gte": "now-1d"
}
}
},
"facets": {
"term1": {
"terms": {
"field": "@fields.clientip"
}
}
}
}

and then run it like this:

curl "localhost:9200/logstash-2012.05.03/apache-access/_search?pretty=true"
-d @request.txt

You might also need to increase -1d to -2d since your requests in this
index are now at least a day old.



(user16) #15

I am running Ubuntu, and running it as suggested gives me the same error:

500 Internal Server Error, with the description below.

{
"error" : "SearchPhaseExecutionException[Failed to execute phase [query],
total failure; shardFailures
{[evarIQDhQu-Jj8DapZc8QA][logstash-2012.05.03][1]:
SearchParseException[[logstash-2012.05.03][1]: from[-1],size[-1]: Parse
Failure [Failed to parse source [{ "query": { "range": {
"@timestamp": { "gte": "now-1d" } } }, "facets":
{ "term1": { "terms": { "field": "@fields.clientip"
} } }}]]]; nested: MapperParsingException[failed to parse date
field [now-1d], tried both date format [dateOptionalTime], and timestamp
number]; nested: IllegalArgumentException[Invalid format: "now-1d"];
}{[evarIQDhQu-Jj8DapZc8QA][logstash-2012.05.03][4]:
SearchParseException[[logstash-2012.05.03][4]: from[-1],size[-1]: Parse
Failure [Failed to parse source [{ "query": { "range": {
"@timestamp": { "gte": "now-1d" } } }, "facets":
{ "term1": { "terms": { "field": "@fields.clientip"
} } }}]]]; nested: MapperParsingException[failed to parse date
field [now-1d], tried both date format [dateOptionalTime], and timestamp
number]; nested: IllegalArgumentException[Invalid format: "now-1d"]; }]",
"status" : 500
}



(user16) #17

I am running Ubuntu, and running it as suggested gave me the same error:

500 Internal Server Error, with the description below. Really appreciate
your help.
Thanks

curl "localhost:9200/logstash-2012.05.04/apache-access/_search?pretty=true"
-d @request.txt
{
"error" : "SearchPhaseExecutionException[Failed to execute phase [query],
total failure; shardFailures
{[evarIQDhQu-Jj8DapZc8QA][logstash-2012.05.04][2]:
SearchParseException[[logstash-2012.05.04][2]: from[-1],size[-1]: Parse
Failure [Failed to parse source [{ "query": { "range": {
"@timestamp": { "gte": "now-2d" } } }, "facets":
{ "term1": { "terms": { "field": "@fields.clientip"
} } }}]]]; nested: MapperParsingException[failed to parse date
field [now-2d], tried both date format [dateOptionalTime], and timestamp
number]; nested: IllegalArgumentException[Invalid format: "now-2d"];
}{[evarIQDhQu-Jj8DapZc8QA][logstash-2012.05.04][1]:
SearchParseException[[logstash-2012.05.04][1]: from[-1],size[-1]: Parse
Failure [Failed to parse source [{ "query": { "range": {
"@timestamp": { "gte": "now-2d" } } }, "facets":
{ "term1": { "terms": { "field": "@fields.clientip"
} } }}]]]; nested: MapperParsingException[failed to parse date
field [now-2d], tried both date format [dateOptionalTime], and timestamp
number]; nested: IllegalArgumentException[Invalid format: "now-2d"]; }]",
"status" : 500
}



(Igor Motov) #18

Which version of elasticsearch are you using? I think date math was added
around 0.19.0, so your version might not support it. Try specifying an
actual date:
.....
"@timestamp": {
"gte": "2012-05-03T00:00:00.000Z"
}
.....
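On versions without date math, the "two minutes ago" cutoff can be computed
client-side and substituted as a literal date, as suggested above. A minimal
Python sketch (the function name is my own):

```python
from datetime import datetime, timedelta, timezone

def two_minutes_ago_iso(now=None):
    """Return an explicit UTC timestamp two minutes back, formatted like
    the dateOptionalTime @timestamp values, as a stand-in for "now-2m"
    on pre-0.19 elasticsearch."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(minutes=2)
    return cutoff.strftime("%Y-%m-%dT%H:%M:%S.000Z")

# Example with a fixed "now" so the output is deterministic:
print(two_minutes_ago_iso(datetime(2012, 5, 3, 8, 1, 52, tzinfo=timezone.utc)))
# 2012-05-03T07:59:52.000Z
```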



(user16) #19

Ahhhhh.. thanks, I am using 0.18.7, and

"@timestamp": {
"gte": "2012-05-03T00:00:00.000Z"
}

gave me the client IPs and their "count". Now, is there any way to get
just the unique client IPs from the last 2 minutes?

Thanks



(Igor Motov) #20

Yeah, this should give you up to 10000 top unique IPs in the last two
minutes:

{
"query": {
"range": {
"@timestamp": {
"gte": "... two minutes ago ..."
}
}
},
"size":0,
"facets": {
"term1": {
"terms": {
"field": "@fields.clientip",
"size": 10000
}
}
}
}
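Once that query returns, the unique IPs are just the facet's term values.
A minimal Python sketch of extracting them, using a made-up response of the
shape shown earlier in this thread:

```python
# Hypothetical terms-facet response; the real one comes back from _search.
sample_response = {
    "facets": {
        "term1": {
            "_type": "terms",
            "terms": [
                {"term": "60.0.0.181", "count": 2},
                {"term": "124.0.0.83", "count": 2},
            ],
        }
    }
}

# Each facet entry is one unique client IP with its hit count.
unique_ips = [t["term"] for t in sample_response["facets"]["term1"]["terms"]]
print(unique_ips)  # ['60.0.0.181', '124.0.0.83']
```

These IPs can then be run through the geoip lookup and plotted on the map.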
