Polling with logstash using http_poller

Hi friends,
I'm trying to poll the JSON log stream from this site:
http://stream.meetup.com/2/rsvps

input {
  http_poller {
    urls => {
      myurl => "http://stream.meetup.com/2/rsvps"
    }
    interval => 5
  }
}

but the JSON documents never show up in Elasticsearch.
Where am I going wrong?

  • Are you getting anything from Logstash if you comment out the elasticsearch output and just use a stdout { codec => rubydebug } output?
  • Are you getting any error messages in the Logstash log?
  • How can you be sure nothing's in Elasticsearch? Do you have X-Pack enabled?
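To test the first point, a stripped-down pipeline with no filters and only a stdout output is enough (this sketch reuses the URL and interval from the poster's config; interval is in seconds):

```
input {
  http_poller {
    urls => {
      myurl => "http://stream.meetup.com/2/rsvps"
    }
    interval => 5
  }
}

output {
  # Print every event to the console in a readable form
  stdout { codec => rubydebug }
}
```

If nothing at all is printed, the problem is in the input stage, not in the filters or the Elasticsearch output.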

Hi,
I'm using this configuration:

input {
  http_poller {
    urls => {
      myurl => "http://stream.meetup.com/2/rsvps"
    }
    interval => 5
  }
}


filter {
  grok {
    match => ["message", "(?<meetup>{\".*\".*})"]
    break_on_match => true
    add_field => { "type" => "log" }
  }
  if "_grokparsefailure" in [tags] {
    drop {}
  } else {
    json {
      source => "meetup"
    }
    mutate {
      add_field => { "[location]" => "%{[venue][lat]},%{[venue][lon]}" }
    }
    kv {
      source => "location"
      target => "location"
      field_split => ","
    }
    mutate {
      convert => { "[location][lat]" => "float" }
      convert => { "[location][lon]" => "float" }
    }
  }
}



output {
  elasticsearch {
    index => "meetup1-%{+YYYY.MM.dd}"
    document_type => "log"
    manage_template => true
    template_name => "test"
    template => "C:\Users\Lock\Desktop\meetup_polling\template1.json"
    template_overwrite => true
    codec => json
  }
  stdout { codec => rubydebug }
}

I don't have X-Pack enabled.

On the terminal I see this:

Perhaps your grok filter is failing and you're dropping all events?

Either way, debug your configuration systematically by starting small and adding complexity once you've verified that what you have is working as expected.
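As an intermediate step, you could keep the poller input and the grok filter but remove the drop and send everything to stdout, so you can see whether events are arriving and whether they carry the _grokparsefailure tag (this sketch reuses the URL and grok pattern from your config):

```
input {
  http_poller {
    urls => {
      myurl => "http://stream.meetup.com/2/rsvps"
    }
    interval => 5
  }
}

filter {
  grok {
    match => ["message", "(?<meetup>{\".*\".*})"]
  }
}

output {
  # Inspect the raw events and their tags before adding
  # drop {}, json {}, mutate {}, or the elasticsearch output
  stdout { codec => rubydebug }
}
```

Only once events appear here without the _grokparsefailure tag is it worth re-adding the rest of the filter chain and the elasticsearch output.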

The grok filter usually works properly.
I only changed the input from file {} (reading the JSON data from Meetup) to http_poller {}.
The data are identical.
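One way to verify that claim is to run the same filter chain against a single saved RSVP line pasted on stdin, taking the input plugin out of the picture entirely (a sketch, reusing the poster's grok pattern):

```
input {
  # Paste one JSON line from the saved Meetup file into the console
  stdin {}
}

filter {
  grok {
    match => ["message", "(?<meetup>{\".*\".*})"]
  }
}

output {
  stdout { codec => rubydebug }
}
```

If the same line parses here but fails when it comes through http_poller, the difference is in how the two inputs deliver the message field, not in the grok pattern.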

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.