Twitter input not working on Debian build on a Google Compute Engine node

Bit of a strange one here.

I'm trying to get the twitter input working with Logstash 6.5.4 running on Debian on a Google Compute Engine node. Everything seems to start OK, but Logstash never actually collects any tweets and never creates the index. I've got firewall logging turned on, and Logstash never even tries to connect to https://stream.twitter.com (or any HTTPS address).

From the command line I can "wget" the Twitter URL and that works, so I know the firewall is open. If I run exactly the same Logstash config on my Windows workstation, it works just fine.

At this point I've stripped everything out and am just using the stdout output, but still nothing is happening.

input {

  twitter { 
    consumer_key       => "xxxx"
    consumer_secret    => "xxxx"
    oauth_token        => "xxxx"
    oauth_token_secret => "xxxx"
    ignore_retweets    => true
    full_tweet         => false
    keywords           => [ "trump" ]
  }

}

filter {
}

output {
  stdout {
  }
}

I've tried increasing the memory for Logstash and various other tweaks, but it seems Logstash isn't activating the twitter plugin at all. Even if I put in nonsense auth keys, it doesn't complain.
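One thing I haven't tried yet is taking the keyword tracking out of the picture. If I'm reading the plugin docs correctly there's a use_samplestream option, so a variant like the sketch below (credentials elided as before) should pull the public sample stream with no keyword filter at all, although given that nothing ever shows up at the firewall I'd expect it to behave the same way.

input {
  twitter {
    consumer_key       => "xxxx"
    consumer_secret    => "xxxx"
    oauth_token        => "xxxx"
    oauth_token_secret => "xxxx"
    use_samplestream   => true
  }
}

output {
  stdout {
  }
}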

The logfile just shows this once it's up and running:

[2019-01-04T16:12:31,630][INFO ][logstash.inputs.twitter  ] Starting twitter tracking {:track=>"trump"}
[2019-01-04T16:12:31,703][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-01-04T16:12:31,857][DEBUG][logstash.agent           ] Starting puma
[2019-01-04T16:12:31,908][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2019-01-04T16:12:32,003][DEBUG][logstash.api.service     ] [api-service] start
[2019-01-04T16:12:32,288][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2019-01-04T16:12:36,016][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-01-04T16:12:36,017][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-01-04T16:12:36,621][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x2c3aa455 sleep>"}
[2019-01-04T16:12:41,027][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-01-04T16:12:41,028][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-01-04T16:12:41,624][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x2c3aa455 sleep>"}
[2019-01-04T16:12:46,037][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-01-04T16:12:46,039][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}

If I use the "http_poller" input and point it to any external web site, that works fine.
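For reference, the control test was roughly the pipeline below; the URL and polling interval here are just placeholders. That one connects out and prints responses without any trouble.

input {
  http_poller {
    urls => {
      connectivity_test => "https://www.example.com"
    }
    schedule => { every => "30s" }
  }
}

output {
  stdout {
  }
}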

So, any ideas what could be wrong here? I'm all out of them at the moment.
