Logstash only wants to target ONE file?

Hi guys, so I'm trying to send some dummy data to Elasticsearch via Logstash. I can send Kibana's logs (kibana.log), but ONLY those Kibana logs for some reason! Whenever I change the "path" in my config to another file, Logstash gets to "Pipeline main started" and freezes there, never actually sending any data. Here's my config...
```

input
{
	file
	{
		path           => "/tmp/dummyData.log"
		start_position => "beginning"
	}
}

output
{
  elasticsearch
  {
	codec           => json
	hosts           => "ulvelkd08.devfg.rbc.com:9200" # it used to be "host" and "port" pre-2.0
	index           => "json"
  }

	stdout { codec => rubydebug }
}

```

/tmp/dummyData.log is just a simple one-line JSON object with 3 key-value pairs I made up. I've already validated that the JSON formatting is correct, so that's not the issue. Here's what I see when I run the following command...
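For reference, the file contents look something like this (the keys and values here are just placeholders, not my actual data):

```
{"service": "dummy", "level": "info", "message": "hello from dummyData"}
```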

```
[root@ulvelkd08 ~]# /opt/logstash/bin/logstash --verbose -f /opt/elk/etc/logstash/new-indexer.conf
starting agent {:level=>:info}
starting pipeline {:id=>"main", :level=>:info}
Settings: Default pipeline workers: 8
Registering file input {:path=>["/tmp/dummyData.log"], :level=>:info}
No sincedb_path set, generating one based on the file path {:sincedb_path=>"/root/.sincedb_929b1cdd87304084b0d37bb799dd2f15", :path=>["/tmp/dummyData.log"], :level=>:info}
Using mapping template from {:path=>nil, :level=>:info}
Attempting to install template {:manage_template=>{"template"=>"logstash-*", "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "omit_norms"=>true}, "dynamic_templates"=>[{"message_field"=>{"match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fielddata"=>{"format"=>"disabled"}}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"string", "index"=>"analyzed", "omit_norms"=>true, "fielddata"=>{"format"=>"disabled"}, "fields"=>{"raw"=>{"type"=>"string", "index"=>"not_analyzed", "doc_values"=>true, "ignore_above"=>256}}}}}, {"float_fields"=>{"match"=>"*", "match_mapping_type"=>"float", "mapping"=>{"type"=>"float", "doc_values"=>true}}}, {"double_fields"=>{"match"=>"*", "match_mapping_type"=>"double", "mapping"=>{"type"=>"double", "doc_values"=>true}}}, {"byte_fields"=>{"match"=>"*", "match_mapping_type"=>"byte", "mapping"=>{"type"=>"byte", "doc_values"=>true}}}, {"short_fields"=>{"match"=>"*", "match_mapping_type"=>"short", "mapping"=>{"type"=>"short", "doc_values"=>true}}}, {"integer_fields"=>{"match"=>"*", "match_mapping_type"=>"integer", "mapping"=>{"type"=>"integer", "doc_values"=>true}}}, {"long_fields"=>{"match"=>"*", "match_mapping_type"=>"long", "mapping"=>{"type"=>"long", "doc_values"=>true}}}, {"date_fields"=>{"match"=>"*", "match_mapping_type"=>"date", "mapping"=>{"type"=>"date", "doc_values"=>true}}}, {"geo_point_fields"=>{"match"=>"*", "match_mapping_type"=>"geo_point", "mapping"=>{"type"=>"geo_point", "doc_values"=>true}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "doc_values"=>true}, "@version"=>{"type"=>"string", "index"=>"not_analyzed", "doc_values"=>true}, "geoip"=>{"type"=>"object", "dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip", "doc_values"=>true}, "location"=>{"type"=>"geo_point", "doc_values"=>true}, "latitude"=>{"type"=>"float", "doc_values"=>true}, "longitude"=>{"type"=>"float", "doc_values"=>true}}}}}}}, :level=>:info}
New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["ulvelkd08.devfg.rbc.com:9200"], :level=>:info}
Starting pipeline {:id=>"main", :pipeline_workers=>8, :batch_size=>125, :batch_delay=>5, :max_inflight=>1000, :level=>:info}
Pipeline main started
```

Usually after "Pipeline main started" I'll see a bunch of data fly up as it gets indexed into Elasticsearch (and shows up in Kibana), but ONLY if I use my standard kibana.log file! No other file that I create works!
I've already gone through and deleted all the .sincedb_* files I can find, and even tried setting my own sincedb path, but nothing helps.
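When I tried setting my own sincedb path, the input looked roughly like this (the path itself is just an example, not the exact one I used):

```
input
{
	file
	{
		path           => "/tmp/dummyData.log"
		start_position => "beginning"
		sincedb_path   => "/tmp/dummyData.sincedb"  # example path; any writable location
	}
}
```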

Any ideas?

If the input file is older than 24 hours, make sure you adjust the file input's `ignore_older` option accordingly.
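Something along these lines should do it (a rough sketch; the value is in seconds, and 86400, i.e. 24 hours, is the default, so pick anything larger than your file's age):

```
input
{
	file
	{
		path           => "/tmp/dummyData.log"
		start_position => "beginning"
		ignore_older   => 604800  # one week, in seconds; default is 86400 (24 hours)
	}
}
```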