Not able to import data into Elasticsearch through Logstash

I am using Logstash to import data into Elasticsearch. I am using a .conf file to load the data into an index. Every time I run the .conf file I get the following messages repeatedly and the index DOES NOT get created.
[DEBUG] 2021-05-02 22:56:26.174 [[main]<mongodb] mongodb - collection_data is: {"institutions_metadeta"=>{:name=>"institutions_metadeta", :last_id=>"608ea384a1441c4d77453a12"}}
[DEBUG] 2021-05-02 22:56:26.199 [[main]<mongodb] mongodb - Updating watch collections
[DEBUG] 2021-05-02 22:56:26.279 [[main]<mongodb] mongodb - Added institutions_metadeta to the collection list as it matches our collection search
[DEBUG] 2021-05-02 22:56:26.283 [[main]<mongodb] mongodb - since table already exists
[DEBUG] 2021-05-02 22:56:26.290 [[main]<mongodb] mongodb - placeholder already exists, it is {:table=>"logstash_since_institutions_metadeta", :place=>"608ea384a1441c4d77453a12"}
[DEBUG] 2021-05-02 22:56:26.291 [[main]<mongodb] mongodb - No new rows. Sleeping. {:time=>5}
[DEBUG] 2021-05-02 22:56:27.376 [pool-3-thread-1] jvm - collector name {:name=>"ParNew"}
[DEBUG] 2021-05-02 22:56:27.377 [pool-3-thread-1] jvm - collector name {:name=>"ConcurrentMarkSweep"}
[DEBUG] 2021-05-02 22:56:29.023 [logstash-pipeline-flush] PeriodicFlush - Pushing flush onto pipeline.

Welcome to our community! :smiley:

That suggests it's not seeing any new rows because you've run this config before: the plugin's placeholder database already stores the last _id it processed, which is why the log says "placeholder already exists".
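If you want to re-import the collection from the start, one option is to point the plugin at a fresh placeholder db so the stored last_id is forgotten. A minimal sketch, assuming the plugin's placeholder_db_* options (the db file name here is illustrative):

input {
  mongodb {
    uri => "<connection_string>"
    placeholder_db_dir => "/opt/logstash/"
    # illustrative name: a new placeholder db has no stored last_id,
    # so the plugin re-reads the collection from the beginning
    placeholder_db_name => "fresh_logstash_sqlite.db"
    collection => "institutions_metadeta"
  }
}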

Can you post your config?

Thanks for responding. I have figured out the issue.

But now I am running into another issue. I am importing data from a collection named "nad_partner_api_dump" in MongoDB into an index. But since the source DB has two more collections named "nad_partner_api_dump_09_april_21" and "nad_partner_api_dump_backup_March_17_2021", data from those collections is also getting imported automatically. I want data to be imported only from nad_partner_api_dump. Please help.

The config file is as follows:

input {
  mongodb {
    uri => "<connection_string>"
    placeholder_db_dir => "/opt/logstash/"
    placeholder_db_name => "ufi_logstash_sqlite.db"
    collection => "upload_file_info"
    batch_size => 200
  }
}

filter {
  mutate {
    rename => { "_id" => "mongo_id" }
  }
}

output {
  elasticsearch {
    hosts => ["http://127.0.0.1:9200"]
    action => "index"
    index => "upload_file_info"
  }
}

According to the code, the collection option is "The collection to use. Is turned into a regex so 'events' will match 'events_20150227'".

I have no idea if it will work, but you could try collection => "upload_file_info$"
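For example, a minimal sketch of the same input block with the anchored pattern (all other options unchanged from your config):

input {
  mongodb {
    uri => "<connection_string>"
    placeholder_db_dir => "/opt/logstash/"
    placeholder_db_name => "ufi_logstash_sqlite.db"
    # the trailing "$" anchors the regex at the end of the name, so
    # "upload_file_info" no longer matches the suffixed backup collections
    collection => "upload_file_info$"
    batch_size => 200
  }
}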

Thanks, that worked. But now I am facing another issue. I first imported data from the upload_file_info collection, and then imported data from the institutions collection into a separate index. The moment I started importing the institutions collection, the same data also got inserted into the upload_file_info index automatically. The conf file is similar to the one I shared before. Please help.

You may be hitting this.
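One common cause: if both .conf files are loaded into the same pipeline (for example, by pointing -f or path.config at a directory), Logstash concatenates them, so every event flows through every filter and output. A sketch of one way to keep the imports apart, using the standard tags input option and a conditional output (the collection and index names here are illustrative):

input {
  mongodb {
    uri => "<connection_string>"
    placeholder_db_dir => "/opt/logstash/"
    placeholder_db_name => "inst_logstash_sqlite.db"
    collection => "institutions$"
    # tag events from this input so the output below can route them
    tags => ["institutions"]
  }
}

output {
  # only events tagged by the input above are sent to this index
  if "institutions" in [tags] {
    elasticsearch {
      hosts => ["http://127.0.0.1:9200"]
      index => "institutions"
    }
  }
}

Running each .conf file as its own pipeline (via pipelines.yml) avoids the problem entirely, since events then never share filters or outputs.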

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.