Multiple Files for Logstash - URGENT HELP Please

Hi everyone, so I loaded my first *.csv into Logstash and got it to come up in Kibana, but I have about 50 more files, and more over time, that I will need to upload. So I created a new config file called config_2.config, and when I run it I get an error saying this:

bin/logstash -f /home/vault/Wulf/data/configs/logstash_2.config
Sending Logstash's logs to /home/vault/Wulf/logstash/logs which is now configured via log4j2.properties
[2018-01-06T14:07:06,207][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/home/vault/Wulf/logstash/modules/fb_apache/configuration"}
[2018-01-06T14:07:06,254][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/home/vault/Wulf/logstash/modules/netflow/configuration"}
[2018-01-06T14:07:06,646][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-01-06T14:07:06,668][FATAL][logstash.runner          ] Logstash could not be started because there is already another instance using the configured data directory.  If you wish to run multiple instances, you must change the "path.data" setting.

This has me very confused.

Any help would be great so that I can keep uploading new files as I get them.

Here is my config file:

input {
  file {
    path => "/home/vault/Wulf/data/a/path/to/file.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => [ "xxxxx", "xxxxxxx" ]
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "xxxxx"
    document_type => "xxxxxxx"
  }
  stdout {}
}

I figured it out. I created a new folder called Logstash2, copied everything from the logstash directory into it, and ran it from there. It worked. Did I do that correctly?
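(Side note: the FATAL message above is about the "path.data" setting, so rather than copying the whole Logstash directory, it should be possible to point a second instance at its own data directory with the --path.data flag. The data2 directory name below is only an example, not something from my setup.)

bin/logstash -f /home/vault/Wulf/data/configs/logstash_2.config --path.data /home/vault/Wulf/logstash/data2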

Why run multiple instances of Logstash when you can either have multiple file inputs in the same file or multiple filename patterns in the same file input?
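For example, a single config could combine both approaches, roughly like this (the directory names below are only placeholders, not paths from your machine):

input {
  # one file input can take a list of filename patterns
  file {
    path => ["/home/vault/Wulf/data/batch1/*.csv", "/home/vault/Wulf/data/batch2/*.csv"]
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
  # or simply add another file input next to the first one
  file {
    path => "/home/vault/Wulf/data/txt/*.txt"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}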

Good morning Magnus.

That was the only way I could figure it out last night. My problem is that I have a lot of CSV and TXT files I need to add, and I can't do it all at once, so I will need to do it over time. Today, for example, I plan on loading 4 CSV files, but tomorrow I have about 8 more I will need to load, and so on.

How can I continue to add to the same index? I have one index named Master and I want to use the same config to keep loading data into it, but when I try that I get the same error as above. Can you please tell me what I am doing wrong?

I am looking to do that, and also to use multiple filename patterns: TXT, CSV, Excel, and so on.

That was the only way I could figure it out last night. My problem is that I have a lot of CSV and TXT files I need to add, and I can't do it all at once, so I will need to do it over time. Today, for example, I plan on loading 4 CSV files, but tomorrow I have about 8 more I will need to load, and so on.

In your file input, use a filename pattern like *.csv so that Logstash automatically picks up all CSV files in a particular directory. When you have more files to process, just drop them into the directory that Logstash monitors.
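A minimal sketch of that, assuming a drop directory you create yourself (the name here is just an example):

input {
  file {
    # any *.csv file dropped into this directory will be picked up automatically
    path => "/home/vault/Wulf/data/drop/*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

One thing to keep in mind: with sincedb_path => "/dev/null", as in your original config, Logstash does not remember what it has already read across restarts, so files still sitting in that directory may be indexed again after a restart.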

How can I continue to add to the same index? I have one index named Master and I want to use the same config to keep loading data into it, but when I try that I get the same error as above. Can you please tell me what I am doing wrong?

Don't run multiple Logstash processes and that problem will go away.

Got it, thanks Magnus for all the help. That makes more sense now. I will delete all the other copies. What if I have to create a new index for, say, text files?

For example, I have an index called CSV and I want to create a new one called Text. Do I need to create a new config file and point it to that data path?

What you'll probably find least confusing is adding another file input to the existing configuration file. If you want events from different files to be treated differently, you should look into using conditionals. Examples of that have been posted and discussed here countless times before.
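A rough sketch of that approach, assuming each input is tagged with a type; the paths and column names below are placeholders, and the index names are lowercased because Elasticsearch requires lowercase index names ("csv" and "text" rather than "CSV" and "Text"):

input {
  file {
    path => "/home/vault/Wulf/data/csv/*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    type => "csv"
  }
  file {
    path => "/home/vault/Wulf/data/txt/*.txt"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    type => "txt"
  }
}
filter {
  # only run the csv filter on events that came from the CSV input
  if [type] == "csv" {
    csv {
      separator => ","
      columns => [ "col1", "col2" ]
    }
  }
}
output {
  # route each type to its own index
  if [type] == "csv" {
    elasticsearch {
      hosts => "localhost:9200"
      index => "csv"
    }
  } else {
    elasticsearch {
      hosts => "localhost:9200"
      index => "text"
    }
  }
}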
