I have a question about Logstash while importing CSV files.
Step 1: Import 10 CSV files (pr_*).
Ex:
input {
  file {
    path => "D:/ELK-Sample-CSV/pr/pr_*.csv"
    start_position => "beginning"
    sincedb_path => "D:/www/logstash-6.2.3/data/plugins/inputs/file/.sincedb_*"
  }
}
When importing data for the first time, can I set start_position to 'end'?
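For reference, my reading of the file input docs (not tested) is that start_position only applies the first time a file is seen, i.e. when it has no sincedb entry yet:

input {
  file {
    path => "D:/ELK-Sample-CSV/pr/pr_*.csv"
    # "beginning": read a newly discovered file from its first line
    # "end": treat a newly discovered file like a live log and only emit
    #        lines appended after discovery
    # on restart, files already in the sincedb resume from their saved offset
    start_position => "end"
  }
}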
Step 2: If I suddenly stop Logstash after some amount of data has been imported, I change start_position to 'end' and restart:
Ex:
input {
  file {
    path => "D:/ELK-Sample-CSV/pr/pr_*.csv"
    start_position => "end"
    sincedb_path => "D:/www/logstash-6.2.3/data/plugins/inputs/file/.sincedb_*"
  }
}
- The 10 CSV files contain 100k records in total.
- The first run imported 15k records.
- Now I have to import the unprocessed lines, starting from record 15,001 (same 10 CSV files).
Instead of resuming, I got this log output:
_globbed_files: D:/ELK-Sample-CSV/pr/pr_*.csv: glob is: ["D:/ELK-Sample-CSV/pr/pr_1_AK.csv", "D:/ELK-Sample-CSV/pr/pr_1_DE.csv"]
_discover_file: D:/ELK-Sample-CSV/pr/pr_*.csv: new: D:/ELK-Sample-CSV/pr/pr_1_AK.csv (exclude is [])
_discover_file: D:/ELK-Sample-CSV/pr/pr_*.csv: new: D:/ELK-Sample-CSV/pr/pr_1_DE.csv (exclude is [])
_open_file: D:/ELK-Sample-CSV/pr/pr_1_AK.csv: opening
D:/ELK-Sample-CSV/pr/pr_1_AK.csv: initial create, no sincedb, seeking to end 28263215
_open_file: D:/ELK-Sample-CSV/pr/pr_1_DE.csv: opening
D:/ELK-Sample-CSV/pr/pr_1_DE.csv: initial create, no sincedb, seeking to end 105196756
each: file grew: D:/ELK-Sample-CSV/pr/pr_1_AK.csv: old size 0, new size 28263215
each: file grew: D:/ELK-Sample-CSV/pr/pr_1_DE.csv: old size 0, new size 105196756
each: file grew: D:/ELK-Sample-CSV/pr/pr_1_AK.csv: old size 0, new size 28263215
each: file grew: D:/ELK-Sample-CSV/pr/pr_1_DE.csv: old size 0, new size 105196756
each: file grew: D:/ELK-Sample-CSV/pr/pr_1_AK.csv: old size 0, new size 28263215
each: file grew: D:/ELK-Sample-CSV/pr/pr_1_DE.csv: old size 0, new size 105196756
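From the "initial create, no sincedb, seeking to end" lines it looks like no sincedb entry was found for either file, so start_position => "end" skipped everything. My guess (not yet verified) is that sincedb_path has to point at one concrete file rather than a glob, since Windows file names cannot contain '*'. This is the config I plan to retry with; the sincedb file name is just my own choice:

input {
  file {
    path => "D:/ELK-Sample-CSV/pr/pr_*.csv"
    # keep "beginning": it only applies to files with no sincedb entry,
    # so already-read portions should still be skipped via the saved offsets
    start_position => "beginning"
    # one concrete sincedb file (no '*'); the name is my assumption
    sincedb_path => "D:/www/logstash-6.2.3/data/plugins/inputs/file/sincedb_pr"
  }
}

Is this the right way to resume from record 15,001 after a restart?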