Hello,
I have no experience with Elasticsearch, but I would like to set up an environment where I can search through some large CSV files. The import with Logstash works, but I have two issues:
- the timestamp scale uses the time each record was inserted into Elasticsearch, even though I have a field named date that holds the decoded TimeStamp for each record from the file
- each bar is capped at 8,000 records, even though in some cases I have many more
My Logstash configuration is below:
input {
  file {
    path => "/mnt/xxxxxxxxxxxxx/*.txt"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => [ "xx1","xx2","xx3","xx4","Responsexx","xx5","TimeStamp","Username","SourceIP" ]
  }
  date {
    # parse the raw TimeStamp column (yyyyMMddHHmmss) into a field named "date"
    match => [ "TimeStamp", "yyyyMMddHHmmss" ]
    target => "date"
    locale => "en"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "xxxx"
  }
}
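
For the first issue, I suspect the cause is that my date filter writes the parsed value into a separate field called date, while the time axis seems to be built on @timestamp, which keeps the ingestion time. A minimal variant I am considering, assuming the histogram really is driven by @timestamp, would be to replace the date block above with:

  date {
    # without an explicit target, the parsed value overwrites @timestamp,
    # so the time axis reflects the record's own TimeStamp, not the insertion time
    match => [ "TimeStamp", "yyyyMMddHHmmss" ]
    locale => "en"
  }

Is that the right approach, or is there a way to point the time axis at my date field instead?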
Here is an image of how it looks:
Can you help me solve these issues?
Best Regards,
Mihai Radulescu