I have two servers: a log server and a Logstash server.
I'm using only Logstash 6.8 and Elasticsearch (without Filebeat).
A cron job on the log server ships the log files to the Logstash server, and on the Logstash server I run multiple pipelines.
For this log I'm using the following config:
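To illustrate the transfer step, this is roughly what the cron entry looks like (the schedule, source path, and host name here are placeholders, not my real values):

```shell
# Hypothetical sketch of the daily transfer: push the day's export
# from the log server to the Logstash server's watched directory.
0 1 * * * rsync -a /var/log/bcp/bcp_usage_home_lte_$(date +\%Y\%m\%d).txt logstash-host:/home/xxxxxx/bcp/
```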
input {
  file {
    path => ["/home/xxxxxx/bcp/bcp_usage_home_lte_*.txt"]
    start_position => "beginning"
    sincedb_path => "/etc/logstash/dbpath/usagestatistic.db"
  }
}
filter {
  grok {
    match => { "message" => "(?<DATE>[0-9]+)\|(?<MSISDN>(.)+)\|(?<APP>(.)+)\|(?<CATEGORY>(.)+)\|(?<DEVICE>(.)+)\|(?<VENDOR>(.)+)\|(?<APN>(.)+)\|(?<TETHER>[0-9]+)\|(?<QUOTA>[0-9]+)" }
    remove_field => ["message"]
  }
  mutate {
    convert => {
      "QUOTA" => "integer"
      "TETHER" => "integer"
    }
  }
}
output {
  elasticsearch {
    hosts => ["1.2.3.4:3306"]
    user => "user"
    password => "xxxxxxxx"
    ssl => true
    ssl_certificate_verification => false
    index => "usagestatistic"
  }
  stdout { codec => rubydebug }
}
And this is my pipelines.yml:
# This file is where you define your pipelines. You can define multiple.
# For more information on multiple pipelines, see the documentation:
# https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html
- pipeline.id: usagestatistic
  path.config: "/etc/logstash/conf.d/usagestatistic.conf"
  queue.type: persisted
  pipeline.workers: 2
- pipeline.id: customer_data-activity
  path.config: "/home/xxxxxx/customer-data/activity.conf"
  pipeline.workers: 1
  #queue.type: persisted
- pipeline.id: customer_data-delivery
  path.config: "/home/xxxxxx/customer-data/delivery.conf"
  pipeline.workers: 1
  queue.type: persisted
- pipeline.id: customer_data-profile
  path.config: "/home/xxxxxx/customer-data/profile.conf"
  pipeline.workers: 1
  queue.type: persisted
- pipeline.id: customer_data-payment
  path.config: "/home/xxxxxx/customer-data/payment.conf"
  pipeline.workers: 1
  queue.type: persisted
- pipeline.id: customer_data-order_fullfillment
  path.config: "/home/xxxxxx/customer-data/order_fulfillment.conf"
  pipeline.workers: 1
  queue.type: persisted
- pipeline.id: customer_data-order_fulfillment_history
  path.config: "/home/xxxxxx/customer-data/order_fulfillment_history.conf"
  pipeline.workers: 1
  queue.type: persisted
When I check the Logstash log I don't see any errors, so I don't think the problem is in the config files:
[2020-08-17T14:57:53,751][DEBUG][filewatch.tailmode.handlers.grow] read_to_eof: get chunk
[2020-08-17T14:57:53,752][DEBUG][logstash.inputs.file ] Received line {:path=>"/home/xxxxxx/bcp/bcp_usage_home_lte_20200815.txt", :text=>"20200815|628123456789|BranchMetrics|DeviceServices|unclassified|unclassified|home|1|142705"
logstash-plain.log shows that the last line of bcp_usage_home_lte_20200815 was read, but nothing is pushed to Elasticsearch.
When I kill Logstash and start it again, it runs fine and the output reaches ES. But the next day, when I check the next log file, Logstash receives it and then stops after some number of lines. I tried restarting Logstash and the same thing happened again.
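For reference, I also looked at the sincedb file from the config above, since it records how far Logstash thinks it has read each file. In 6.x each sincedb line has four columns: inode, device major, device minor, and byte offset, which can be compared against the actual file size. A small sketch of reading the offset (the sample line values are made up, not from my setup):

```shell
# A 6.x sincedb line: inode, dev major, dev minor, byte offset.
# Sample line with hypothetical values; in practice read it from
# /etc/logstash/dbpath/usagestatistic.db
line="1835371 0 2049 524288"

# Extract the stored byte offset (4th column) and print it.
offset=$(echo "$line" | awk '{print $4}')
echo "sincedb offset: $offset bytes"
```

Comparing this offset with `stat -c %s` on the matching file (same inode) shows whether the input stalled before the end of the file.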