ELK state management!

Hi,

I'm new to ELK and need clarification on the following:

  1. What is the purpose of logstash-forwarder when Logstash itself can ship files?
  2. I have a use case where my system might be restarted at times (killing all services abruptly):
    • When logstash-forwarder is restarted while reading a huge log file, how do I implement state management?
      (I assume .logstash-forwarder is not updated while reading.)
  3. Which has better state management: Logstash (with .sincedb) as the shipper, or logstash-forwarder (with .logstash-forwarder) as the shipper?
  1. It's lightweight, so if you don't want to install a JVM everywhere you can use it. Plus it ships logs securely.
  2. LSF and LS handle that with sincedb (see the Logstash file-input sketch after this list).
  3. Dunno.
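
To make the Logstash side of that concrete: the file input lets you pin where the sincedb lives, which makes the recorded offsets easy to inspect after a restart. A minimal sketch; the paths here are just examples, not taken from your setup:

input {
  file {
    # hypothetical path pattern; adjust to your log layout
    path => "D:/logpath/**/*.txt"
    # read pre-existing files from the beginning on the first run
    start_position => "beginning"
    # assumed location; pinning it makes the offset state easy to find and back up
    sincedb_path => "D:/logstash/.sincedb"
  }
}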

Hi Mark :slight_smile:,

But when I restart the logstash-forwarder service while it's shipping a huge log file, it duplicates logs. Do I need any explicit configuration to handle this?

Below is how my configuration looks:

Logstash config:
input {
  lumberjack {
    # The port to listen on
    port => 5544
    ssl_certificate => "D:/logstash-forwarder/ssl/logstash-forwarder.crt"
    ssl_key => "D:/logstash-forwarder/ssl/logstash-forwarder.key"
    type => "Logs"
    codec => plain { charset => "UTF-16" }
  }
}

logstash-forwarder config:
{
  "network": {
    "servers": [ "localhost:5544" ],
    "ssl key": "D:/logstash-forwarder/ssl/logstash-forwarder.key",
    "ssl ca": "D:/logstash-forwarder/ssl/logstash-forwarder.crt",
    "timeout": 15
  },

  "files": [
    {
      "paths": ["D:/logpath/**/*.txt"],
      "fields": { "type": "log" }
    }
  ]
}

LSF stores the state information in .logstash-forwarder, i.e. a file in the current directory. Is that file ever created for you? Are you starting LSF from the same directory both times?
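
For reference, that file is just a small JSON map from each watched path to the byte offset that has been read and acknowledged so far; the exact keys vary by LSF version and platform, so take this as an illustration only (the path is made up):

{
  "D:/logpath/app/sample.txt": {
    "source": "D:/logpath/app/sample.txt",
    "offset": 1048576
  }
}

If the offsets in there aren't advancing while LSF is running, that would explain the duplication after a restart.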

Yes, the .logstash-forwarder file is created in the current directory. But when I restart the LSF service while it's reading, data is duplicated.

I'm still testing ELK, so it would be a great help if you could suggest the ELK setup best suited to my requirements:

  1. Highly available (in case of system failure)
  2. Need to schedule log reads (during non-business hours).
  3. Better if all components are freeware

Thanks in advance.

Yes, the .logstash-forwarder file is created in the current directory. But when I restart the LSF service while it's reading, data is duplicated.

That's weird and unexpected. I suggest you increase the logging and post the results.

Highly available (in case of system failure)

Please be more specific. "High availability" is a fluffy term that means different things to different people.

Need to schedule log reads (during non-business hours).

There's nothing built in for this, but a cron job that starts and stops the log collection daemons is easy to write.
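
For example, on a Unix-like host where LSF is installed as a system service, a crontab along these lines would do it (the service name and the hours are assumptions; on Windows the same idea maps onto Task Scheduler):

# resume collection in the evening, stop it before business hours (Mon-Fri; times assumed)
0 19 * * 1-5  service logstash-forwarder start
0 7  * * 1-5  service logstash-forwarder stop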

Better if all components are freeware

Elasticsearch, Logstash, and Kibana are all open source.