Hello,
At the moment our cluster is composed of:
- 4 Elasticsearch nodes
- 3 Graylog instances for log processing/extractors
- Kibana for visualization (graphs, dashboards)
I want to upgrade our Elasticsearch nodes to 5.0, but unfortunately Graylog doesn't support it yet.
I am planning to use Logstash instead, and have a few questions:
- Can we start a single Logstash instance for all the inputs we want? I saw that we can configure multiple inputs and outputs in one file, but I don't understand how to define a filter that applies only to the logs from a specific input. Also, are we forced to use the /bin/logstash command every time, or is there a way to tell Logstash to start automatically for all the inputs?
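From what I've read, the way to do this seems to be giving each input a `type` (or a tag) and wrapping filters in conditionals; here is a minimal sketch of what I have in mind (ports, types, and the grok pattern are made up by me, not taken from our setup):

```conf
input {
  # Each input stamps its events with a type we can test against later.
  syslog { port => 5514 type => "squid3" }
  beats  { port => 5044 type => "app" }
}

filter {
  # Only events coming from the squid3 input go through this filter.
  if [type] == "squid3" {
    grok { match => { "message" => "%{GREEDYDATA:squid3_line}" } }
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
}
```

If that works, I imagine we would run Logstash once against the whole config directory (e.g. bin/logstash -f /etc/logstash/conf.d/) instead of launching one process per input, but please correct me if that's wrong.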
- I spent a lot of time in Graylog making extractors and creating all the fields. Graylog has a JSON export of all extractors, which looks like this:
{
  "extractors": [
    {
      "title": "Squid3 response",
      "extractor_type": "regex",
      "converters": [],
      "order": 14,
      "cursor_strategy": "copy",
      "source_field": "message",
      "target_field": "squid3_response",
      "extractor_config": {
        "regex_value": "pamandzi squid3:.+ [0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3} ([^/]+/[0-9]+) [0-9]{1,10}"
      },
      "condition_type": "string",
      "condition_value": "pamandzi squid3"
    },
    {
      "title": "Squid3 IP client",
      "extractor_type": "regex",
      "converters": [],
      "order": 16,
      "cursor_strategy": "copy",
      "source_field": "message",
      "target_field": "squid3_ip_client",
      "extractor_config": {
        "regex_value": "pamandzi squid3:.+ ([0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3}.[0-9]{1,3}) [^/]+/[0-9]+ [0-9]{1,10}"
      },
      "condition_type": "string",
      "condition_value": "pamandzi squid3"
    },
My question is: will I be able to reuse those regexes somehow with Logstash?
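I was thinking of writing a small script to convert the export into Logstash filter stanzas, since grok supports inline named captures like (?&lt;field&gt;...). A rough sketch of the idea (assuming each extractor's regex has a single capture group, as mine do, and ignoring quote/backslash escaping for the config file):

```python
# Rough sketch: turn one Graylog regex extractor (from the JSON export
# above) into a Logstash filter stanza. The field names match the
# export format; the output format is hand-written, not generated by
# any official tool.
def extractor_to_filter(extractor):
    regex = extractor["extractor_config"]["regex_value"]
    target = extractor["target_field"]
    source = extractor["source_field"]
    condition = extractor["condition_value"]
    # Rewrite the first capture group as a grok named capture so the
    # match lands in the extractor's target_field.
    named = regex.replace("(", "(?<" + target + ">", 1)
    return (
        'if "' + condition + '" in [' + source + '] {\n'
        '  grok { match => { "' + source + '" => "' + named + '" } }\n'
        '}'
    )

# Example with a shortened version of the first extractor above.
export = {
    "extractors": [
        {
            "title": "Squid3 response",
            "source_field": "message",
            "target_field": "squid3_response",
            "extractor_config": {"regex_value": "pamandzi squid3:.+ ([^/]+/[0-9]+)"},
            "condition_value": "pamandzi squid3",
        }
    ]
}

for ex in export["extractors"]:
    print(extractor_to_filter(ex))
```

I have no idea if that's the idiomatic way to do it, or whether Logstash has a better-supported migration path for Graylog extractors.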
- Will I have to relaunch Logstash every time I want to add an input or modify something ?
Thanks!