Trying to add a 2nd source of logs

Hey everyone,
I'm very new to the ELK stack and I've been learning it through online resources. My next goal is to visualize logs from two different log files. First, I tried two separate config files with two pipelines. Here are my pipelines.yml and my .conf files.

- pipeline.id: nginx
  path.config: "/etc/logstash/conf.nginx/nginx.conf"
- pipeline.id: apache
  path.config: "/etc/logstash/conf.apache/apache-01.conf"
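
From what I understand of the docs, pipelines.yml is only read when Logstash starts without -e or -f, so I've been starting it like this (the paths assume the default package install, which may differ on your system):

# let the service read /etc/logstash/pipelines.yml (no -e / -f flags)
sudo systemctl start logstash
# or in the foreground, pointing it at the settings directory explicitly:
sudo /usr/share/logstash/bin/logstash --path.settings /etc/logstash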

nginx conf:

input {
    file { path => "/var/log/nginx/access_log" }
}
filter {
      grok {
        match => { "message" => ["%{IPORHOST:clientip} (?:-|%{USER:ident}) (?:-|%{USER:auth}) \[%{HTTPDATE:timestamp}\] \"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|-)\" %{NUMBER:response} (?:-|%{NUMBER:bytes})"] }
        remove_field => "message"
      }
      mutate {
        add_field => { "read_timestamp" => "%{@timestamp}" }
      }
      date {
        match => [ "[nginx][access][time]", "dd/MMM/YYYY:H:m:s Z" ]
        remove_field => "[nginx][access][time]"
      }
      useragent {
        source => "[nginx][access][agent]"
        target => "[nginx][access][user_agent]"
        remove_field => "[nginx][access][agent]"
      }
      geoip {
        source => "[nginx][access][remote_ip]"
      }
}
output {
    elasticsearch {
         hosts => "localhost:9200"
         index => "nginx"
    }
} 
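
While pasting this, I noticed that my grok pattern only creates fields like clientip and timestamp, whereas the date/useragent/geoip filters reference [nginx][access][*] fields (I think I copied those names from a Filebeat module example), so they probably never match anything. Here is a minimal sketch of what I guess those filters should look like against the fields the grok actually extracts (the same field names show up again in the nginx branch of my test.conf further down):

      date {
        # "timestamp" is what the grok above actually creates, not [nginx][access][time]
        match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
      }
      geoip {
        # likewise, the client address lands in "clientip", not [nginx][access][remote_ip]
        source => "clientip"
      }
      # the useragent filter would need an "agent" field, which my pattern
      # doesn't capture, so I've left it out of this sketch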

Apache conf:

input {
        file {
                path => "/home/Downloads/apache-daily-access.log"
                start_position => "beginning"
                sincedb_path => "/dev/null"
        }
}

filter {
        grok {
                match => { "message" => "%{COMBINEDAPACHELOG}" }
        }
        date {
                match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
        }
        geoip {
                source => "clientip"
        }
}

output {
        elasticsearch {
                hosts => ["localhost:9200"]
        }
}
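
Since this output has no index setting, the apache events presumably end up in whatever default index the elasticsearch output uses rather than one I named. To keep the two sources apart I guess I could pin it explicitly, something like this (the index name is just my own choice):

output {
        elasticsearch {
                hosts => ["localhost:9200"]
                # explicit index so the apache events stay separate from the nginx ones
                index => "apache"
        }
}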

However, it didn't work: I only got the apache logs, in both my nginx and apache indices in Kibana.

Then I tried mixing both conf files into a single new one and starting Logstash with the following command: bin/logstash -f test.conf

test.conf:

input {

	file {
		path => "/home/Downloads/apache-daily-access.log"
		start_position => "beginning"
		sincedb_path => "/dev/null"
		type => "apache"
	}

	file {
		path => "/var/log/nginx/access_log" 
		type => "nginx"
	}
}

filter {

	if [type] == "apache" {
		grok {
			match => { "message" => "%{COMBINEDAPACHELOG}" }
		}
		date {
			match => [ "timestamp" , "dd/MMM/yyyy:HH:mm:ss Z" ]
		}
		geoip {
			source => "clientip"
		}
	}

	if [type] == "nginx" {
		grok {
			match => { "message" => ["%{IPORHOST:clientip} (?:-|%{USER:ident}) (?:-|%{USER:auth}) \[%{HTTPDATE:timestamp}\] \"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|-)\" %{NUMBER:response} (?:-|%{NUMBER:bytes})"] }
			remove_field => "message"
		}
		mutate {
			add_field => { "read_timestamp" => "%{@timestamp}" }
		}
		date {
			match => [ "[nginx][access][time]", "dd/MMM/YYYY:H:m:s Z" ]
			remove_field => "[nginx][access][time]"
		}
		useragent {
			source => "[nginx][access][agent]"
			target => "[nginx][access][user_agent]"
			remove_field => "[nginx][access][agent]"
		}
		geoip {
			source => "[nginx][access][remote_ip]"
		}
	}	
}

output {

	if [type] == "apache" {
		elasticsearch {
			hosts => ["localhost:9200"]
			index => "Apache"
		}
	}

	if [type] == "nginx" {
		elasticsearch {
			hosts => ["localhost:9200"]
			index => "nginx"
		}
	}
}
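
One thing I spotted while re-reading this: Elasticsearch index names have to be lowercase, so I suspect index => "Apache" gets rejected and that output block should rather be:

	if [type] == "apache" {
		elasticsearch {
			hosts => ["localhost:9200"]
			# index names must be lowercase in Elasticsearch
			index => "apache"
		}
	}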

Unfortunately I got the same outcome, and I don't know where to look now. The apache logs are a sample I downloaded online and the conf files are pasted from the internet too, but I have tried various configs for nginx (and previously rsyslog) and none of them worked either.
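
One more thing I'm unsure about: my nginx file input has no start_position or sincedb_path, and if I understand the file input correctly it then only picks up lines appended after Logstash starts (and remembers its position between runs). For testing I could force it to re-read the whole file, the same way the apache input does:

	file {
		path => "/var/log/nginx/access_log"
		type => "nginx"
		# re-read from the top on every run; for testing only
		start_position => "beginning"
		sincedb_path => "/dev/null"
	}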

Thanks for reading all of this, and thanks in advance for your help!
