How to ingest different sets of logs into a single index

Hi Team,
I am working on a project where I have multiple sets of log files (Apache, syslogs, and Java logs) coming from different sources. I was able to create separate indices, plus combined ones where both inputs come from the same server (using the fields section). Now I need to create one single index for all of these log files.

Can I achieve this by assigning a log type to each file input coming in on the Beats port in the input section, and then using if conditionals in the filter section?
Is this a good idea?

Yes, it sounds reasonable.
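A minimal sketch of that approach, assuming a `type` value is attached to each input and a hypothetical index name `all-logs-*` (note that depending on your Filebeat version, the type may instead be set on the shipper side, e.g. via `document_type` in the prospector config, in which case the value set in Logstash is ignored):

```
input {
  beats {
    port => 5044
    type => "apache"
  }
}

filter {
  # Route each event to the right parser based on its type
  if [type] == "apache" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  } else if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGLINE}" }
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # One index for everything, rolled daily
    index => "all-logs-%{+YYYY.MM.dd}"
  }
}
```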

I have a question here. The Apache combined index has both Apache error and access logs in it, coming from two different files on the same server, so when I ship them through Filebeat they are sent to a specific port, while the syslogs come from a different source (different server and different port). How can I include all of them in the Logstash config?

input {
  beats {
    port => 5070
    type => "combinedapache"
  }
  beats {
    port => 5071
    type => "syslog"
  }
}

How will I be able to tell them apart in the filter section, since the combined Apache input has two kinds of logs in it?

You can e.g. use a grok filter that attempts to parse the event as an access log entry. If that succeeds, add a field or tag to mark it as an access log. If the parsing fails, the event will get the _grokparsefailure tag and you can attempt another grok filter.

(You can list multiple grok expressions in the same filter but then you can't set a tag or field value to indicate which expression matched.)
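A sketch of that try-then-fallback pattern, assuming the stock `COMBINEDAPACHELOG` and `HTTPD_ERRORLOG` patterns from logstash-patterns-core (adjust the error-log pattern to your Apache version and LogFormat):

```
filter {
  if [type] == "combinedapache" {
    # First attempt: parse as an access log entry
    grok {
      match   => { "message" => "%{COMBINEDAPACHELOG}" }
      add_tag => ["apache_access"]
    }
    # If that failed, clear the failure tag and try the error-log pattern
    if "_grokparsefailure" in [tags] {
      mutate {
        remove_tag => ["_grokparsefailure"]
      }
      grok {
        match   => { "message" => "%{HTTPD_ERRORLOG}" }
        add_tag => ["apache_error"]
      }
    }
  }
}
```

Events that match neither pattern keep the _grokparsefailure tag, which you can use later to route or inspect unparsed lines.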

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.