Help with filebeat

I have configured Kibana on a working server, plus 3 machines with Filebeat sending logs.

Everything works fine, but I need to change the names that appear in Kibana's _index field so that they show the directory path of each log. For example, right now it displays "filebeat-2019.03.08", and I need \ original_directoryxxxx-creation_date ...

I also need to know how to make the @timestamp field show the date and time the log was created, not today's date, which is when it arrives in Kibana.

We would appreciate any help, advice, links, etc. on where to look and read for solutions; we are newbies at this.

Do you need to change the index that filebeat is sending?
If that is the case, then just update the index setting in the output section of your filebeat.yml.
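For example (a minimal, untested sketch, assuming Filebeat 6.x sending directly to Elasticsearch; the index name and host are placeholders):

output.elasticsearch:
  hosts: ["localhost:9200"]
  # custom index name instead of the default "filebeat-%{+yyyy.MM.dd}"
  index: "mylogs-%{+yyyy.MM.dd}"

# when overriding the index, Filebeat also expects a matching template name/pattern
setup.template.name: "mylogs"
setup.template.pattern: "mylogs-*"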

Please post examples of what you want, and what your pipelines look like so that we can help troubleshoot.

Hello, thanks for answering. Please excuse us, our English is very bad; we will try to explain ourselves as best as possible.

We are very new to ELK and are just starting. We have successfully installed Kibana, Elasticsearch, and Logstash on a server, and finally Filebeat ...

We have gotten everything working well: 3 different machines generate logs, and we have managed to get them into the Kibana viewer successfully. Up to here, all good.

The problem we have is the following. In Kibana our logs look like this, for example:

11th March 2019, 09:30:25.742 @timestamp: 11th March 2019, 09:30:25.742
log.file.path: C:\Program Files\Filebeat\envios\setupact.log
host: { "name": "toni-PC" }
message: AudMig: No audio endpoint migration settings found 0x2
error.message: Error decoding JSON: invalid character 'A' looking for beginning of value
error.type: json
_id: S7bfa2kBuioSuSzqsVMT
_type: doc
_index: filebeat-2019.03.11
_score: -

We have 2 problems. The first is that the date and time that appear in @timestamp are not the date and time the log was generated, but the date Filebeat SENT it. For example, this log was created on Friday, but when we turned on the server today and started the terminals, Filebeat sent the log and it shows the SEND date, not the generation date, which is the one we need.

The second problem is that the "_index: filebeat-2019.03.11" field always defaults to the "filebeat" name, so we cannot classify which program on which computer generated the log. In this field we would need the path of the program that generated the log, for example / Windows / log / etc / etc ..., or failing that, some way for Filebeat to generate the path automatically every time a log is produced, BEFORE sending it ...

Our "filebeat.yml" configuration is as follows:

##################### Filebeat #####################################
filebeat:
  prospectors:
    -
      paths:
        - 'C:\Program Files\filebeat-ELK Local\envios*.log'
      fields:
        #logzio_codec: json
        #token: yFImINDrvEpwKhCcNtBjCNhscvYCMAVD
        type: log
      fields_under_root: true
      json.keys_under_root: true
      json.add_error_key: true
      encoding: utf-8
      #ignore_older: 3h
  registry_file: 'C:\Program Files\filebeat-ELK Local\registry'
################### Output ##########################################
output:
  logstash:
    hosts: ["192.168.0.211:5044"]

As you can see, everything is very basic so far. Our main task now, before we start going deeper, is to correct the 2 problems described above, if possible, or failing that, to find another alternative we could put into practice.

Please remember to be patient and very schematic with your explanation; we are very new to ELK, having started with it just a week ago.

Thank you.

Okay, so let's solve these one at a time then. Please note that I do not have a full ELK stack up and running right this moment, so I am unable to test these.

So your Filebeat is creating a "source" field which contains the full file path when it sends an event to Logstash.
So what you can do in your Logstash output section is something like this:

output {
	if "realtime-aflog" == [type] {
		elasticsearch {
			codec => "json"
			action => "index"
			index => "%{source}"
		}
	}
}

This should change your index to the source path of the file itself. Of course, you will have to add the rest of your Elasticsearch settings to the output for it to work.
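One caveat (my assumption, not something I have tested here): Elasticsearch index names must be lowercase and cannot contain characters such as \, /, :, *, ?, ", <, >, |, spaces, or commas, and a Windows path like C:\Program Files\... breaks all of those rules. A mutate filter could sanitize the path before it is used as the index, roughly like this (the source_index field name is just illustrative):

filter {
	mutate {
		# keep the original field intact, work on a copy
		copy => { "source" => "source_index" }
	}
	mutate {
		# replace characters that are illegal in index names with hyphens
		gsub => [ "source_index", "[\\\\/:*?\"<>| ,]", "-" ]
		# index names must be lowercase
		lowercase => [ "source_index" ]
	}
}

You would then use index => "%{source_index}" in the elasticsearch output instead of index => "%{source}".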

As for the date problem: in your example, you don't have an actual date field from Friday. The 3 dates that you have are all from Monday. Do your log files actually contain a date field?
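For reference, if they did contain one, say a hypothetical created_at field, the usual way to make @timestamp reflect it would be a Logstash date filter, along these lines (field name and formats are placeholders):

filter {
	date {
		# "created_at" is a made-up example field; use whatever your logs actually contain
		match => [ "created_at", "ISO8601", "yyyy-MM-dd HH:mm:ss" ]
		target => "@timestamp"
	}
}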

Hello, thanks for the help. On the subject of the date: the test logs we are using do not contain date fields. For example, with this log:

{"Jose manuel":1,"de envio 2":"cruel"}

In the Kibana viewer we see this:

12th March 2019, 10:27:25.557 Jose manuel: 3
@timestamp: 12th March 2019, 10:27:25.557
de envio 2: cruel
host: PC-Jose-Manuel
_id: dLY6cWkBuioSuSzqL1Mj
_type: doc
_index: filebeat-2019.03.12
_score: -

We created this log yesterday, 11/03, and when we uploaded it today it shows the send date, 12/03 ... but we need to see (if possible) the creation date of the log, or failing that, some way to know when it was created.

On the issue of the index, we do not know exactly where that code you showed us goes; we assume it is placed in the Logstash configuration file?

Can it not be fixed in the filebeat.yml?

Sorry, but as you can see we are very new to this ... thank you for your patience.

If you don't have a date field in your logs, I don't think you will be able to set a different date than the processing date.
If you always send the log files up a day after they are created, you may be able to subtract a day from the date, but I have never done that before.
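If you did want to try that, a ruby filter in Logstash can shift @timestamp back by 24 hours. Again, only a sketch of the idea, untested:

filter {
	ruby {
		# subtract one day (86400 seconds) from the event's timestamp
		code => "event.set('@timestamp', LogStash::Timestamp.new(event.get('@timestamp').time - 86400))"
	}
}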

As for the index: are you sending straight from Filebeat to Elasticsearch, or are you sending it to Logstash first?
If you are sending it to Elasticsearch, then you can set the index inside your filebeat.yml. If you are sending it to Logstash, then you will need to change the pipeline configuration on the Logstash server to have that output.
So find the output section that is sending your events to Elasticsearch, post it, and I will help edit it to include the index.
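To answer where the snippet from my earlier post goes: it lives in the Logstash pipeline configuration on the Logstash server (on Linux installs this is commonly a .conf file under /etc/logstash/conf.d/, though your layout may differ). A minimal sketch of a complete pipeline, with a placeholder Elasticsearch host, would be:

input {
	beats {
		# matches the hosts: ["192.168.0.211:5044"] in your filebeat.yml
		port => 5044
	}
}

output {
	elasticsearch {
		hosts => ["localhost:9200"]	# placeholder; use your Elasticsearch address
		index => "%{source}"		# see the sanitization caveat above
	}
}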

Hello, I'm sending it to Logstash ...
This is my filebeat.yml; today it is still the same as the one above.

We tried the Metricbeat tool, and in principle it solves all the problems: we can see both the creation and send dates and the rest of the data we need. But another problem arose: in Kibana, the search runs continuously over the selected time intervals and keeps bringing back the same logs and data again and again, which makes it unreadable when we try to apply filters; it is a vicious circle ...

We are quite frustrated, because we see that other people have the same problems and there are no solutions ... the creators should think more about simplicity and stop adding so many useless tools ... what people are asking for is simple, hehehe ... the data, the creation date, the send date, and the origin! Is it so difficult to do something so simple!???

While I agree there are some problems with the ELK stack, it is still fairly robust. There are little tricks that you can use to solve a lot of different problems.

I'm having a little bit of difficulty understanding your last post. What is the problem that you're having right now?
