Dynamic log file updated automatically

Hello,

My problem is that I have a dynamic file with log entries that are always changing. For example:
a ticket has an id (which never changes) and a number of events, but that number can grow every minute.

Is there any way to send a dynamic log file (one that is always changing: new lines added, old ones modified, etc.) into Elasticsearch so I can work on it with Kibana (visualize, dashboard, etc.)?

If so, how can I work with it? Thank you.

You are going to have to find the patterns in the contents of the log file entries and decide which filters are best suited to those patterns. Without seeing the logs I don't think anyone is going to be able to help you.

Hello Badger,

Sorry for the inconvenience, I didn't know you needed the logs and the conf. Here they are:

Here is a line from the logs:

{"username_count": 1, "description": "Not a test", "rules": [{"id": 102, "type": "Concrete"}], "event_count": 2, "flow_count": 0, "assigned_to": "user1", "security_category_count": 2, "follow_up": false, "source_count": 2, "inactive": true, "protected": false, "category_count": 2, "source_network": "other", "closing_user": "user2", "close_time": 1561151914000, "remote_destination_count": 0, "start_time": 1559109647452, "credibility": 3, "magnitude": 2, "id": 23452, "categories": ["Login", "Database"], "severity": 5, "log_sources": [{"type_name": "Event", "type_id": 18, "name": "Custom", "id": 6}, {"type_name": "Dbt", "type_id": 4, "name": "Db", "id": 5}], "policy_category_count": 0, "device_count": 2, "closing_reason_id": 1, "offense_type": 3, "relevance": 0, "domain_id": 0, "offense_source": "localhost", "local_destination_count": 1, "status": "CLOSED", "client": "user3"}

Here is the Logstash conf file:

input {
  file {
    path => "<path_of_logs>"
    start_position => "beginning"
    sincedb_path => "<path_sincedb>"
  }
}

filter {
  json {
    source => "message"
  }

  date {
    match => ["start_time", "UNIX_MS"]
    target => "@timestamp"
  }
}

output {
  elasticsearch {
    hosts => "localhost"
    index => "off"
  }
}

What don't you like about the result of that configuration?

If the file is updated (not with new lines, but with old ones modified) we get duplicates.

Then you would need to set the document id based on some combination of the fields that identify a record. You can use a fingerprint filter to do that.

Hi Badger,

Thanks for your help. Could you explain a bit more with an example? I don't completely understand the process.
