Parse "message" field on Syslog

Hi guys! I'm new to ELK.

I have managed to get the stack up and to send my syslogs from my API Manager.

I would like to be able to transform one of the syslog fields into JSON: the "message" field.

This is my logstash.conf

input {
  tcp {
    port => 5000
    type => syslog
  }
  udp {
    port => 5000
    type => syslog
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    user => "elastic"
    password => "yourstrongpasswordhere"
  }
  stdout { codec => rubydebug }
}

Here is my log in ELK.

I understand that I am not the first to ask about this issue. Where can I start reading to solve it?

Thanks a lot.

Hi @Santiago_Fernandez Welcome to the community!

First, it is really hard to help when you post screenshots instead of text; screenshots can't be tested or searched.

That said, it's hard to tell, but that looks like a JSON field, so perhaps take a look at this.
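Roughly, that would mean running the message field through the Logstash json filter. Just a sketch, assuming the JSON payload really does live in the message field:

filter {
  # Sketch: parse the JSON string held in "message" into structured event fields
  json {
    source => "message"
    # target => "kong"   # optional, example name only: nest the parsed fields under one key instead of the event root
  }
}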

Hi @stephenb, thanks for your reply!

Here you are.

@timestamp: Feb 28, 2021 @ 13:00:09.681
@version: 1
_id: K0Ze6XcBV6EJI8QRVqc8
_index: logstash-2021.02.28-000001
_score:
_type: _doc
host: gateway
message:
{"latencies":{"request":7,"kong":0,"proxy":7},"service":{"host":"api","created_at":1614459071,"connect_timeout":60000,"id":"da7fd962-317e-4757-a347-3f0da3886e6c","protocol":"http","name":"MyAPI","read_timeout":60000,"port":5000,"updated_at":1614459071,"ws_id":"6e2ab00c-1e33-448c-a1ca-180f5a0f57ba","retries":5,"write_timeout":60000},"request":{"querystring":{},"size":153,"uri":"/frase/29","url":"http://api.local:8000/frase/29","headers":{"host":"api.local:8000","accept-encoding":"gzip, deflate","user-agent":"python-requests/2.25.1","accept":"/","connection":"keep-alive"},"method":"GET"},"client_ip":"192.168.0.209","tries":[{"balancer_latency":0,"port":5000,"balancer_start":1614528009794,"ip":"172.27.0.7"}],"upstream_uri":"/frase/29","response":{"headers":{"via":"kong/2.3.2","content-type":"application/json","date":"Sun, 28 Feb 2021 16:00:09 GMT","server":"Werkzeug/1.0.1 Python/3.6.13","connection":"close","x-kong-proxy-latency":"0","x-kong-upstream-latency":"7","content-length":"248"},"status":200,"size":489},"route":{"created_at":1614459089,"ws_id":"6e2ab00c-1e33-448c-a1ca-180f5a0f57ba","id":"e6d9b7ae-a9f2-40c0-b1bf-a7f1aa14f3d9","path_handling":"v0","name":"main","request_buffering":true,"service":{"id":"da7fd962-317e-4757-a347-3f0da3886e6c"},"preserve_host":false,"regex_priority":0,"response_buffering":true,"updated_at":1614459089,"paths":["/"],"https_redirect_status_code":426,"protocols":["http","https"],"strip_path":true},"started_at":1614528009794}

port: 59366
tags: _grokparsefailure
type: syslog

I am going to read the link!

Hi, I resolved it with this!

input {
  tcp {
    port => 5000
    type => syslog
  }
  udp {
    port => 5000
    type => syslog
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    # Parse the JSON payload in "message" into structured fields
    json {
      source => "message"
    }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    user => "elastic"
    password => "yourstrongpasswordhere"
  }
  stdout { codec => rubydebug }
}
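For what it's worth, the _grokparsefailure tag on the sample event is expected with this setup: that message is a bare JSON payload with no syslog header, so the grok pattern never matches and the json filter does the actual parsing. If you would rather not tag those events as failures, one possible variation (just a sketch; tag_on_failure and skip_on_invalid_json are standard options of the grok and json filter plugins, and the add_field/date parts above stay as they are) is:

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      tag_on_failure => []            # don't mark pure-JSON events as grok failures
    }
    json {
      source => "message"
      skip_on_invalid_json => true    # leave events whose message isn't valid JSON untouched
    }
  }
}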


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.