Logstash is not parsing the data based on config

I have the below config:

```
filter {
  if [type] == "log" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:logdate} %{LOGLEVEL:debugtype} %{DATA:source} %{TIMESTAMP_ISO8601:smsdate},%{WORD:sourceaddr},%{NUMBER:addrton},%{NUMBER:addrnpi},%{WORD:destaddr},%{NUMBER:sourceton},%{NUMBER:sourcenpi},%{WORD:status}" }
    }

    grok { match => [ "sourceaddr", "^(?<operator>.....)" ] }

    translate {
      field => "operator"
      destination => "operator_name"
      dictionary => [
        "62811", "Telkomsel",
        "62812", "Telkomsel",
        "62813", "Telkomsel",
        "62821", "Telkomsel",
        "62822", "Telkomsel",
        "62823", "Telkomsel",
        "62851", "Telkomsel",
        "62852", "Telkomsel",
        "62853", "Telkomsel",
        "62814", "Indosat",
        "62815", "Indosat",
        "62816", "Indosat",
        "62855", "Indosat",
        "62856", "Indosat",
        "62857", "Indosat",
        "62858", "Indosat",
        "62817", "XL",
        "62818", "XL",
        "62819", "XL",
        "62859", "XL",
        "62877", "XL",
        "62878", "XL",
        "62831", "XL",
        "62832", "XL",
        "62833", "XL",
        "62838", "XL",
        "62895", "Tri",
        "62896", "Tri",
        "62897", "Tri",
        "62898", "Tri",
        "62899", "Tri",
        "62881", "Smartfren",
        "62882", "Smartfren",
        "62883", "Smartfren",
        "62884", "Smartfren",
        "62885", "Smartfren",
        "62886", "Smartfren",
        "62887", "Smartfren",
        "62888", "Smartfren",
        "62889", "Smartfren",
        "62828", "Net1"
      ]
    }

    date {
      match => [ "smsdate", "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
  }
}
```

It seems that when the message is sent into Elasticsearch, it is not being parsed, since I don't have all the fields above. I only have the message field, which contains everything. How do I resolve this?

Show an example document that's stored in ES. You can copy/paste from Kibana's JSON tab.

Hi,

Please find below.

```json
{
  "_index": "filebeat-2018.06.14",
  "_type": "doc",
  "_id": "sAOt_mMBRLh_m9zZFTO1",
  "_version": 1,
  "_score": null,
  "_source": {
    "message": "2018-06-14 10:22:24,870 DEBUG [org.mobicents.smsc.library.CdrGenerator] 2018-06-14 10:22:24.854,6287737178619,1,1,6282817000071,1,1,success_esme,SS7_HR,message,null,112878,0,null,null,null,null,62818445209,null,0,15,null,0,0,,,,7,"PAIF QUEwlhCIAEAAUW0","",,,",
    "offset": 571042,
    "source": "/opt/telestax/TelScale-smsc-jboss-7.5.1-95/jboss-5.1.0.GA/server/default/log/cdr.log",
    "input": {
      "type": "log"
    },
    "beat": {
      "hostname": "localhost.localdomain",
      "version": "6.3.0",
      "name": "localhost.localdomain"
    },
    "@timestamp": "2018-06-14T14:22:25.194Z",
    "prospector": {
      "type": "log"
    },
    "host": {
      "name": "localhost.localdomain"
    },
    "@version": "1",
    "tags": [
      "beats_input_codec_plain_applied"
    ]
  },
  "fields": {
    "@timestamp": [
      "2018-06-14T14:22:25.194Z"
    ]
  },
  "sort": [
    1528986145194
  ]
}
```

I'm not sure your events are being processed by Logstash at all. What does your Filebeat configuration look like? Remove all commented lines and format the rest as preformatted text using Markdown notation or the </> toolbar button.

Please find it below:

```yaml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /opt/telestax/TelScale-smsc-jboss-7.5.1-95/jboss-5.1.0.GA/server/default/log/cdr.log

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 3

setup.kibana:

output.logstash:
  hosts: ["103.88.253.83:5044"]

logging.level: debug
```

You have

```
if [type] == "log" {
```

in your config file, but your event doesn't have a `type` field.

Hi Magnus,

I agree with this. As a temporary measure I have removed this "if", and all my fields are coming through. In my previous version of Elasticsearch I could configure document_type in Filebeat, but since this option has been removed, is there another method I can use?

You can set any fields you like via the fields option in the Filebeat configuration (you'll probably want to enable fields_under_root too).
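For illustration, a minimal sketch of the relevant part of `filebeat.yml` (the field name `sourcelog` and value `cdrlog` follow the naming used later in this thread; paths and other settings are taken from the config posted above):

```yaml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /opt/telestax/TelScale-smsc-jboss-7.5.1-95/jboss-5.1.0.GA/server/default/log/cdr.log
  # Custom field(s) attached to every event from this input
  fields:
    sourcelog: cdrlog
  # Default is false: the field is nested as fields.sourcelog.
  # Set to true to place sourcelog at the top level of the event instead.
  fields_under_root: false
```

With `fields_under_root: false` the Logstash condition refers to the nested field; with `true` it would simply be `[sourcelog]`.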

I have added a field `sourcelog` under `fields`. How am I going to use it in the Logstash config? Is it like below?

```
if [fields.sourcelog] == "cdrlog" {
```

Almost, see https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html#logstash-config-field-references.

According to the reference, I should use:

```
if [fields][sourcelog] == "cdrlog" {
```
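Yes — with that condition in place, the filter section would be shaped like this (a sketch; the grok/translate/date filters from the original config go inside the conditional):

```
filter {
  if [fields][sourcelog] == "cdrlog" {
    # grok / translate / date filters from the original config go here
  }
}
```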
