Log file to JSON


(erkan) #1

Hi,
I installed elasticsearch-6.4.1, kibana-6.4.1, logstash-6.4.1, and filebeat-6.4.1.

filebeat.yml
#=========================== Filebeat inputs =============================

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log
   

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/test/*.log


#-------------------------- Elasticsearch output ------------------------------
output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["192.168.1.10:9200"]
  # Optional protocol and basic auth credentials.
  #protocol: "https"
  #username: "elastic"
  #password: "changeme"
 
-----------logstash.conf----------------
input {
  beats {
    port => 5044
  }
}
output {
  elasticsearch {
    hosts => ["192.168.1.10:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

Everything is okay. When I send a message, it shows up in Kibana.

I have a log file, for example:
09-11-2018 03:02 : DTC Install error = 0, InstallContacts, Exit, com\complus\dtc\dtc\adme\deployment.cpp (389)

I want to show it in Kibana like this:
time:09-11-2018 03:02 DTC
field1: install error = 0
field2: InstallContacts
field3: Exit
field4: com\complus\dtc\dtc\adme\deployment.cpp (389)

How can I do this (parse it into JSON fields)?

Thanks for the help.


(erkan) #2

Hi again!
I changed the logstash.conf file, but the problem continues.

---------------------logstash.conf-------------------------
input {
  beats {
    port => 5044 
  }
}
filter {
  grok {
    match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\]\[%{DATA:loglevel}%{SPACE}\]\[%{DATA:node}\s\]\s\[%{DATA:client}]\ %{GREEDYDATA:message}" }
  }
}
output {
  elasticsearch {
    hosts => ["192.168.1.10:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}"
    document_type => "%{[@metadata][json]}"
  }
}
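
The pattern above expects bracketed, ISO8601-style timestamps, so it does not match the sample line from the first post. One possible sketch that would split that sample line into the requested fields, assuming the timestamp is everything before the first " : " and the remaining pieces are comma-separated (time and field1-field4 are placeholder names):

filter {
  grok {
    # Sample line: 09-11-2018 03:02 : DTC Install error = 0, InstallContacts, Exit, com\complus\dtc\dtc\adme\deployment.cpp (389)
    match => {
      "message" => "(?<time>%{DATE_US} %{HOUR}:%{MINUTE}) : %{DATA:field1}, %{DATA:field2}, %{DATA:field3}, %{GREEDYDATA:field4}"
    }
  }
}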


Kibana Grok Debugger result (screenshot):


(Pjanzen) #3

You are sending your data directly to Elasticsearch, so whatever filter you have in Logstash does not matter; it will never be used. Set up a Logstash output inside Filebeat instead.
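
For example (a minimal sketch, assuming Logstash runs on the same 192.168.1.10 host and listens on the beats port 5044 from the logstash.conf above), disable the Elasticsearch output in filebeat.yml and enable a Logstash output instead; Filebeat allows only one output at a time:

#-------------------------- Elasticsearch output ------------------------------
#output.elasticsearch:
#  hosts: ["192.168.1.10:9200"]

#----------------------------- Logstash output --------------------------------
output.logstash:
  # Send events to the Logstash beats input defined in logstash.conf
  hosts: ["192.168.1.10:5044"]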


(erkan) #4

Thank you so much.


(system) #5

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.