Logstash file for parsing JSON data


(Akshay Patil) #1

I want to parse the JSON data below with Logstash and load it into Kibana.

Data

{"PhoneNumber":{"0":27817768541},"TimeStamp":{"0":1471433451000},"ncalls":{"0":20}}

Can anyone help me with the Logstash config file?


(Christian Dahlqvist) #2

What have you tried so far?


(Akshay Patil) #3

input {
  file {
    path => ["C:/Users/akshay.patil/Desktop/test.log"]
    start_position => "beginning"
    codec => "json"
  }
}

filter {
  grok {
    match => ["message", "[%{WORD}:%{LOGLEVEL}] %{TIMESTAMP_ISO8601:tstamp} :: %{GREEDYDATA:msg}"]
  }
  json {
    source => "message"
    target => "parsedJson"
  }
  mutate {
    add_field => {
      "PhoneNumber" => "%{[parsedJson][PhoneNumber]}"
      "TimeStamp" => "%{[parsedJson][TimeStamp]}"
      "NumberOfCalls" => "%{[parsedJson][ncalls]}"
    }
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}


(Christian Dahlqvist) #4

What does a full line in the file you are processing look like?

The recommended way to build a config is to remove the Elasticsearch output and output only to stdout. Start with a minimal config, e.g. a file input with the json codec, and inspect the result. Then add one filter at a time, continuously inspecting how the format of the data changes, until the config is complete.
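As a starting point, a minimal sketch along those lines (reusing the path from your config above; adjust to your setup) would be:

input {
  file {
    path => ["C:/Users/akshay.patil/Desktop/test.log"]
    start_position => "beginning"
    codec => "json"
  }
}

output {
  stdout { codec => rubydebug }
}

With the json codec, each line in the file is parsed into event fields directly, so the rubydebug output will show you whether the grok and json filters are needed at all.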


(Akshay Patil) #5

How will the index get created?


(Christian Dahlqvist) #6

Once you have completed the configuration and the events look like they should, you switch from the stdout plugin to the elasticsearch plugin.
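For example, with the values taken from your original config:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash-%{+YYYY.MM.dd}"
  }
}

You do not need to create the index yourself; Elasticsearch creates it automatically the first time an event is written to it, and the %{+YYYY.MM.dd} date pattern gives you one index per day.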


(system) #7

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.