With the settings below I am unable to index the data.
The Logstash pipeline starts and then just stands still, without showing any error or producing any output.
Please point out where it is going wrong.
Input CSV:
cust_name,state1,city1,state2,city2
ab,CA,LA,IL,Chicago
Mapping used:
"mappings": {
"info": {
"properties": {
"cust_name": {
"type": "string"
},
"address": {
"properties": {
"address1": {
"properties": {
"city": {
"type": "string"
},
"state": {
"type": "string"
}
}
},
"address2": {
"properties": {
"city": {
"type": "string"
},
"state": {
"type": "string"
}
}
}
},
"type": "nested"
}
}
}
}
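For reference, this is how I apply the mapping before running Logstash (a sketch, assuming Elasticsearch 2.x, where the "string" type is still valid; on 5.x+ it would have to be "text"/"keyword"). The index name matches the output section of the conf file:

```shell
# Create the index with the nested mapping (assumes ES on localhost:9200)
curl -XPUT "localhost:9200/nested_sample" -d '{
  "mappings": {
    "info": {
      "properties": {
        "cust_name": { "type": "string" },
        "address": {
          "type": "nested",
          "properties": {
            "address1": { "properties": { "city": { "type": "string" }, "state": { "type": "string" } } },
            "address2": { "properties": { "city": { "type": "string" }, "state": { "type": "string" } } }
          }
        }
      }
    }
  }
}'
```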
Logstash conf file:
input {
  file {
    path => "/home/cloudera/Desktop/nested_csv.csv"
    type => "core2"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["cust_name", "state1", "city1", "state2", "city2"]
    separator => ","
  }
  mutate {
    rename => {
      "state1" => "[address][address1][state]"
      "city1"  => "[address][address1][city]"
      "state2" => "[address][address2][state]"
      "city2"  => "[address][address2][city]"
    }
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "nested_sample"
    document_type => "info"
    workers => 1
  }
}
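One thing I suspect (this is my understanding of the file input, not something confirmed yet): start_position => "beginning" only applies to files Logstash has never seen before. Once the path is recorded in the sincedb, the input just tails the file waiting for new lines, which would look exactly like a pipeline that starts and stands still. A debugging variant I would try next, with a throwaway sincedb and a stdout output to check whether events reach the filter chain at all:

```
input {
  file {
    path => "/home/cloudera/Desktop/nested_csv.csv"
    type => "core2"
    start_position => "beginning"
    sincedb_path => "/dev/null"   # forget recorded positions so the CSV is re-read on every run (debugging only)
  }
}
output {
  stdout { codec => rubydebug }   # print each parsed event to verify the csv/mutate filters
}
```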
Desired output:
cust_name: "ab",
address: {
  address1: {
    state: "CA",
    city: "LA"
  },
  address2: {
    state: "IL",
    city: "Chicago"
  }
}
Thanks in advance.