Hi,
My index mapping looks like this:
indexDoc = {
    "mappings": {
        "test": {
            "properties": {
                "timestamp": {
                    "type": "date",
                    "doc_values": True,
                    "format": "strict_date_optional_time||epoch_millis||dd/MMM/YYYY:HH:mm:ss Z"
                },
                "created_on": {
                    "type": "date",
                    "index": "not_analyzed",
                    "doc_values": True,
                    "format": "date_optional_time"
                },
                "logmsg": {
                    "type": "object",
                    "dynamic": True
                },
                "logtype": {
                    "type": "string",
                    "index": "not_analyzed",
                    "doc_values": True
                }
            }
        },
        "_default_": {
            "_all": {
                "enabled": True,
                "norms": {
                    "enabled": False
                }
            },
            "properties": {
                "timestamp": {
                    "type": "date",
                    "doc_values": True,
                    "format": "strict_date_optional_time||epoch_millis||dd/MMM/YYYY:HH:mm:ss Z"
                }
            },
            "dynamic_templates": [{
                "string_fields": {
                    "match": "*",
                    "match_mapping_type": "string",
                    "mapping": {
                        "type": "string",
                        "norms": {
                            "enabled": False
                        },
                        "fielddata": {
                            "format": "disabled"
                        },
                        "fields": {
                            "raw": {
                                "type": "string",
                                "index": "not_analyzed",
                                "doc_values": True
                            }
                        }
                    }
                }
            }, {
                "other_fields": {
                    "match": "*",
                    "match_mapping_type": "*",
                    "mapping": {
                        "doc_values": True
                    }
                }
            }]
        }
    },
    "settings": {
        "refresh_interval": "5s",
        "number_of_shards": 5,
        "number_of_replicas": 2
    }
}
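For reference, I create the index with this mapping from Python, roughly like this (a minimal sketch; I am assuming the elasticsearch-py client here, and the host and the index name "logs" are placeholders):

from elasticsearch import Elasticsearch

# Placeholder host and index name; adjust to the real cluster.
es = Elasticsearch(["http://localhost:9200"])
es.indices.create(index="logs", body=indexDoc)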
I have only Elasticsearch and Kibana; I am not using Logstash.
Basically, I need to capture the full log file in logmsg.
In Python I can only use 32,000 characters in a string, so I am breaking my log file message into chunks of 32,000 characters.
My final list looks like this:
output = [{'created_on': 'sometime', 'logtype': 'sometype', 'logmsg': {'_msg1': 'aaa', '_msg2': 'bbb', '_msg3': 'ccc'}}]
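A sketch of how I build it (chunk_log and the file name "app.log" are illustrative only; the real values come from my application):

def chunk_log(text, size=32000):
    # Split the raw log text into numbered 32,000-character pieces.
    pieces = [text[i:i + size] for i in range(0, len(text), size)]
    return {"_msg%d" % (n + 1): piece for n, piece in enumerate(pieces)}

with open("app.log") as f:  # placeholder log file
    log_text = f.read()

output = [{
    "created_on": "sometime",       # a real timestamp in practice
    "logtype": "sometype",
    "logmsg": chunk_log(log_text),  # -> {'_msg1': 'aaa', '_msg2': 'bbb', ...}
}]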
I am using the bulk API to store the values.
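Something like this, again a sketch assuming the elasticsearch-py bulk helper (the index name "logs" and the type "test" match the mapping above):

from elasticsearch import Elasticsearch, helpers

es = Elasticsearch(["http://localhost:9200"])  # placeholder host

# One bulk action per document in the output list.
actions = [{"_index": "logs", "_type": "test", "_source": doc} for doc in output]
helpers.bulk(es, actions)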
Is this the correct way of doing it, or can I achieve this result in some other way?
The problem is that in the Kibana Discover tab I get an ever-growing number of entries in the field list:
logmsg._msg1, logmsg._msg2, logmsg._msg3, and so on.