Hi,
I'm trying to read a CSV file and index it into Elasticsearch, all on version 6.3.2. Should be easy, right?
I created an index template:
PUT _template/mytemplate
{
  "index_patterns": ["test*"],
  "settings": {
    "index.mapping.total_fields.limit": 10000,
    "number_of_shards": 1
  },
  "mappings": {
    "doc": {
      "properties": {
        "sex": { "type": "text" },
        "country": { "type": "text" }
      }
    }
  }
}
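As a quick sanity check (not part of the pipeline itself), the stored template can be retrieved to confirm the PUT succeeded:

GET _template/mytemplate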
Then I'm using this logstash.conf:
input {
  file {
    path => "C:\Users\Vincent\Documents\tech\logstash-6.3.2\data\wholedf_100_colon_small.csv"
    start_position => "beginning"
    close_older => 60
    sincedb_path => "NUL"  # Windows equivalent of /dev/null; use "/dev/null" on Linux/macOS
  }
}
filter {
  csv {
    separator => ","
    autodetect_column_names => true
    autogenerate_column_names => true
    remove_field => [ "path", "message", "host", "@timestamp", "@version" ]
  }
}
output {
  elasticsearch {
    hosts => ["https://ec_id.europe-west1.gcp.cloud.es.io:9243"]
    user => "elastic"
    password => "mypassword"
    index => "test"
    manage_template => false
    template_name => "mytemplate"
  }
}
And this sample CSV:
"sex","country"
"M","FR"
"F","UK"
After running Logstash, I run GET test/_search and get:
...
"hits": {
  "total": 2,
  "max_score": 1,
  "hits": [
    {
      "_index": "test",
      "_type": "doc",
      "_id": "14b84WQBmdGmeuM4UGwp",
      "_score": 1,
      "_source": {
        "country": "FR",
        "sex": "M"
      }
    },
    {
      "_index": "test",
      "_type": "doc",
      "_id": "2Ib84WQBmdGmeuM4UGwp",
      "_score": 1,
      "_source": {
        "country": "UK",
        "sex": "F"
      }
    }
  ]
}
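For reference, how the fields were actually mapped in the index can be inspected with (assuming the index name above):

GET test/_mapping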
What should I do to get the fields indexed as top-level fields, rather than only showing up inside _source?
Thanks a lot in advance for your help!