I am new to Elasticsearch and Logstash.
I have a DB with tables; I exported them to CSVs and then imported them into Elasticsearch using Logstash.
Sample config:
input {
  file {
    path => "C:\logstash-6.2.2\bin\profile.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    # "country","user","user_id","birth_date","profile","age"
    columns => ["country","user","user_id","birth_date","profile","age"]
    skip_header => true
  }
  mutate {
    remove_field => [ "message", "path", "host" ]
  }
}
output {
  elasticsearch {
    hosts => "http://127.0.0.1:9200"
    index => "user_profile"
    doc_as_upsert => true
    document_id => "%{country}_%{user}"
  }
  stdout {}
}
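Since `document_id` is built from `%{country}_%{user}`, rows that share the same country/user pair end up as one document, with later rows overwriting earlier ones (upsert semantics). A small sketch of that keying behavior in Python, using made-up rows (not data from the actual CSV):

```python
# Sketch: how document_id => "%{country}_%{user}" dedupes rows on upsert.
# Rows below are hypothetical; Elasticsearch keeps one document per unique
# _id, and with doc_as_upsert the last write for an _id wins.
rows = [
    {"country": "HK", "user": "alice", "age": "30"},
    {"country": "US", "user": "bob", "age": "41"},
    {"country": "HK", "user": "alice", "age": "31"},  # same _id as row 1
]

index = {}
for row in rows:
    doc_id = f"{row['country']}_{row['user']}"  # mirrors %{country}_%{user}
    index[doc_id] = row  # upsert: last write wins

print(sorted(index))  # two distinct document ids remain
print(index["HK_alice"]["age"])  # value from the latest matching row
```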
Then I ran a simple query against it:
{
  "query": {
    "term": { "country": "HK" }
  },
  "sort": [
    { "name": { "order": "asc" } }
  ]
}
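For context, the sort fails because the field was dynamically mapped as `text`, and sorting on a `text` field requires fielddata, which is off by default. Under Elasticsearch's default dynamic mapping, a string field imported this way typically ends up looking roughly like the sketch below (an illustration, not the actual mapping of this index):

```json
{
  "name": {
    "type": "text",
    "fields": {
      "keyword": {
        "type": "keyword",
        "ignore_above": 256
      }
    }
  }
}
```

When a mapping like this is in place, sortable/aggregatable access goes through the `name.keyword` sub-field rather than `name` itself.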
But it returns: "Fielddata is disabled on text fields by default."
I read some docs and found that the fields need to be mapped as "keyword", but how can I do that using Logstash?
Thanks.
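For what it's worth, here is a hedged sketch of one way this is commonly wired up from the Logstash side: the `elasticsearch` output plugin has `template`, `template_name`, and `manage_template` settings that install an index template before documents are written. The file path, template name, and the `doc` type key below are assumptions for illustration, not values from my setup:

```
output {
  elasticsearch {
    hosts => "http://127.0.0.1:9200"
    index => "user_profile"
    doc_as_upsert => true
    document_id => "%{country}_%{user}"
    # hypothetical path to a custom template file
    template => "C:/logstash-6.2.2/templates/user_profile.json"
    template_name => "user_profile"
    manage_template => true
  }
}
```

And a minimal template file mapping the fields as `keyword` might look roughly like:

```json
{
  "index_patterns": ["user_profile"],
  "mappings": {
    "doc": {
      "properties": {
        "country": { "type": "keyword" },
        "user":    { "type": "keyword" }
      }
    }
  }
}
```

Note the template only applies to indices created after it is installed, so the existing index would need to be reindexed or recreated.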