How to use Logstash to upload data to Elasticsearch in a specific format


(radhe) #1

My index mapping and a sample indexing request are shown below:

PUT somki
{
    "mappings": {
        "_doc" : {
            "properties" : {
                "suggest" : {
                    "type" : "completion",
                    "contexts": [
                        { 
                            "name": "userid",
                            "type": "category"
                        }
                    ]
                }
            }
        }
    }
}

POST somki/_doc/
{
    "suggest": {
        "input": ["123","asb"],
        "contexts": {
            "userid": ["99568656541578","abhi"] 
        }
    }
}

I also have a CSV file with four columns. I want the values of the first two columns to go into input, and the values of the remaining two columns to go into the userid context, as in the indexing example above. How can I do this?


(radhe) #2

Anyone?


(radhe) #3

Someone?


(Christian Dahlqvist) #4

Try something like this:

input {
  generator {
    lines => ['123,asb,99568656541578,abhi']
    count => 1
  } 
} 

filter {
  grok {
    match => { "message" => "%{NOTSPACE:[suggest][input]},%{NOTSPACE:[suggest][input]},%{NOTSPACE:[suggest][contexts][userid]},%{NOTSPACE:[suggest][contexts][userid]}" }
  }
}

output {
  stdout { codec => rubydebug }
}
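
A note on the grok pattern: when the same target field name appears more than once, grok stores both captured values, so each field ends up as a two-element array, matching the document format in the question. The same filter with explanatory comments (behavior unchanged):

filter {
  grok {
    # Repeating a field name in grok collects all captures, so for the
    # line "123,asb,99568656541578,abhi" this produces:
    #   [suggest][input]            => ["123", "asb"]
    #   [suggest][contexts][userid] => ["99568656541578", "abhi"]
    match => { "message" => "%{NOTSPACE:[suggest][input]},%{NOTSPACE:[suggest][input]},%{NOTSPACE:[suggest][contexts][userid]},%{NOTSPACE:[suggest][contexts][userid]}" }
  }
}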

(radhe) #5

Thanks for the reply.
I tried this and it worked, but when I look for suggestions I don't get any. Here is my suggester query:

GET somki/_search
{
    "suggest": {
        "suggestion" : {
            "prefix" : "asb",
            "completion" : {
                "field" : "suggest",
                "contexts": {
                    "userid": [ "99568656541578" ]
                },
                "fuzzy" : {
                  "fuzziness" : 2
                }
            }
        }
    }
}

When I upload each document individually in the format shown in the question above, it works.
And here is my .conf file:

input {
  file {
    path => "C:\Users\ramesh\Downloads\elasticsearch\Data\Book1.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }

  generator {
    lines => ['oNumber,oName,userids,userID']
    count => 1
  }
}

filter {
  grok {
    match => { "message" => "%{NOTSPACE:[suggest][input]},%{NOTSPACE:[suggest][input]},%{NOTSPACE:[suggest][contexts][userid]},%{NOTSPACE:[suggest][contexts][userid]}" }
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "somki"
    user => "elastic"
    password => "elastic"
  }
  stdout { codec => rubydebug }
}

Here oNumber, oName, userids, and userID are the CSV column names, and each row contains values like 123, asb, 99568656541578, abhi.
Please suggest the appropriate changes. Thank you!


(radhe) #6

Anyone?


(radhe) #7

Someone, anything?


(Christian Dahlqvist) #8

I would recommend posting the question about search and suggestions under the Elasticsearch category, as you are more likely to get a response there.
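
That said, two details in the config posted earlier are worth checking (observations only, not verified against your data):

input {
  file {
    # The file input on Windows works best with forward slashes
    path => "C:/Users/ramesh/Downloads/elasticsearch/Data/Book1.csv"
    start_position => "beginning"
    # On Windows use NUL; /dev/null is for Linux and macOS
    sincedb_path => "NUL"
  }
  # Note: the generator input in your config emits the literal header
  # line oNumber,oName,userids,userID as its own event, which then gets
  # indexed as a bogus suggestion document. It was only needed for the
  # stdout test and can be removed here.
}

filter {
  # Drop the CSV header row if the file contains one
  if [message] =~ /^oNumber/ { drop {} }
  grok {
    match => { "message" => "%{NOTSPACE:[suggest][input]},%{NOTSPACE:[suggest][input]},%{NOTSPACE:[suggest][contexts][userid]},%{NOTSPACE:[suggest][contexts][userid]}" }
  }
}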


(radhe) #9

Thanks!


(system) #10

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.