Elasticsearch and Logstash: syncing data from MongoDB

Hi all,

I want to use Logstash for real-time syncing of data between MongoDB and Elasticsearch.

I am using mongoosastic for real-time syncing and it is working fine, but I am not able to transfer the existing MongoDB documents into Elasticsearch. I am able to send the existing MongoDB documents into Elasticsearch using Logstash, so I want to use Logstash for real-time syncing instead of mongoosastic.

Please help me understand how to configure Logstash in my Node.js application and how to invoke Logstash from Node.js itself instead of through the command line.

Logstash isn't a library that can be called from Node.js.
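If you need to trigger it from your application anyway, the closest you can get is to start the Logstash executable as a child process from Node.js. A minimal sketch, assuming a local Windows installation (both paths are placeholders you would need to adjust):

var spawn = require('child_process').spawn;

var logstash = spawn(
  'C:\\logstash\\bin\\logstash.bat',             // assumed path to the Logstash executable
  ['-f', 'C:\\logstash\\config\\pipeline.conf'], // assumed path to your pipeline config
  { shell: true }                                // .bat files need a shell on Windows
);

logstash.stdout.on('data', function(data) {
  console.log('logstash: ' + data);
});
logstash.stderr.on('data', function(data) {
  console.error('logstash (stderr): ' + data);
});
logstash.on('close', function(code) {
  console.log('Logstash exited with code ' + code);
});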

OK, that means I have to run

.\bin\logstash -f "path of the config file"

from the command line.

Hello sir,

When I load the file using Logstash after modifying some field, all the remaining entries also get injected a second time into Elasticsearch. How can I prevent duplicate entries in Elasticsearch?

Here is my config file:

input {
  file {
    path => "C:\Users\Downloads\node-todo-api\output.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  csv {
    separator => ","
    columns => ["name", "age", "rollNo"]
  }

  mutate { convert => [ "age", "integer" ] }
  mutate { convert => [ "rollNo", "integer" ] }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "friends"
    document_type => "college"
  }

  stdout {}
}

Please help me; I am new to ES and Logstash.

OK, that means I have to run

.\bin\logstash -f "path of the config file"

from the command line.

Yes, or you can use a wildcard in your file input (e.g. *.csv) and just copy new files to that directory. Logstash will pick up new files within seconds.
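For example, something like this (the directory is only an illustration; forward slashes are generally safer in glob patterns, even on Windows):

input {
  file {
    path => "C:/Users/Downloads/node-todo-api/*.csv"  # picks up any new CSV copied here
    start_position => "beginning"
  }
}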

How can I prevent duplicate entries in Elasticsearch?

Remove sincedb_path => "/dev/null". That setting makes Logstash forget how far it has read, so the whole file is re-read from the beginning every time the pipeline starts, which is what creates the duplicates.

Hello sir,

When I remove sincedb_path => "/dev/null", the documents do not get indexed in Elasticsearch.

Under which exact circumstances? Have you read the file input documentation and what it says about sincedb_path?

Hello sir,

I use sincedb_path so that I can index the data again.

I have a CSV file that I want to index into Elasticsearch, updating a record if it is already in Elasticsearch and adding it as a new record if it is not.

Please help!

I have a CSV file that I want to index into Elasticsearch, updating a record if it is already in Elasticsearch and adding it as a new record if it is not.

Then you can't rely on Elasticsearch's automatic assignment of an id to each document. Instead, pick one or more fields from the input data that will constitute a key for the document, and set the document id via the document_id option of the elasticsearch output. If you reprocess the same file, or another file where a line has the same key as an existing document, that document will be overwritten. You may also have to change the elasticsearch output's action option.
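For example, an update-or-insert output could look like this (a sketch: the key fields are illustrative, and doc_as_upsert makes the update action create the document when it does not exist yet):

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "candidate"
    document_id => "%{name}%{age}"  # key derived from the input fields
    action => "update"
    doc_as_upsert => true           # insert when the id does not exist yet
  }
}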

Hello sir,

With the following configuration of the output plugin:

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "candidate"
    document_type => "stud"
    document_id => "%{name}%{age}"
  }

  stdout {}
}

the problem of updating and adding new records is solved.

Thanks for the help!

Hello sir,

My CSV file contains this data:

"name","age","rollNo"
"ajeet kumar",23,614
"rishu",23,6213
"abhishek",23,6180
"shaurya",25,689

When I run Logstash with this config file, it does not index the last record. In this case, ("shaurya", 25, 689) is not getting indexed into Elasticsearch.

Please help!

Does the file end with a newline character?

Yes sir, the file ends with a newline character.

Hello sir,

This is the code I am using to create the CSV file:

var json2csv = require('json2csv');
var fs = require('fs');

var fields = ['name', 'age', 'rollNo'];

var friends = [
  { "name": "ajeet kumar", "age": 23, "rollNo": 614 },
  { "name": "rishu",       "age": 23, "rollNo": 213 },
  { "name": "abhishek",    "age": 23, "rollNo": 6180 },
  { "name": "shaurya",     "age": 22, "rollNo": 689 },
  { "name": "HOD",         "age": 24, "rollNo": 45 }
];

var csv = json2csv({ data: friends, fields: fields });
fs.writeFile('output_test1.csv', csv, function(err) {
  if (err) throw err;
  console.log('file saved');
});

But the last JSON object is not getting indexed.

Please help!

I just tested with your script and the resulting file does not end with a newline character.

$ hexdump -C < output_test1.csv      
00000000  22 6e 61 6d 65 22 2c 22  61 67 65 22 2c 22 72 6f  |"name","age","ro|
00000010  6c 6c 4e 6f 22 0a 22 61  6a 65 65 74 20 6b 75 6d  |llNo"."ajeet kum|
00000020  61 72 22 2c 32 33 2c 36  31 34 0a 22 72 69 73 68  |ar",23,614."rish|
00000030  75 22 2c 32 33 2c 32 31  33 0a 22 61 62 68 69 73  |u",23,213."abhis|
00000040  68 65 6b 22 2c 32 33 2c  36 31 38 30 0a 22 73 68  |hek",23,6180."sh|
00000050  61 75 72 79 61 22 2c 32  32 2c 36 38 39 0a 22 48  |aurya",22,689."H|
00000060  4f 44 22 2c 32 34 2c 34  35                       |OD",24,45|
00000069

OK sir, is it required to end with a newline character?

Or can I add a dummy record as the last entry of the JSON array to solve it?

Hello sir,

Thanks for the help.
I modified the array of JSON objects to:

var friends = [
  { "name": "ajeet kumar", "age": 23, "rollNo": 600 },
  { "name": "rishu",       "age": 23, "rollNo": 6213 },
  { "name": "abhishek",    "age": 23, "rollNo": 6180 },
  { "name": "shaurya",     "age": 22, "rollNo": 689 },
  { "name": "HOD",         "age": 25, "rollNo": 450 },
  { "name": "sush",        "age": 25, "rollNo": 580 },
  { "name": "chandu",      "age": 24, "rollNo": 215 },
  "/n"
]

Now I am able to get all the input fields into Elasticsearch.

Is there any other way to do this without adding ("/n") at the end?

Thanks

OK sir, is it required to end with a newline character?

Yes.
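A cleaner fix than a dummy record is to append a newline to the CSV string before writing the file. A minimal sketch based on your own script (same json2csv usage, shortened data):

var json2csv = require('json2csv');
var fs = require('fs');

var fields = ['name', 'age', 'rollNo'];
var friends = [
  { "name": "ajeet kumar", "age": 23, "rollNo": 600 },
  { "name": "chandu",      "age": 24, "rollNo": 215 }
];

var csv = json2csv({ data: friends, fields: fields });

// Append a trailing newline so the last record ends with "\n"
// and Logstash's file input treats it as a complete line.
fs.writeFile('output_test1.csv', csv + '\n', function(err) {
  if (err) throw err;
  console.log('file saved');
});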

Thanks for the help, sir!

Hello sir,

When I add new data to the CSV file, Logstash automatically reads the new input, but when I modify an existing field in the CSV file, Logstash does not automatically re-read it.

Please help!