How to maintain the order in which records are read

I'm trying to read a CSV file which is sorted by timestamp.

I'm trying to add a field with an incremental value following the ascending order of the timestamp.

But the incremental value is getting assigned to the records in random order.
How can I assign the values in the order of the timestamp?

Thanks

What do your filters look like? A first step would be to restrict the number of pipeline workers to 1 (one).

Here's my filter.

filter {
  csv {
    separator => ","
    columns => ["loglevel","taskid","logger","label","duration","ttime","ptime"]
    convert => {
      "duration" => "integer"
      "ptime" => "integer"
      "ttime" => "date"
    }
  }

  aggregate {
    task_id => "%{taskid}"
    code => "
      map['test'] ||= 0
      event.set('pp', map['test'])
      map['test'] += 1
    "
    push_map_as_event_on_timeout => true
    timeout_task_id_field => "taskid"
    timeout => 30
    timeout_tags => ['_aggregatetimeout']
    timeout_code => ""
  }
}
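
As a side note, since this counter is global rather than per task, a plain ruby filter can also maintain the sequence number without the aggregate plugin's timeout machinery. A sketch (the field name `pp` matches the config above; with a single pipeline worker the events pass through in read order):

```
filter {
  ruby {
    # @seq persists across events within this filter instance
    init => "@seq = 0"
    code => "
      event.set('pp', @seq)
      @seq += 1
    "
  }
}
```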

How can I restrict the number of pipeline workers?

Thanks

How can I restrict the number of pipeline workers?

https://www.elastic.co/guide/en/logstash/current/logstash-settings-file.html
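
In short, you can set it in `logstash.yml`, or pass `-w 1` on the command line:

```
# logstash.yml
pipeline.workers: 1
```

Equivalently: `bin/logstash -w 1 -f your-pipeline.conf`. On recent Logstash versions (7.7+) there is also a `pipeline.ordered` setting, which preserves event order but is only effective when `pipeline.workers` is 1.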

Thanks, it worked.
I changed the pipeline workers to 1.
Will it affect performance for a large amount of data (~600k records)?

If you don't allow filters to process events in parallel, performance will of course suffer.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.