Input Elastic Plugin in Logstash Error

Hello Everyone,
I want to retrieve data from Elasticsearch using Logstash; the data should be sorted by the `value` field. When I run the query without a schedule, the output comes back sorted (ascending or descending, as requested). But when the schedule setting is enabled, the output is not sorted at all.

This is the Logstash config that I use, with the additional schedule setting:

input {
  elasticsearch {
    hosts => "localhost:9200"
    index => "tes-2020.01"
    schedule => "*/30 * * * * *"
    query => '
    {
        "sort": [
            {
                "value": {
                    "order": "asc"
                }
            }
        ],
        "query": {
            "bool": {
                "must": [
                    {
                        "match": {
                            "id.keyword": "abcdef1235"
                        }
                    }
                ],
                "filter": {
                    "range": {
                        "@timestamp": {
                            "gte": "now-1m",
                            "lte": "now"
                        }
                    }
                }
            }
        }
    }
    '
  }
}
filter {
  mutate {
    convert => {
      "value" => "integer"
    }
  }
}

output {
  csv {
    # elastic field names
    fields => ["@timestamp", "id", "service_name", "metric_type", "metric_info", "category", "node", "value"]
    path => "csv-export-desc.csv"
  }

  stdout {
    codec => "rubydebug"
  }
}

This is the output without the schedule setting:

This is the Logstash configuration that I use without the additional schedule setting:

input {
  elasticsearch {
    hosts => "localhost:9200"
    index => "tes-2020.01"
    query => '
    {
        "sort": [
            {
                "value": {
                    "order": "asc"
                }
            }
        ],
        "query": {
            "bool": {
                "must": [
                    {
                        "match": {
                            "id.keyword": "abcdef1235"
                        }
                    }
                ],
                "filter": {
                    "range": {
                        "@timestamp": {
                            "gte": "now-1m",
                            "lte": "now"
                        }
                    }
                }
            }
        }
    }
    '
  }
}
filter {
  mutate {
    convert => {
      "value" => "integer"
    }
  }
}

output {
  csv {
    # elastic field names
    fields => ["@timestamp", "id", "service_name", "metric_type", "metric_info", "category", "node", "value"]
    path => "csv-export-desc.csv"
  }

  stdout {
    codec => "rubydebug"
  }
}

This is the output with the schedule setting:

Please help. Thanks.

Logstash generally does not preserve the order of events: the query may return documents in sorted order, but multiple pipeline workers process batches concurrently, so events can reach the output out of order. If you set pipeline.workers to 1 and disable the java_execution engine, then order will be preserved.
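As a sketch of that suggestion (option names here assume Logstash 7.x; check your version's documentation), the settings can go in logstash.yml:

```yaml
# logstash.yml - process events single-threaded so output order
# matches the order documents are returned by the query
pipeline.workers: 1
pipeline.java_execution: false
```

The equivalent command-line flags (again assuming 7.x) would be `bin/logstash -f your-pipeline.conf -w 1 --java-execution=false`. Note that a single worker reduces throughput, so this trade-off only makes sense when event order matters, as it does here.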

Thank you, it works for me.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.