Logstash not picking up the file after placing it in the input path

Let me check that, Badger. That might be the issue though :slight_smile:

I have changed that to just _298 and then drop

But still no luck, it is not picking it up :frowning:

You are restarting your service to pick up the config changes, right?

Yes Jasonespo, I have done that, still no luck

Can you forward the update you made to:

I believe Badger gave you the answer but I am not sure you implemented it correctly.

I have put Filebeat in place now and it is working fine

May I know where I can post a new query? I need to know how we can restrict the number of days of data kept in Elasticsearch.

That's really an elasticsearch question more than a logstash question.

I would start by reading about Index Lifecycle Management. If you are running an old version of the stack then you can do it using curator.
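
For reference, here is a minimal sketch of an ILM policy with only a delete phase, assuming a recent version of the stack; the policy name logs-retention and the 3-day retention are placeholders for illustration, and the policy still has to be attached to your indices (for example via index.lifecycle.name in an index template):

PUT _ilm/policy/logs-retention
{
  "policy": {
    "phases": {
      "delete": {
        "min_age": "3d",
        "actions": {
          "delete": {}
        }
      }
    }
  }
}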

Can I get a sample of how to use the delete API? I want to see the index logs in Kibana only for, say, the last 3 days and delete the others.

I would ask that question in the elasticsearch forum.

Hi Badger,

May I know the link for the Elasticsearch forum, please?

https://discuss.elastic.co/c/elasticsearch

Thank you, I have written a query there

List all indices with:

GET _cat/indices

green open index1          qi6sY0mEQFi5dFKx1WUGeA 1 1     0  0   7.4mb 179.6kb
green open index2          W65-VCneQ8CmjBhB-0JRxQ 5 1 61383  5  18.8mb   9.4mb

Delete an index with:

DELETE index1

I actually want to delete the data inside the index that is older than the last two days and keep only the last two days of data. How do I achieve it?

Well.. you should make sure you index your data with a date timestamp on it.

Our production env uses index.date

So we can differentiate the indexes daily

delete_by_query can do that, but if you have too much data it will be expensive. For example:

POST <index_name>/_delete_by_query
{
  "query": {
    "range": {
      "@timestamp": {
        "gte": "01-01-2019",
        "lte": "01-02-2019",
        "format": "MM-dd-yyyy"
      }
    }
  }
}

This deletes data whose timestamp falls within those two days. You just have to check first that your @timestamp field uses the same pattern, MM-dd-yyyy.
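
If you want a rolling two-day window rather than fixed dates, date math in the range query should also work; this is just a sketch, assuming @timestamp is a regular date field and <index_name> is a placeholder:

POST <index_name>/_delete_by_query
{
  "query": {
    "range": {
      "@timestamp": {
        "lt": "now-2d/d"
      }
    }
  }
}

This removes every document with a timestamp before the start of the day two days ago, keeping roughly the last two days of data.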

Are you rolling indexes over daily? Is a solution that deletes all indexes older than two days useful, or do you need to delete the subset of data in an index that is older than two days? The first is really cheap, the second is expensive.
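
For example, if you have one index per day with the date in the name, dropping a whole day is a single cheap call (the index name here is only an illustration):

DELETE logstash-2019.01.01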

Hi Badger,

I want to delete the subset of data in the index which is more than two days old. Also, may I know in which file we need to place this? Will it go in Logstash or Elasticsearch, and what is the file name, is it a yml or a properties file?