How can I make logstash automatically send my information to elasticsearch?

Hi

I was wondering if there is a way to make Logstash automatically send information to my Elasticsearch. I have my config file:

input {
  stdin {}
}

filter {
  grok {
    match => { "message" => "time=%{TIMESTAMP_ISO8601:time} url=%{URIPATH:url} clientIp=%{IP:clientIp} useragent=\"%{DATA:useragent}\" message=\"%{GREEDYDATA:message}\"" }
  }
  date {
    match => [ "time", "ISO8601" ]
  }
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "indexforlogstash"
  }
  stdout {}
}

Can I add anything to my config file so that Logstash automatically sends my logs to Elasticsearch, so I don't have to send them manually through the command prompt every time?
Or is there another way to solve this problem? The point is that I don't want to add my log files manually every time; I want to automate this process if possible.

You need to use a different input for the data. Have a look at the file input plugin and use this to have Logstash follow and ingest files in the file system.

Like, could you give me an example, please?

Have a look at this old blog post. You can set the path parameter to a directory or pattern and have Logstash pick up new files matching this.
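
For example, something along these lines would watch a directory for matching files (the path below is just a placeholder, so point it at wherever your logs actually live):

input {
  file {
    # placeholder pattern; adjust to your real log directory
    path => "C:/path/to/logs/*.log"
    # read existing files from the beginning instead of only new lines
    start_position => "beginning"
  }
}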

If you can provide some more details about the files you are looking to read (format, location and how they are created), it may be easier to help with a more complete example.

I'm trying to make Logstash pick up the "error.log" file located at "C:\ELK Stack", which is written in JSON. This is what it looks like:

{ "time": "2023-03-08 17:20:25.3306", "machinename": "RO-ROMOCEAT", "url": " https://london.cylex-uk.co.uk/GB/cylex-uk.co.uk/london/company/test-linda-18856944.html", "clientIp": "127.0.0.1", "useragent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:109.0) Gecko/20100101 Firefox/110.0", "message": "\"GetGeoByIdAsync\" | geoId: {\"CountryCode\":\"GB\", \"Identity\":20417, \"Language\":\"en-GB\"}", "ex": "System.NotImplementedException: The method or operation is not implemented.\r\n   at CylexBDDataAccessLayer.MongoDBAccess.GetGeoByIdAsync(GeoId geoId) in D:\\Projects\\CylexBDForSyncronizing\\CylexBDDataAccessLayer\\Mongo\\GeosDbAccessPartial.cs:line 45" }

I even tried it with this config file:

input {
  file {
    path => "C:/ELK Stack"
    start_position => "beginning"
    sincedb_path => "nul"
  }
}

filter {
  grok {
    match => { "message" => "time=%{TIMESTAMP_ISO8601:time} url=%{URIPATH:url} clientIp=%{IP:clientIp} useragent=\"%{DATA:useragent}\" message=\"%{GREEDYDATA:message}\"" }
  }
  date {
    match => [ "time", "ISO8601" ]
  }
  json {
    source => "message"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logs"
  }
}

And it still doesn't work.
On Windows 10 Pro 64-bit btw :slight_smile:
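
A couple of things in that config are probably getting in the way: the file input's path has to point at a file or a glob pattern rather than a bare directory, and since each line is already JSON you can let a json codec do the parsing instead of the grok pattern. A rough sketch of what that might look like (the index name and date pattern are just examples, adjust as needed):

input {
  file {
    # point at the actual file, or use a pattern like "C:/ELK Stack/*.log"
    path => "C:/ELK Stack/error.log"
    start_position => "beginning"
    # on Windows, NUL disables the sincedb so the file is re-read on every run
    sincedb_path => "NUL"
    # each line is a complete JSON object, so let the codec parse it
    codec => "json"
  }
}

filter {
  date {
    # the time field uses a space and fractional seconds, so plain ISO8601 may not match;
    # adjust this pattern if your timestamps differ
    match => [ "time", "yyyy-MM-dd HH:mm:ss.SSSS" ]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # example index name
    index => "logs"
  }
}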
