Newbie question - can you create a new index and parser for a log that contains a certain string?

Hi there,

I apologize for the newbie question. I'm ingesting CCURE badging logs, which are plain CSV output that I've tagged by appending the string "CCURE" to the end of each line. For example:

2025-08-19 16:04:00.000,Smith, John,BREAKROOM HALLWAY,CardAdmitted,CCURE

There’s also the following field that identifies these logs:

log.file.path: /var/log/ccure.csv

Is there a way to create an index based on one of these criteria?
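From what I've read so far, it seems like a conditional `set` of `_index` in an ingest pipeline might be one way to route these documents; here's a rough sketch of what I'm imagining (the `ccure-logs` index name is just a placeholder I made up, and I'm not sure this is the right approach):

```json
{
  "set": {
    "if": "ctx.log?.file?.path == '/var/log/ccure.csv'",
    "field": "_index",
    "value": "ccure-logs"
  }
}
```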

Secondly, how do you go about creating a parser for that index, so that each CSV column becomes its own field? I don't know if that's actually the correct approach, so please let me know if there's a "right" way to do it.

For example:

@timestamp: 2025-08-19 16:04:00.000
ccure.name.last: Smith
ccure.name.first: John
ccure.door: BREAKROOM HALLWAY
ccure.action: CardAdmitted
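From the docs, it looks like an ingest pipeline with a `csv` processor might produce something like this; here's a rough sketch of what I've pieced together, assuming the raw line lands in the `message` field (the pipeline name and target field names are just my own placeholders):

```json
PUT _ingest/pipeline/ccure-csv
{
  "processors": [
    {
      "csv": {
        "field": "message",
        "target_fields": ["ccure.timestamp", "ccure.name.last", "ccure.name.first", "ccure.door", "ccure.action", "ccure.tag"],
        "trim": true
      }
    },
    {
      "date": {
        "field": "ccure.timestamp",
        "formats": ["yyyy-MM-dd HH:mm:ss.SSS"]
      }
    }
  ]
}
```

I added `"trim": true` because the name comes through as "Smith, John", which splits into `Smith` and ` John` with a leading space. No idea if this is right, though!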

Again, I'm just learning this, so please forgive me if I'm going about it the wrong way; I'm eager to learn how to do this correctly.

Thanks!

Hello @meatwad

Could you please share how you are indexing this data into Elasticsearch: via Filebeat, Logstash, or Elastic Agent? Based on that, your question about deriving the index name from log.file.path => ccure can be answered.

Thanks!!

Hi there and thank you very much for the reply.

The CCURE logs are being written to S3, synced to a syslog server, and ingested via Elastic Agent, which is managed with Fleet and uses the "System" integration:

Thank you again!