Unable to add tag with filename

Hey everyone,

I'm quite stuck trying to add a custom field to my parsed data. Basically, I'm reading the content of csv files from a path and sending it to Elasticsearch. I'd like the content of each .csv file to receive a field whose value is the filename. This is my code. I'd really appreciate some help.
The path where csv files are located is C:\apache-jmeter-5.4.3\bin\logs\SomeFilename.csv

I'd be even more happy if I could add a tag to the data with the same content (filename) but I also couldn't get that to work.

input {
  file {
    path => "C:/apache-jmeter-5.4.3/bin/logs/*.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["timeStamp", "elapsed", "label", "responseCode", "responseMessage", "threadName", "dataType",
                "success", "failureMessage", "bytes", "sentBytes", "grpThreads", "allThreads", "URL", "Latency",
                "SampleCount", "ErrorCount", "IdleTime", "Connect"]
  }
  grok {
    match => { "path" => "%{GREEDYDATA}/%{GREEDYDATA:app}.csv" }
    add_field => { "BuildNumber" => "%{app}" }
  }
}
output {
  elasticsearch {
    hosts => ["topsecret:9200"]
    index => "logstash-jmeter"
  }
}

What does the [path] field look like? What does the [BuildNumber] field look like, and what do you not like about the result?

Show us either an event from output { stdout { codec => rubydebug } } or else copy and paste from the JSON tab from an expanded event in the Kibana Discover pane.

Hi Badger,
Thank you for your reply.

Well, the [path] field does not exist. The only path I have is the location of the csv files, so

"C:/apache-jmeter-5.4.3/bin/logs/*.csv"

I'd like to extract the file name from there and put it in a new field called BuildNumber (which at the moment doesn't exist in the csv or in Elasticsearch).

Here's the expanded event from Kibana:

{
  "_index": "logstash-jmeter",
  "_type": "_doc",
  "_id": "2q25mYABdbRHtk_0bQqL",
  "_version": 1,
  "_score": null,
  "fields": {
    "allThreads.keyword": [
      "1"
    ],
    "Connect.keyword": [
      "0"
    ],
    "IdleTime.keyword": [
      "0"
    ],
    "host.name.keyword": [
      "filippc"
    ],
    "bytes.keyword": [
      "11316"
    ],
    "grpThreads": [
      "1"
    ],
    "IdleTime": [
      "0"
    ],
    "Latency.keyword": [
      "424"
    ],
    "responseCode.keyword": [
      "400"
    ],
    "label.keyword": [
      "Import document"
    ],
    "sentBytes.keyword": [
      "0"
    ],
    "ErrorCount": [
      "1"
    ],
    "timeStamp.keyword": [
      "2022/05/06 16:05:16.471"
    ],
    "responseMessage.keyword": [
      "Bad Request"
    ],
    "URL": [
      "http://filippc/api/documentservice//api/v2.1/documentService/thin//Document/Import/1/1"
    ],
    "dataType.keyword": [
      "text"
    ],
    "responseCode": [
      "400"
    ],
    "grpThreads.keyword": [
      "1"
    ],
    "elapsed": [
      "425"
    ],
    "success.keyword": [
      "false"
    ],
    "@version": [
      "1"
    ],
    "SampleCount": [
      "1"
    ],
    "host.name": [
      "filippc"
    ],
    "log.file.path.keyword": [
      "C:/apache-jmeter-5.4.3/bin/logs/90089.csv"
    ],
    "allThreads": [
      "1"
    ],
    "threadName.keyword": [
      "Thread group - Import document 5-1"
    ],
    "event.original": [
      "2022/05/06 16:05:16.471,425,Import document,400,Bad Request,Thread group - Import document 5-1,text,false,,11316,0,1,1,http://filippc/api/documentservice//api/v2.1/documentService/thin//Document/Import/1/1,424,1,1,0,0\r"
    ],
    "Connect": [
      "0"
    ],
    "dataType": [
      "text"
    ],
    "ErrorCount.keyword": [
      "1"
    ],
    "@version.keyword": [
      "1"
    ],
    "elapsed.keyword": [
      "425"
    ],
    "label": [
      "Import document"
    ],
    "message": [
      "2022/05/06 16:05:16.471,425,Import document,400,Bad Request,Thread group - Import document 5-1,text,false,,11316,0,1,1,http://filippc/api/documentservice//api/v2.1/documentService/thin//Document/Import/1/1,424,1,1,0,0\r"
    ],
    "SampleCount.keyword": [
      "1"
    ],
    "threadName": [
      "Thread group - Import document 5-1"
    ],
    "Latency": [
      "424"
    ],
    "timeStamp": [
      "2022/05/06 16:05:16.471"
    ],
    "@timestamp": [
      "2022-05-06T14:15:07.445Z"
    ],
    "success": [
      "false"
    ],
    "bytes": [
      "11316"
    ],
    "message.keyword": [
      "2022/05/06 16:05:16.471,425,Import document,400,Bad Request,Thread group - Import document 5-1,text,false,,11316,0,1,1,http://filippc/api/documentservice//api/v2.1/documentService/thin//Document/Import/1/1,424,1,1,0,0\r"
    ],
    "event.original.keyword": [
      "2022/05/06 16:05:16.471,425,Import document,400,Bad Request,Thread group - Import document 5-1,text,false,,11316,0,1,1,http://filippc/api/documentservice//api/v2.1/documentService/thin//Document/Import/1/1,424,1,1,0,0\r"
    ],
    "log.file.path": [
      "C:/apache-jmeter-5.4.3/bin/logs/90089.csv"
    ],
    "URL.keyword": [
      "http://filippc/api/documentservice//api/v2.1/documentService/thin//Document/Import/1/1"
    ],
    "responseMessage": [
      "Bad Request"
    ],
    "sentBytes": [
      "0"
    ]
  },
  "sort": [
    1651846507445
  ]
}

Exactly. You do not have a [path] field, so the grok filter is a no-op and the add_field does not happen. You have a [log][file][path] field. Change the grok to use that.
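As a sketch, the grok filter might then look something like this (the dot before csv is escaped so it matches a literal dot, and add_field/add_tag only fire on a successful match — adding the tag covers the second part of your question):

filter {
  grok {
    # [log][file][path] holds e.g. C:/apache-jmeter-5.4.3/bin/logs/90089.csv
    match => { "[log][file][path]" => "%{GREEDYDATA}/%{GREEDYDATA:app}\.csv" }
    # copy the captured filename into BuildNumber and also tag the event with it
    add_field => { "BuildNumber" => "%{app}" }
    add_tag => [ "%{app}" ]
  }
}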

Thank you very much! This resolved my issue :slight_smile: