Logstash configuration not working when reading a CSV file

Hello everyone. Logstash is not reading my CSV file; I have tried every case I can think of but am unable to pick up data from the file. My ultimate goal is to read data from a REST API and map it against a CSV file. Please suggest a fix. The configuration is below:

```
input {
  http {
    host => "127.0.0.1"
    port => "*****"
    ecs_compatibility => disabled
    type => "input"
  }
}

filter {
  json {
    source => "message"
    ecs_compatibility => disabled
  }
  split {
    field => "result"
  }
  mutate {
    add_field => { "ids" => "%{[result][server]}" }
  }
  translate {
    dictionary_path => "/etc/logstash/conf.d/myfilename.csv"
    source => "[ids]"
    target => "[OS]"
    fallback => "I'm a teapot"
  }
  mutate {
    remove_field => ["server", "http", "url", "agent"]
  }
}

output {
  stdout { codec => rubydebug }
}
```

You need to provide more context.

What is not working? What does your input message look like? What is your output? What is the expected output?

You do not have a file input plugin, so by "reading a CSV file" do you mean the CSV file used in the translate filter? Please share that as well.

Hi, I'm fetching the hostname from an event and searching for it in the CSV file; on a match it should populate the value field in the output.
Yes, I have the plugin, but in this configuration I am using translate.

The data present in the CSV file is as below:

```
Hostname,Value
server_1,Windows
server_2,Linux
server_3,Unix
server_4,Windows
server_5,Linux
server_6,Unix
```

Getting the below error while executing this configuration:

```
[ERROR] 2022-06-07 10:52:12.251 [[main]-pipeline-manager] javapipeline - Pipeline error {:pipeline_id=>"main", :exception=>#<LogStash::Filters::Dictionary::DictionaryFileError: Translate: Missing or stray quote in line 1 when loading dictionary file at /etc/logstash/conf.d/filename.csv>, :backtrace=>["/usr/share/logstash/vendor/jruby/lib/ruby/stdlib/csv.rb:1899:in `block in shift'"

[ERROR] 2022-06-07 10:52:12.274 [Converge PipelineAction::Create<main>] agent - Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create<main>, action_result: false", :backtrace=>nil}
```
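As an aside, "Missing or stray quote in line 1" is usually caused by something invisible in the file rather than the visible content: a UTF-8 BOM written by an editor, curly "smart" quotes pasted from a word processor, or an unbalanced ASCII quote. A quick diagnostic sketch in Python (just an illustration of what to check, not Logstash code; the file path is whatever your `dictionary_path` points at):

```python
import codecs

# Characters that look like quotes but break CSV parsers.
SMART_QUOTES = {"\u201c", "\u201d", "\u2018", "\u2019"}

def diagnose_first_line(path):
    """Return a list of suspicious findings in the first line of a CSV file."""
    findings = []
    with open(path, "rb") as f:
        raw = f.readline()
    if raw.startswith(codecs.BOM_UTF8):
        findings.append("UTF-8 BOM at start of file")
    text = raw.decode("utf-8", errors="replace")
    if SMART_QUOTES & set(text):
        findings.append("curly quotes instead of ASCII quotes")
    if text.count('"') % 2 == 1:
        findings.append("unbalanced double quote")
    return findings
```

Running this against the dictionary file would flag the most common culprits behind that exact error message.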

Can you check the file? The error is saying that you have some issue with quotes.

Hi, I have tried all possible formats/quotes but the error persists.
Please suggest whether there is a required format for saving the file.
Below is what I have tried:

```
"Hostname","Value"
"server_1","Windows"
"server_2","Linux"
```

```
"Hostname","Value"
"server_1","Windows",
"server_2","Linux"
```

```
"Hostname", "Value",
"server_1", "Windows",
"server_2","Linux",
```

Your dictionary should look something like this:

```
server_1,Windows
server_2,Linux
server_3,Unix
server_4,Windows
server_5,Linux
server_6,Unix
```

According to the documentation you only need quotes when using integers as keys, so you don't need them when your keys are strings.

Does your dictionary look like the one I shared, and are you still getting exactly the same error? Can you enable debug logging to get more information?
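For context on what the filter is doing: translate is essentially a hash lookup with an optional fallback. A minimal Python sketch of that behavior, using the field names and dictionary from this thread (an illustration only, not Logstash code):

```python
# Dictionary as it would be loaded from the CSV:
# first column is the key, second column is the value.
dictionary = {
    "server_1": "Windows",
    "server_2": "Linux",
    "server_3": "Unix",
}

def translate(event, source, target, fallback):
    """Mimic the translate filter: copy dictionary[event[source]]
    into event[target], or the fallback when the key is missing."""
    event[target] = dictionary.get(event.get(source), fallback)
    return event

event = translate({"ids": "server_2"}, "ids", "OS", "I'm a teapot")
# event["OS"] is now "Linux"; an unknown host would get "I'm a teapot"
```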

Hi Leandro, I have tried all possible ways but still can't look up a field in the CSV file.
Can you please suggest the translate plugin logic? I am pasting the configuration and the error output with debug mode enabled.

```
input {
  http {
    host => "127.0.0.1"
    port => "*****"
    ecs_compatibility => disabled
    type => "input"
  }
}

filter {
  json {
    source => "message"
    ecs_compatibility => disabled
  }
  split {
    field => "result"
  }
  mutate {
    add_field => { "ids" => "%{[result][server]}" }
  }
  csv {
    separator => ","
    skip_header => "true"
    columns => ["server", "os"]
    ecs_compatibility => disabled
  }
  translate {
    dictionary_path => "/etc/logstash/conf.d/myfilename.csv"
    source => "[ids]"
    target => "[os]"
    fallback => "I'm a teapot"
  }
  mutate {
    remove_field => ["server", "http", "url", "agent"]
  }
}

output {
  stdout { codec => rubydebug }
}
```
![translate-plugin|690x339](upload://xG0nRedmlJ2wMtNZLL0EHiOANxK.png)

It is still giving the same error, saying that there is something wrong in your CSV file, in the first line.

Do you have anything else in your file besides the key/value pairs? A header, maybe? You can't have anything in the file that is not in the format:

```
key,value
```

Can you open your file in a terminal text editor and share a screenshot of it?
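For what it's worth, the way a CSV dictionary is consumed can be sketched like this: each row becomes one key/value pair, so a stray header row would simply become a useless `Hostname => Value` entry, while malformed quoting is what actually breaks loading. A rough sketch with Python's csv module (not the actual JRuby implementation the plugin uses):

```python
import csv
import io

def load_dictionary(text):
    """Build a lookup table from CSV text: column 1 is the key,
    column 2 is the value, one pair per row."""
    return {row[0]: row[1] for row in csv.reader(io.StringIO(text)) if row}

table = load_dictionary("Hostname,Value\nserver_1,Windows\nserver_2,Linux\n")
# The header just becomes one more entry:
# {"Hostname": "Value", "server_1": "Windows", "server_2": "Linux"}
```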

Hi Leandro, please find attached a screenshot of the CSV file. I'm not using any header in the file.
translate1

It looks like a normal CSV; I have no hints as to why Logstash would give you such an error.

I would try manually creating a new file in the yml format:

```
"server_1": "Windows"
"server_2": "Linux"
"server_3": "Unix"
"server_4": "Windows"
"server_5": "Linux"
"server_6": "Unix"
```

And then use this file in your translate filter.
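If it helps to see why this format is harder to get wrong: each line is just a quoted key, a colon, and a quoted value, with no quoting ambiguity. A rough Python sketch of reading such a file (the real plugin uses a proper YAML parser; this is only an illustration of the line shape):

```python
def load_yml_dictionary(lines):
    """Parse simple '"key": "value"' lines into a lookup table."""
    table = {}
    for line in lines:
        line = line.strip()
        if not line:
            continue  # skip blank lines
        key, _, value = line.partition(":")
        table[key.strip().strip('"')] = value.strip().strip('"')
    return table

table = load_yml_dictionary(['"server_1": "Windows"', '"server_2": "Linux"'])
# table == {"server_1": "Windows", "server_2": "Linux"}
```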

Hi Leandro, we have tested with the YAML and JSON formats and they work perfectly fine; the problem occurs only with the CSV format. So can we conclude that the CSV format is not supported by the translate plugin within Logstash?


That is not entirely right: the CSV format is supported, but for some reason your CSV wasn't working. Maybe it was something related to how the file was created on your system, or the like.

But to find out why it wasn't working you would need to open a bug report on the Logstash GitHub.

Personally I prefer to use yml files for my dictionaries, so if you changed to yml and it worked, you should keep using yml.