Ignore this; the topic is no longer valid.

Hi Team,

My Logstash configuration is as follows:
    input {
      file {
        path => "C:/elkstack/elasticsearch-6.5.1/logs/userscenario.csv"
        start_position => "beginning"
        sincedb_path => "C:/elkstack/elasticsearch-6.5.1/sincedb/sincedb.txt"
      }
    }

    filter {
      csv {
        columns => ["when", "httpsessionId", "module", "page", "userId", "actionType"]
        separator => ","
        skip_header => "true"
      }

      aggregate {
        task_id => "%{httpsessionId}"
        code => '
          map["userscenario"] ||= ""
          map["userscenario"] += event.get("actionType") + "->"
          map["userId"] = event.get("userId")
          event.cancel
        '
        push_map_as_event_on_timeout => true
        timeout_task_id_field => "httpsessionId"
        timeout => 3600
        timeout_code => 'event.set("userscenario", event.get("userscenario").chomp("->"))'
      }
    }

    output {
      file {
        path => "C:/elkstack/elasticsearch-6.5.1/logs/agguserscenario.csv"
        codec => plain { format => "%{message}" }
      }
      stdout { codec => rubydebug }
    }
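For clarity, here is a plain-Ruby sketch of what the aggregate filter's `code` and `timeout_code` blocks are intended to do for one session (the `map` hash and the hard-coded action list are stand-ins for the filter's internals, not the Logstash API itself):

```ruby
# Simulate the per-session aggregation: append each actionType with "->",
# then trim the trailing arrow on timeout (the chomp("->") in timeout_code).
map = {}
actions = ["scm.ts.list_saved_search", "scm.ts.start_over", "scm.ts.delete_saved_search"]

actions.each do |action_type|
  map["userscenario"] ||= ""
  map["userscenario"] += action_type + "->"
end

# What timeout_code would store in the pushed event's "userscenario" field.
userscenario = map["userscenario"].chomp("->")
puts userscenario
# => scm.ts.list_saved_search->scm.ts.start_over->scm.ts.delete_saved_search
```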

However, the output file "agguserscenario.csv" is never created at the specified path, even though the source file has been opened and read, according to the trace log below:

    [2019-04-29T11:00:18,000][TRACE][filewatch.tailmode.handlers.grow] handling: userscenario.csv
    [2019-04-29T11:00:18,025][TRACE][filewatch.tailmode.handlers.grow] reading... {"iterations"=>1, "amount"=>779, "filename"=>"userscenario.csv"}
    [2019-04-29T11:00:18,029][DEBUG][filewatch.tailmode.handlers.grow] read_to_eof: get chunk

The log also reports that the aggregate filter code executed successfully:

    [2019-04-29T11:00:18,747][DEBUG][logstash.filters.aggregate] Aggregate successful filter code execution {:code=>" map[\"userscenario\"] ||= \"\"\n map[\"userscenario\"] += event.get(\"actionType\") + \"->\"\n                                  map[\"userId\"] = event.get(\"userId\")\n                                  event.cancel "}

Here is the debug log for logstash.outputs.file:

    [2019-04-29T11:00:11,385][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@path = "C:/elkstack/elasticsearch-6.5.1/logs/agguserscenario.csv"
    [2019-04-29T11:00:11,399][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@codec = <LogStash::Codecs::Plain format=>"%{message}", id=>"66e8d89a-568f-4a75-b2e8-05e49134b155", enable_metric=>true, charset=>"UTF-8">
    [2019-04-29T11:00:11,399][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@id = "b904b353b33333d65f0520db5861c423c73f26b040282b44d621d5ea00617594"
    [2019-04-29T11:00:11,402][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@enable_metric = true
    [2019-04-29T11:00:11,403][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@workers = 1
    [2019-04-29T11:00:11,404][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@flush_interval = 2
    [2019-04-29T11:00:11,409][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@gzip = false
    [2019-04-29T11:00:11,411][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@filename_failure = "_filepath_failures"
    [2019-04-29T11:00:11,416][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@create_if_deleted = true
    [2019-04-29T11:00:11,432][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@dir_mode = -1
    [2019-04-29T11:00:11,434][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@file_mode = -1
    [2019-04-29T11:00:11,448][DEBUG][logstash.outputs.file    ] config LogStash::Outputs::File/@write_behavior = "append"

For reference, here is the source file that Logstash is processing:

    when,httpsessionId,module,page,userId,actionType
    2019-02-13 10:01:30,sid1,succession,talentsearch,cgrant1,scm.ts.list_saved_search
    2019-02-13 10:01:31,sid3,calibration,ManageCalibrationTemplates,hr1,cal.mct.create
    2019-02-13 10:01:31,sid1,succession,talentsearch,cgrant1,scm.ts.start_over
    2019-02-13 10:01:33,sid1,succession,talentsearch,cgrant1,scm.ts.delete_saved_search
    2019-02-13 10:01:33,sid3,calibration,ManageCalibrationTemplates,hr1,cal.mct.edit
    2019-02-13 10:01:30,sid2,succession,talentsearch,lokamoto1,scm.ts.list_saved_search
    2019-02-13 10:01:33,sid2,succession,talentsearch,lokamoto1,scm.ts.search
    2019-02-13 10:01:35,sid2,succession,talentsearch,lokamoto1,scm.ts.nominate
    2019-02-13 10:01:35,sid3,calibration,ManageCalibrationTemplates,hr1,cal.mct.delete
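Grouping these rows by httpsessionId by hand (a quick Ruby sketch, independent of Logstash) shows the per-session scenario strings I expect to see in agguserscenario.csv:

```ruby
require "csv"

# The same sample rows as above, grouped by httpsessionId.
data = <<~ROWS
  when,httpsessionId,module,page,userId,actionType
  2019-02-13 10:01:30,sid1,succession,talentsearch,cgrant1,scm.ts.list_saved_search
  2019-02-13 10:01:31,sid3,calibration,ManageCalibrationTemplates,hr1,cal.mct.create
  2019-02-13 10:01:31,sid1,succession,talentsearch,cgrant1,scm.ts.start_over
  2019-02-13 10:01:33,sid1,succession,talentsearch,cgrant1,scm.ts.delete_saved_search
  2019-02-13 10:01:33,sid3,calibration,ManageCalibrationTemplates,hr1,cal.mct.edit
  2019-02-13 10:01:30,sid2,succession,talentsearch,lokamoto1,scm.ts.list_saved_search
  2019-02-13 10:01:33,sid2,succession,talentsearch,lokamoto1,scm.ts.search
  2019-02-13 10:01:35,sid2,succession,talentsearch,lokamoto1,scm.ts.nominate
  2019-02-13 10:01:35,sid3,calibration,ManageCalibrationTemplates,hr1,cal.mct.delete
ROWS

scenarios = Hash.new { |h, k| h[k] = [] }
CSV.parse(data, headers: true).each do |row|
  scenarios[row["httpsessionId"]] << row["actionType"]
end

scenarios.each { |sid, acts| puts "#{sid}: #{acts.join('->')}" }
# Expected per session:
#   sid1: scm.ts.list_saved_search->scm.ts.start_over->scm.ts.delete_saved_search
#   sid3: cal.mct.create->cal.mct.edit->cal.mct.delete
#   sid2: scm.ts.list_saved_search->scm.ts.search->scm.ts.nominate
```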

Moved to Logstash topic
