CSV file with mutated fields does not get indexed

Hi,
I am pretty new to the ELK stack. We are trying to build some performance test result dashboards by indexing result CSV files. I am trying to index a .csv file whose columns are converted (mutated) to float data types. Here is my config file:

input {
  file {
    path => "C:\Test4\Beacon&Ecommerce.csv"
    start_position => "beginning"
  }
}
filter {
  csv {
    columns => [
      "Minute_of_Test",
      "Time",
      "EC_T01_BrowsingCatalouge_Homepage",
      "EC_T01_CreateNewOrderLink_Homepage",
      "EC_T01_ViewOrderHistory_Homepage",
      "EC_T02_BrowsingCatalouge_Login",
      "EC_T02_CreateNewOrderLink_Login",
      "EC_T02_ViewOrderHistory_Login",
      "EC_T03_BrowsingCatalouge_ClickOnTestProduct",
      "EC_T03_CreateNewOrderLink_ClickNewOrder",
      "EC_T03_ViewOrderHistory_ClickOrderHistory",
      "EC_T04_BrowsingCatalouge_LogOff",
      "EC_T04_CreateNewOrderLink_EnterDetailsClickNext",
      "EC_T04_ViewOrderHistory_ClickViewOrder",
      "EC_T05_CreateNewOrderLink_ClickSubmitOrder",
      "EC_T05_QuerySavedOrders_LogOff",
      "EC_T06_CreateNewOrderLink_LogOff",
      "FetchAcountDetails",
      "Order_History_User_Search_for_Order_History",
      "Order_Request_Order_Simulate",
      "Order_Request_Order_Submit",
      "Practice_Search_Against_SAP",
      "Search_and_Load_Practice_I3"
    ]
    separator => ","
    remove_field => ["message"]
  }
  date {
    match => ["Time", "yyyy-MM-dd HH:mm:ss"]
  }
  mutate {
    convert => {
      "Minute_of_Test" => "integer"
      "EC_T01_BrowsingCatalouge_Homepage" => "float"
      "EC_T01_CreateNewOrderLink_Homepage" => "float"
      "EC_T01_ViewOrderHistory_Homepage" => "float"
      "EC_T02_BrowsingCatalouge_Login" => "float"
      "EC_T02_CreateNewOrderLink_Login" => "float"
      "EC_T02_ViewOrderHistory_Login" => "float"
      "EC_T03_BrowsingCatalouge_ClickOnTestProduct" => "float"
      "EC_T03_CreateNewOrderLink_ClickNewOrder" => "float"
      "EC_T03_ViewOrderHistory_ClickOrderHistory" => "float"
      "EC_T04_BrowsingCatalouge_LogOff" => "float"
      "EC_T04_CreateNewOrderLink_EnterDetailsClickNext" => "float"
      "EC_T04_ViewOrderHistory_ClickViewOrder" => "float"
      "EC_T05_CreateNewOrderLink_ClickSubmitOrder" => "float"
      "EC_T05_QuerySavedOrders_LogOff" => "float"
      "EC_T06_CreateNewOrderLink_LogOff" => "float"
      "FetchAcountDetails" => "float"
      "Order_History_User_Search_for_Order_History" => "float"
      "Order_Request_Order_Simulate" => "float"
      "Order_Request_Order_Submit" => "float"
      "Practice_Search_Against_SAP" => "float"
      "Search_and_Load_Practice_I3" => "float"
    }
  }
}

output {
  elasticsearch {
    action => "index"
    index => "soasta_result_1"
  }
  stdout { }
}

Logstash says my config file is OK, but it doesn't index the data into Elasticsearch. Can you please let me know where I am going wrong?

Logstash is probably waiting for more data to be added to Beacon&Ecommerce.csv. Unless Logstash is seeing a file for the first time, start_position => beginning doesn't matter; the file input records how far it has read each file (in its sincedb) and resumes from that position on subsequent runs.
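If you want Logstash to reprocess the whole file on every run while you are testing, one common workaround is to disable that bookkeeping by pointing sincedb_path at the null device. A minimal sketch, assuming Windows (use "/dev/null" instead of "NUL" on Linux/macOS):

input {
  file {
    path => "C:\Test4\Beacon&Ecommerce.csv"
    start_position => "beginning"
    # "NUL" is the Windows null device: no read position is persisted,
    # so the file is treated as new and read from the start each run.
    sincedb_path => "NUL"
  }
}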

Disable the elasticsearch output for now and just use a stdout { codec => rubydebug } output. When that proves to give the output you need, you can enable the elasticsearch output again. Practice divide and conquer.
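For example, a stripped-down output section for debugging could look like this (the rubydebug codec pretty-prints each event, so you can verify that the csv, date, and mutate filters produced the fields and types you expect):

output {
  # Pretty-print every event to the console; re-enable the
  # elasticsearch output once the events look right.
  stdout { codec => rubydebug }
}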

Thank you, Magnus.