Logstash error in config file

I'm trying to feed data from CSV files into Elasticsearch using Logstash. My Logstash config file looks like this:

input {
    file {
        path => "C:\Users\shreya\Data\RetailData.csv"
        start_position => "beginning"
        #sincedb_path => "C:\Users\shreya\null"
    }
}
filter {
    csv {
        separator => ","
        id => "Store_ID"
        columns => ["Store","Date","Temperature","Fuel_Price", "MarkDown1", "MarkDown2", "MarkDown3", "MarkDown4", "CPI", "Unemployment", "IsHoliday"]
    }
    mutate {convert => ["Store", "integer"]}
    mutate {convert => ["Date", "date"]}
    mutate {convert => ["Temperature", "float"]}
    mutate {convert => ["Fuel_Price", "float"]}
    mutate {convert => ["CPI", "float"]}
    mutate {convert => ["Unemployment", "float"]}


}
output {
    elasticsearch {
        action => "index"
        hosts => "localhost:9200" 
        index => "store" 
        document_type => "store_retail"     
    }
    stdout {} 
    #stdout {
    #    codec => rubydebug
    #}
}

But I'm getting an error and am not able to figure out how to solve it. I'm new to Logstash. My error log looks like this:

[2017-12-02T15:56:38,150][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/Users/shreya/logstash-6.0.0/modules/fb_apache/configuration"}
[2017-12-02T15:56:38,165][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/Users/shreya/logstash-6.0.0/modules/netflow/configuration"}
[2017-12-02T15:56:38,243][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-12-02T15:56:39,117][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-12-02T15:56:42,965][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=>"index", hosts=>["localhost:9200"], index=>"store", document_type=>"store_retail", id=>"91a4406a13e9377abb312acf5f6be8e609a685f9c84a5906af957e956119798c"}
[2017-12-02T15:56:43,604][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-12-02T15:56:43,604][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-12-02T15:56:43,854][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2017-12-02T15:56:43,932][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-12-02T15:56:43,933][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-12-02T15:56:43,964][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2017-12-02T15:56:44,011][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#>, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - namespace: [stats, pipelines, main, plugins, filters, e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb, events] key: duration_in_millis value:0, @id=\"e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb\", @klass=LogStash::Filters::Mutate, @metric_events=#, @structured_lookup_mutex=#, @fast_lookup=#>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb, :events]>, @filter={\"Date\"=>\"date\"}, id=>\"e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb\", enable_metric=>true, periodic_flush=>false>>", :error=>"translation missing: en.logstash.agent.configuration.invalid_plugin_register", :thread=>"#"}
[2017-12-02T15:56:44,042][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#, :backtrace=>["C:/Users/shreya/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.1.6/lib/logstash/filters/mutate.rb:186:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "C:/Users/shreya/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.1.6/lib/logstash/filters/mutate.rb:184:in `register'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:388:in `register_plugin'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:399:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:399:in `register_plugins'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:801:in `maybe_setup_out_plugins'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:409:in `start_workers'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:333:in `run'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:293:in `block in start'"], :thread=>"#"}
[2017-12-02T15:56:44,058][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: LogStash::PipelineAction::Create/pipeline_id:main, action_result: false", :backtrace=>nil}


The mutate / convert should look like this:

  mutate {
    convert => {
      "Store" => "integer"
      "Temperature" => "float"
      "Fuel_Price" => "float"
      "CPI" => "float"
      "Unemployment" => "float"
    }
  }

To convert the date field, use a date filter. What is your date format?

The date was in mm/dd/yyyy format. I removed the date field, though, to be on the safe side, and tried again.
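For that format, a minimal date filter sketch would look like the following (assuming the parsed column is still named Date, as in the original config; MM/dd/yyyy is the Joda-time pattern for month/day/year with slashes):

filter {
	date {
		match => ["Date", "MM/dd/yyyy"]	# parse the Date field as month/day/year
		target => "@timestamp"		# write the result to @timestamp (the default target)
	}
}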

My CSV file looks like this:

Store  Temperature  Fuel_Price  MarkDown1  CPI          Unemployment  IsHoliday
1      42.31        2.572       NA         211.0963582  8.106         FALSE
1      38.51        2.548       NA         211.2421698  8.106         TRUE
1      39.93        2.514       NA         211.2891429  8.106         FALSE
1      46.63        2.561       NA         211.3196429  8.106         FALSE

My .conf file looks like this:

input {
	file {
		path => "C:\Users\shreya\Data\RetailData.csv"
		start_position => "beginning" 
		ignore_older => 0	
	}
}
filter {
	mutate {
		gsub => ["[message]", "\s", " "]
	}
}
	csv {
		separator => ","
		columns => ["Store","Temperature","Fuel_Price", "MarkDown1", "CPI", "Unemployment", "IsHoliday"]
	}
	mutate {
		convert => {
			"Store" => "integer"
			"Temperature" => "float"
			"Fuel_Price" => "float"
			"CPI" => "float"
			"Unemployment" => "float"
		}
	}
}
output {
	elasticsearch {
		action => "index"
		hosts => ["localhost:9200"] 
		index => "store" 
		document_type => "store_retail" 	
	}
	stdout {
         codec => rubydebug
  }
}

Now the error log looks like this:

[2017-12-02T22:03:23,065][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/Users/shreya/logstash-6.0.0/modules/fb_apache/configuration"}
[2017-12-02T22:03:23,070][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x3c842517 @module_name="fb_apache", @directory="C:/Users/shreya/logstash-6.0.0/modules/fb_apache/configuration", @kibana_version_parts=["6", "0", "0"]>}
[2017-12-02T22:03:23,121][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/Users/shreya/logstash-6.0.0/modules/netflow/configuration"}
[2017-12-02T22:03:23,135][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x7f670dfb @module_name="netflow", @directory="C:/Users/shreya/logstash-6.0.0/modules/netflow/configuration", @kibana_version_parts=["6", "0", "0"]>}
[2017-12-02T22:03:23,556][DEBUG][logstash.config.source.multilocal] Reading pipeline configurations from YAML {:location=>"C:/Users/shreya/logstash-6.0.0/config/pipelines.yml"}

Log contd.

[2017-12-02T22:03:23,568][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-12-02T22:03:23,619][DEBUG][logstash.agent           ] Agent: Configuring metric collection
[2017-12-02T22:03:23,842][DEBUG][logstash.instrument.periodicpoller.os] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-12-02T22:03:23,902][DEBUG][logstash.instrument.periodicpoller.jvm] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-12-02T22:03:24,026][DEBUG][logstash.instrument.periodicpoller.persistentqueue] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-12-02T22:03:24,029][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-12-02T22:03:24,041][DEBUG][logstash.agent           ] starting agent
[2017-12-02T22:03:24,572][DEBUG][logstash.agent           ] Starting puma
[2017-12-02T22:03:24,610][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2017-12-02T22:03:24,649][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["C:/Users/shreya/Data/RetailData.csv"]}
[2017-12-02T22:03:24,652][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"C:/Users/shreya/Data/retailevents.conf"}
[2017-12-02T22:03:24,740][DEBUG][logstash.agent           ] Converging pipelines
[2017-12-02T22:03:24,740][DEBUG][logstash.agent           ] Needed actions to converge {:actions_count=>1}
[2017-12-02T22:03:24,740][DEBUG][logstash.agent           ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2017-12-02T22:03:24,787][DEBUG][logstash.api.service     ] [api-service] start
[2017-12-02T22:03:25,653][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

Log contd.

[2017-12-02T22:03:25,716][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, input, filter, output at line 13, column 2 (byte 196) after ", :backtrace=>["C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/compiler.rb:42:in `compile_ast'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/compiler.rb:50:in `compile_imperative'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/compiler.rb:54:in `compile_graph'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/compiler.rb:12:in `block in compile_sources'", "org/jruby/RubyArray.java:2486:in `map'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/compiler.rb:11:in `compile_sources'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:107:in `compile_lir'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:49:in `initialize'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:215:in `initialize'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline_action/create.rb:35:in `execute'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:335:in `block in converge_state'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:332:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:319:in `converge_state'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/agent.rb:90:in `execute'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/runner.rb:362:in `block in execute'", "C:/Users/shreya/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
[2017-12-02T22:03:25,786][DEBUG][logstash.instrument.periodicpoller.os] PeriodicPoller: Stopping
[2017-12-02T22:03:25,790][DEBUG][logstash.instrument.periodicpoller.jvm] PeriodicPoller: Stopping
[2017-12-02T22:03:25,790][DEBUG][logstash.instrument.periodicpoller.persistentqueue] PeriodicPoller: Stopping
[2017-12-02T22:03:25,792][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] PeriodicPoller: Stopping
[2017-12-02T22:03:25,811][DEBUG][logstash.agent ] Shutting down all pipelines {:pipelines_count=>0}
[2017-12-02T22:03:25,814][DEBUG][logstash.agent ] Converging pipelines
[2017-12-02T22:12:08,040][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/Users/shreya/logstash-6.0.0/modules/fb_apache/configuration"}
[2017-12-02T22:12:08,041][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x17b1b45 @module_name="fb_apache", @directory="C:/Users/shreya/logstash-6.0.0/modules/fb_apache/configuration", @kibana_version_parts=["6", "0", "0"]>}
[2017-12-02T22:12:08,047][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/Users/shreya/logstash-6.0.0/modules/netflow/configuration"}
[2017-12-02T22:12:08,079][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x504e6ad3 @module_name="netflow", @directory="C:/Users/shreya/logstash-6.0.0/modules/netflow/configuration", @kibana_version_parts=["6", "0", "0"]>}
[2017-12-02T22:12:08,188][DEBUG][logstash.config.source.multilocal] Reading pipeline configurations from YAML {:location=>"C:/Users/shreya/logstash-6.0.0/config/pipelines.yml"}
[2017-12-02T22:12:08,204][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-12-02T22:12:08,313][DEBUG][logstash.agent ] Agent: Configuring metric collection
[2017-12-02T22:12:08,329][DEBUG][logstash.instrument.periodicpoller.os] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-12-02T22:12:08,391][DEBUG][logstash.instrument.periodicpoller.jvm] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-12-02T22:12:08,532][DEBUG][logstash.instrument.periodicpoller.persistentqueue] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-12-02T22:12:08,781][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] PeriodicPoller: Starting {:polling_interval=>5, :polling_timeout=>120}
[2017-12-02T22:12:08,844][DEBUG][logstash.agent ] starting agent
[2017-12-02T22:12:08,953][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["C:/Users/shreya/Data/RetailData.csv"]}
[2017-12-02T22:12:08,968][DEBUG][logstash.agent ] Starting puma
[2017-12-02T22:12:08,968][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"C:/Users/shreya/Data/retailevents.conf"}
[2017-12-02T22:12:09,000][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2017-12-02T22:12:09,031][DEBUG][logstash.agent ] Converging pipelines
[2017-12-02T22:12:09,124][DEBUG][logstash.agent ] Needed actions to converge {:actions_count=>1}
[2017-12-02T22:12:09,124][DEBUG][logstash.agent ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2017-12-02T22:12:09,140][DEBUG][logstash.api.service ] [api-service] start

I'm not able to understand what the following line from the error log means:

[2017-12-02T22:03:24,649][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>["C:/Users/shreya/Data/RetailData.csv"]}
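If I read that DEBUG line correctly, it is harmless: it suggests Logstash was started with a config path covering the whole Data directory, so it considered RetailData.csv as a candidate config file and skipped it because it isn't one. Pointing the -f option at the exact .conf file (the path below is the one from the log) should avoid the message:

bin\logstash.bat -f "C:\Users\shreya\Data\retailevents.conf"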

Problem solved! There was actually a brace mismatch in my filter block.
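For reference, a sketch of the filter block with balanced braces — the csv and mutate blocks belong inside filter, which the config above closed too early:

filter {
	mutate {
		# replace each whitespace character in the raw line with a plain space
		gsub => ["[message]", "\s", " "]
	}
	csv {
		separator => ","
		columns => ["Store","Temperature","Fuel_Price", "MarkDown1", "CPI", "Unemployment", "IsHoliday"]
	}
	mutate {
		convert => {
			"Store" => "integer"
			"Temperature" => "float"
			"Fuel_Price" => "float"
			"CPI" => "float"
			"Unemployment" => "float"
		}
	}
}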
