Create a custom id

"_index": "access_log",
"_type": "access_logs",
"_id": "0udbmGEBevLW4jI4We7N",
"_score": 1,
"_source": {
"@timestamp": "2018-02-15T07:26:18.252Z",
"path": "V:/cc/7.7/mcs_7_indonesia_dev/mpower/out/log/MessagingBroker_access.log",

I want to change the above "_id": "0udbmGEBevLW4jI4We7N" to 1 and have it auto-increment.

input {
  file {
    path => "V:/cc/7.7/mcs_7_indonesia_dev/mpower/out/log/MessagingBroker_access.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  if [message] =~ /^=/ {
    drop { }
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "access_log"
    document_type => "access_logs"
  }

  stdout { }
}
Above is my config file; the _id is being auto-generated.

Why do you need an autoincrementing id?

Just for reference, like the auto-increment IDs we use in MySQL.

I am not able to make sense of this ID: 0udbmGEBevLW4jI4We7N.

Is there any way to change it to 1 and have it keep auto-incrementing?

Then you will need to generate the ID externally in your application. Maintaining and generating strictly incrementing IDs in a distributed system can quickly become the bottleneck as it requires a lot of coordination across a cluster, which is why the current scheme is used.
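If you do generate IDs in your application, you can pass them through via the elasticsearch output's document_id option. A minimal sketch, assuming each event already carries the externally generated ID in a field named my_id (a hypothetical field your application or an earlier filter would have to populate):

```conf
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "access_log"
    # Use the externally generated ID instead of letting Elasticsearch
    # auto-generate one. [my_id] is a hypothetical field name.
    document_id => "%{my_id}"
  }
}
```

Keep in mind that indexing two events with the same ID overwrites the first document, so your application must guarantee uniqueness.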

Thank you

 "_index": "access_log",
    "_type": "access_logs",
    "_id": "0udbmGEBevLW4jI4We7N",
    "_score": 1,
    "_source": {
      "@timestamp": "2018-02-15T07:26:18.252Z",
      "path": "V:/cc/7.7/mcs_7_indonesia_dev/mpower/out/log/MessagingBroker_access.log",
      "message": "14/02/2018 13:14:56:800|MessagingRepository |INF|RGS-Nayaz-LT,MessagingBroker,PhilMoOptServlet,MO Request,4896,2018-02-14,null,639155146696,null,null,null,null,1,120,0,null,SUCCESS##MO Response : [200],null\r",
      "@version": "1",
      "host": "RGS-Nayaz-LT"
    }
  }

"message": "14/02/2018 13:14:56:800|MessagingRepository |INF|RGS-Nayaz-LT,MessagingBroker,PhilMoOptServlet,MO Request,4896,2018-02-14,null,639155146696,null,null,null,null,1,120,0,null,SUCCESS##MO Response : [200],null\r",

I want to store the above fields in specific columns.
How can I do it?

Use a combination of filters to extract the data into appropriate fields, e.g. grok, dissect and/or csv filters. It should be possible to find good examples by searching this forum.
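As a sketch of that approach: the sample message above starts with pipe-delimited fields followed by a comma-separated payload, so one option is a dissect filter to peel off the prefix and a csv filter on the remainder. The field names here are illustrative assumptions, not a definitive mapping of your log format:

```conf
filter {
  # Split the pipe-delimited prefix; [rest] holds the comma-separated payload.
  dissect {
    mapping => {
      "message" => "%{log_time}|%{component}|%{level}|%{rest}"
    }
  }
  # Parse the remainder as CSV into named columns (illustrative subset).
  csv {
    source    => "rest"
    separator => ","
    columns   => ["HOST", "PROCESS", "INTERFACE", "METHOD", "DURATION"]
  }
}
```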

I used the csv filter but I am facing errors. Below is my config file.

input {
  file {
    path => "V:/cc/7.7/mcs_7_indonesia_dev/mpower/out/log/MessagingBroker_access.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  if [message] =~ /^=/ {
    drop { }
  }
  csv {
    separator => ","
    columns => ["HOST", "PROCESS", "INTERFACE", "METHOD", "DURATION",
                "END_TIME", "URL", "SUBSCRIBER_ID", "DEVICE", "SESSION_ID",
                "ENTITY_ID", "ENTITY_NAME", "CALL_DIRECTION", "TIMEZONE_OFFSET",
                "STATUS", "SERVICE_PROVIDER", "INPUT", "NUMBER_OF_RESULTS",
                "DEVICE_MODEL", "STATUS_DESC", "ORIGIN"]
  }
  mutate {
    convert => {
      "DURATION" => "integer"
      "END_TIME" => "timestamp"
      "CALL_DIRECTION" => "integer"
      "TIMEZONE_OFFSET" => "integer"
      "STATUS" => "integer"
      "NUMBER_OF_RESULTS" => "integer"
    }
  }
}

output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "access_log"
    document_type => "access_logs"
  }

  stdout { }
}


The attached screenshot shows the data my log file contains.

It would help if you explain what is not working and what error you are seeing. Note that timestamp is not a valid conversion type. For this field you will need to use the date filter.

Please do not post screenshots of text or data as it is hard to see and impossible to search.

[2018-02-15T15:45:00,256][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x39e26b91 @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: in value:0, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: duration_in_millis value:0, @id="116ee375390e8eca6f8c6da6b952c1a682d7acd0274595dd6ebacfad717f1a9e", @klass=LogStash::Filters::Mutate, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x29efd3b3 @metric=#<LogStash::Instrument::Metric:0x4a40761e @collector=#<LogStash::Instrument::Collector:0x7bc75cb6 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x51530ebc @store=#<Concurrent::map:0x00000000000fb0 entries=3 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0xe2c8c9b>, @fast_lookup=#<Concurrent::map:0x00000000000fb4 entries=70 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :"116ee375390e8eca6f8c6da6b952c1a682d7acd0274595dd6ebacfad717f1a9e", :events]>, @filter=<LogStash::Filters::Mutate convert=>{"DURATION"=>"integer", "END_TIME"=>"date", "CALL_DIRECTION"=>"integer", "TIMEZONE_OFFSET"=>"integer", "STATUS"=>"integer", "NUMBER_OF_RESULTS"=>"integer"}, id=>"116ee375390e8eca6f8c6da6b952c1a682d7acd0274595dd6ebacfad717f1a9e", enable_metric=>true, periodic_flush=>false>>", :error=>"translation missing: en.logstash.agent.configuration.invalid_plugin_register", :thread=>"#<Thread:0x70e90dd7 run>"}
[2018-02-15T15:45:00,301][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: translation missing: en.logstash.agent.configuration.invalid_plugin_register>, :backtrace=>["C:/Users/Nayaz/Downloads/logstash-6.2.0/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.2.0/lib/logstash/filters/mutate.rb:190:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "C:/Users/Nayaz/Downloads/logstash-6.2.0/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.2.0/lib/logstash/filters/mutate.rb:188:in `register'", "C:/Users/Nayaz/Downloads/logstash-6.2.0/logstash-core/lib/logstash/pipeline.rb:341:in `register_plugin'", "C:/Users/Nayaz/Downloads/logstash-6.2.0/logstash-core/lib/logstash/pipeline.rb:352:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "C:/Users/Nayaz/Downloads/logstash-6.2.0/logstash-core/lib/logstash/pipeline.rb:352:in `register_plugins'", "C:/Users/Nayaz/Downloads/logstash-6.2.0/logstash-core/lib/logstash/pipeline.rb:736:in `maybe_setup_out_plugins'", "C:/Users/Nayaz/Downloads/logstash-6.2.0/logstash-core/lib/logstash/pipeline.rb:362:in `start_workers'", "C:/Users/Nayaz/Downloads/logstash-6.2.0/logstash-core/lib/logstash/pipeline.rb:289:in `run'", "C:/Users/Nayaz/Downloads/logstash-6.2.0/logstash-core/lib/logstash/pipeline.rb:249:in `block in start'"], :thread=>"#<Thread:0x70e90dd7 run>"}
[2018-02-15T15:45:00,349][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: LogStash::PipelineAction::Create/pipeline_id:main, action_result: false", :backtrace=>nil}

Please consult the documentation I linked to. You cannot cast it to a date with mutate; you have to use a separate filter.
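For completeness, a hedged sketch of the date filter approach: mutate's convert only supports types like integer, float, string, and boolean, so END_TIME has to be parsed separately. The pattern below assumes END_TIME looks like "2018-02-14", as in the sample log line; adjust it to your actual format:

```conf
filter {
  # Parse END_TIME (e.g. "2018-02-14") into a proper date field.
  # Without target, the parsed value would overwrite @timestamp instead.
  date {
    match  => ["END_TIME", "yyyy-MM-dd"]
    target => "END_TIME"
  }
}
```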

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.