Parse UNIX timestamp as a human-readable date while moving the data into Elasticsearch

Hi, I have a JSON file with a field named lastUpdated. It holds the UNIX timestamp of when the record was last updated. I want to parse this JSON file and feed it to Elasticsearch so that the UNIX timestamp becomes a human-readable date. I also want to store this date in the same field, i.e. lastUpdated.

Sample JSON file: test.json

    {"name":"Jonathan","score":"9.9","address":"New Delhi","lastUpdated":"1545078074640", "firstUpdated":"1545078074640"}
    {"name":"Sam","score":"8.9","address":"New York","lastUpdated":"1545078074640", "firstUpdated":"1545078074640"}
    {"name":"Michelle","score":"9.0","address":"California","lastUpdated":"1545078074640", "firstUpdated":"1545078074640"}

My Logstash configuration file: test.config

input {
  file {
    path => "/Users/amsing/Study/data/test.json"
    codec => json
    sincedb_path => "/dev/null"
    start_position => "beginning"
  }
}

filter {
  json {
    source => "message"
  }

  date {
    match => ["lastUpdated", "UNIX_MS"]
  }

  mutate {
    convert => {
      "name" => "string"
      "score" => "float"
      "address" => "string"
      "lastupdated" => "date"
    }
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "test"
  }
  stdout { codec => rubydebug }
}

I load the data into Elasticsearch with the command:

bin/logstash -f ../../data/test.config

And I get the following error:

[2018-12-28T13:58:24,130][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x1c4c8d8 @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: in value:0, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: duration_in_millis value:0, @id="d392c2b542d17cc9e7e38b1343c08cc18d0de592a4047e69aa47bba14c089811", @klass=LogStash::Filters::Mutate, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x4aeecaae>, @filter=<LogStash::Filters::Mutate convert=>{"name"=>"string", "score"=>"float", "address"=>"string", "lastupdated"=>"date"}, id=>"d392c2b542d17cc9e7e38b1343c08cc18d0de592a4047e69aa47bba14c089811", enable_metric=>true, periodic_flush=>false>>", :error=>"translation missing: en.logstash.agent.configuration.invalid_plugin_register", :thread=>"#<Thread:0x3d44e76c run>"}

[2018-12-28T13:58:24,144][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<LogStash::ConfigurationError: translation missing: en.logstash.agent.configuration.invalid_plugin_register>, :backtrace=>["/Users/amsing/Study/logstash/logstash-6.5.3/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.4/lib/logstash/filters/mutate.rb:219:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "/Users/amsing/Study/logstash/logstash-6.5.3/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.4/lib/logstash/filters/mutate.rb:217:in `register'", "/Users/amsing/Study/logstash/logstash-6.5.3/logstash-core/lib/logstash/pipeline.rb:242:in `register_plugin'", "/Users/amsing/Study/logstash/logstash-6.5.3/logstash-core/lib/logstash/pipeline.rb:253:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "/Users/amsing/Study/logstash/logstash-6.5.3/logstash-core/lib/logstash/pipeline.rb:253:in `register_plugins'", "/Users/amsing/Study/logstash/logstash-6.5.3/logstash-core/lib/logstash/pipeline.rb:595:in `maybe_setup_out_plugins'", "/Users/amsing/Study/logstash/logstash-6.5.3/logstash-core/lib/logstash/pipeline.rb:263:in `start_workers'", "/Users/amsing/Study/logstash/logstash-6.5.3/logstash-core/lib/logstash/pipeline.rb:200:in `run'", "/Users/amsing/Study/logstash/logstash-6.5.3/logstash-core/lib/logstash/pipeline.rb:160:in `block in start'"], :thread=>"#<Thread:0x3d44e76c run>"}

Also, I wish to parse the firstUpdated field the same way as lastUpdated and have both date fields in human-readable form when viewed in Kibana. What changes do I need to make in test.config to achieve this?

The mutate filter's convert option can only convert to types that exist in JSON (string, integer, float, boolean), so "date" is not a valid target, which is why the plugin fails to register. The conversion only affects how the JSON documents sent to Elasticsearch are formatted, not how Elasticsearch interprets and indexes them.
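For example (not tested), the mutate block could be trimmed to convert only the numeric field; name and address are already strings in your sample data, and the "lastupdated" => "date" entry is what the register error complains about:

    mutate {
      convert => {
        # score arrives as a quoted string, so converting it to float is useful;
        # "date" is not a supported conversion type and must be dropped
        "score" => "float"
      }
    }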

Your current date filter populates the @timestamp field. If you also want to store the other fields as dates, I would probably do something like this (not tested):

date{
    match => ["lastUpdated", "UNIX_MS"]
    target => "lastUpdated"
}

date{
    match => ["firstUpdated", "UNIX_MS"]
    target => "firstUpdated"
}
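Putting it together, an untested sketch of the whole filter section of test.config could look roughly like this (the input and output sections stay as they are); each date filter gets a target so the parsed date goes back into the original field instead of overwriting @timestamp, and the unsupported "date" conversion is removed:

    filter {
      json {
        source => "message"
      }

      # parse the millisecond UNIX timestamps back into their own fields
      date {
        match => ["lastUpdated", "UNIX_MS"]
        target => "lastUpdated"
      }

      date {
        match => ["firstUpdated", "UNIX_MS"]
        target => "firstUpdated"
      }

      # only score needs a type conversion; the other fields are already strings
      mutate {
        convert => {
          "score" => "float"
        }
      }
    }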

Thanks Christian_Dahlqvist. It works :grinning:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.